The Moral Limits of Data: What Won’t You Do?

Map courtesy of Sickweather

Ten years ago, I was part of a team that deployed cutting-edge call center routing technology (Cisco ICM). It could take the phone number a customer was calling from, match it with data about that customer, and then route the caller to a prioritized queue based on what we knew.

This was a hot trend because a company could then provide enhanced support for its preferred customers. A “Gold Card” member, for example, would wait less time for an agent than a lower-value customer would.
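The tier-based routing described above can be sketched as a small priority queue. This is a minimal illustration, not Cisco ICM's actual logic; the phone numbers, names, and tier labels are all hypothetical.

```python
import heapq

# Hypothetical customer records keyed by caller phone number.
CUSTOMERS = {
    "+15551230001": {"name": "Ana", "tier": "gold"},
    "+15551230002": {"name": "Ben", "tier": "standard"},
}

TIER_PRIORITY = {"gold": 0, "standard": 1}  # lower number = answered sooner

def route_call(queue, caller_id, arrival_order):
    """Look up the caller by phone number and enqueue them by tier priority."""
    profile = CUSTOMERS.get(caller_id, {"name": "unknown", "tier": "standard"})
    priority = TIER_PRIORITY[profile["tier"]]
    # arrival_order breaks ties so equal-tier callers stay first-come, first-served
    heapq.heappush(queue, (priority, arrival_order, caller_id))

queue = []
route_call(queue, "+15551230002", 1)  # standard customer calls first
route_call(queue, "+15551230001", 2)  # Gold Card customer calls second
print(heapq.heappop(queue)[2])       # the gold caller still comes out first
```

Even though the standard customer called first, the tuple ordering `(priority, arrival_order, caller_id)` means the gold caller jumps the queue.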

Fast forward a few years, and you can now take the data you collect on a customer, append it with other data sources, and hire some quants to create a psychological and personality profile of each person. Now when that customer calls in, you can route them to a specific agent trained in the language patterns most likely to steer the caller toward your desired outcome.
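The profile-to-agent matching step can be sketched in a few lines. Everything here is illustrative: the profile labels, agent names, and pool structure are hypothetical, standing in for whatever a real quant-built model would produce.

```python
# Hypothetical mapping from a personality-profile label to the pool of
# agents trained on the language patterns that profile responds to.
AGENT_POOLS = {
    "analytical": ["agent_ava", "agent_raj"],  # trained to lead with data points
    "impulsive":  ["agent_lee"],               # trained to stress urgency
}

def assign_agent(profile_label, pools=AGENT_POOLS):
    """Pick an agent trained for this caller's profile, else a general queue."""
    pool = pools.get(profile_label)
    return pool[0] if pool else "general_queue"

print(assign_agent("impulsive"))    # agent_lee
print(assign_agent("unprofiled"))   # general_queue
```

The moral weight isn't in this trivial lookup; it's in how the profile labels were derived and what the "trained" agents are coached to do with them.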

This raises a moral question: are you providing better customer service, or deploying thinly masked manipulation techniques?

The rise of big data and the mainstreaming of predictive analytics have created near limitless possibilities to drive new revenue streams for companies and provide more ‘contextual’ customer experiences.

With such power, the question is no longer: “What will you do with data?” but rather “What won’t you do with data?”

In June of 2014, Facebook made headlines again when details emerged about experiments it had run to manipulate its users’ emotions. Many thought the premise was outlandish, but Facebook kept right on moving.

Emotional and behavioral manipulation isn’t the only moral concern either. Researchers and startups alike are appending social media data to other information sources to predict when individual people are sick with everything from the common cold to HIV. Ponder, for a moment, the implications…

Big data, IoT, and analytics present CEOs with a mass of temptations they have never had to grapple with before, and the lines between “data-driven features” and “spying” are constantly being blurred. With such collisions on the horizon, how will you draw the line between privacy and profits? What trust will you build, or violate, with your customers?

The values of your company and employees shape which practices will be deemed appropriate and which ones won’t. Your definition of a “privacy-minded company” is likely very different from your customers’. How will you reconcile these values?

The risk of not doing so could have far-reaching consequences for your business. It’s probably best to define your boundaries and moral limits before the headlines do it for you.
