In their forthcoming book, Kord Davis and Doug Patterson argue that there isn’t yet an ethical framework or common vocabulary for having productive discussions around the ethical use of big data.

Image courtesy of O'Reilly Media.

Ever wonder what Facebook does with all that data it collects from the 900 million of us who post to the social site, sometimes multiple times a day?

Facebook has a Data Science Team that is mining its stunningly large data trove for societal insights, and for insights that help its bottom line, too. According to a recent Technology Review article, the team has been able to determine a person’s relationship status from the type of songs he or she likes (breakups tend to elicit more ballad Likes) and the mood of, say, an entire country (Chile’s gross national happiness dipped during a 2010 earthquake).

While these insights seem fairly innocuous, it is still early days in the conjoining of data science, social science and big data. What’s to come is already raising important ethical quandaries for some organizations.

Until recently, the prevailing wisdom was that big data puts power squarely in the hands of consumers, who now have unprecedented insight to act on. But not always. In June The Wall Street Journal reported that Orbitz used data analytics to determine that Mac users will pay higher prices for hotel rooms than their PC-using counterparts will. Which means if you’re searching Orbitz on your MacBook Pro, guess what? You’re being steered toward higher-priced “deals” than your PC-toting buddies see.

Orbitz is not alone: organizations ranging from Google News (in the way it filters the news) to landlords (who set rents using revenue management software) have used analytics to manipulate customer segments, be that the news customers read or the lease agreements they sign.

Amidst all this data mining there is something else to consider: the creepiness factor.

According to the forthcoming book, Ethics of Big Data (O'Reilly Media, 2012), by former Capgemini principal consultant Kord Davis and philosopher Doug Patterson, as personal data becomes increasingly public, the creators of big data will increasingly face ethical decision points.

“A lot of times technologists realize there is something particularly interesting that can be done with a new bit of correlating data — provide this new feature or service to customers. And product people get very excited about it; it has potential economic benefits,” says Davis in a webcast called “An Introduction to Ethics of Big Data.” “But then someone in the back of the room says, ‘yes, but that’s kind of creepy, right? Are you sure we should do that?’”

In this scenario individuals revert to their own moral code, according to Davis, since there is no common vocabulary or framework for the ethical use of big data.

Davis and Patterson's four-question framework offers an important first step toward a common ground for discussing related issues:

  • Identity: Is offline existence identical to online existence? “Some think obviously yes, others no, but we want to be explicit and engage the questions in a collaborative fashion.”
  • Privacy: Who should control access to data? Davis points out that three data points can identify 87 percent of Americans: gender, birth date and zip code. “That means in any particular set of data, if I have those three, I can correlate that data set with another, and I can identify you.”
  • Ownership: Who owns data, can we transfer the rights of it, and what are the obligations of people who generate and use that data? Davis points out that the World Economic Forum describes data as a new economic asset class that can be traded, sold and basically treated as a currency.
  • Reputation: What is important about reputation, says Davis, is the realization that the sheer number of digital conversations and interactions that take place, and that we can participate in, fragments our ability to manage reputation. “Understanding the implications of that is going to be very important.”
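The re-identification risk behind the privacy point can be illustrated with a toy sketch. The data below is entirely synthetic and the function name is hypothetical (neither comes from Davis and Patterson's book); the sketch simply shows how joining a “de-identified” record set to a public one on the gender, birth date and zip code triple can recover names.

```python
# Synthetic illustration of re-identification via quasi-identifiers.

# A "de-identified" data set: names removed, but the three
# quasi-identifiers (gender, birth date, zip) left intact.
medical = [
    {"gender": "F", "birth_date": "1981-07-04", "zip": "02139", "diagnosis": "flu"},
    {"gender": "M", "birth_date": "1975-01-20", "zip": "10001", "diagnosis": "asthma"},
]

# A public data set (think voter rolls) with the same quasi-identifiers.
voters = [
    {"name": "Alice Smith", "gender": "F", "birth_date": "1981-07-04", "zip": "02139"},
    {"name": "Bob Jones", "gender": "M", "birth_date": "1975-01-20", "zip": "10001"},
]

KEY = ("gender", "birth_date", "zip")

def reidentify(anonymous, public):
    """Join the two data sets on the quasi-identifier triple,
    mapping each re-identified name to its sensitive attribute."""
    index = {tuple(p[k] for k in KEY): p["name"] for p in public}
    return {
        index[tuple(r[k] for k in KEY)]: r["diagnosis"]
        for r in anonymous
        if tuple(r[k] for k in KEY) in index
    }

print(reidentify(medical, voters))
# {'Alice Smith': 'flu', 'Bob Jones': 'asthma'}
```

With only two records the join is trivial, but the mechanism is the same at scale: when a quasi-identifier combination is unique to one person in both data sets, removing the name alone does not anonymize the record.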

Aligning actions to insights around this framework can help guide organizations to make ethical data decisions, says Davis. “The idea is to reduce your value conflicts when you’re facing opportunities to innovate using big data. To create a comfort factor — to reduce creepy — make sure your communal values are in line with your actions.”

3 Comments On: Ethical Quandary in the Age of Big Data: Avoid Creepiness

  • What Peter Drucker Would Be Reading | The Drucker Exchange | July 10, 2012

    […] 2. Ethical Quandary in the Age of Big Data: Avoid Creepiness: Did you know that if you look for a hotel room on a Mac versus a PC, you have a good chance of being quoted a higher rate? That’s because we are finding ways to learn more and more about ourselves and one another. Renee Boucher Ferguson writes at the Improvisations Blog at Sloan Management Review that all of this big data is causing many ethical quandaries for companies, and we customers might find it all pretty creepy. Former Capgemini principal consultant Kord Davis has this advice to offer: “To create a comfort factor—to reduce creepy—make sure your communal values are in line with your actions.” […]

  • rosariotoday | July 17, 2012

    Should the ethical discussion and policies be ahead of the implementation of big data initiatives? I believe so.

  • Data Science Ethics | Introduction to Data Science, Columbia University | November 25, 2013

    […] out a code of conduct for data scientists. There are literally entire books on this topic and numerous articles discussing potential ethics concerns and values that data scientists should have. For […]
