Consumers want to know when they’re being watched, and have changing expectations about privacy and data mining.
A recent article in PC Advisor, “Equifax Eyes Are Watching You — Big Data Means Big Brother,” discusses the staggering amount of data credit reporting giant Equifax gathers on individuals.
Equifax collects details on 500 million consumers and 81 million businesses in 17 countries. Included are magazine subscriptions, rental history, real estate assets, investment wealth, retail purchasing habits, criminal records, debt-to-income ratios, DMV files, post office boxes, and more. The company slices, dices, analyzes and indexes 800 billion records into 26 petabytes of data, according to PC Advisor.
Impressive as those numbers are, the sheer amount of information Equifax and other big data purveyors collect and use raises a question: How much data is too much data to mine?
In other words, where is the line in the sand with regard to consumer privacy that organizations need to be aware of when pursuing big data analytics projects?
Equifax, financial services companies and healthcare organizations all fit into a unique category when it comes to data mining: they’re regulated. A patchwork of existing laws, including the Gramm-Leach-Bliley Act, the Fair Credit Reporting Act and HIPAA, governs how these industries use, share and sell information.
But if your business is outside the realm of regulation, it’s the Wild West: data collected and disseminated for marketing purposes is essentially unregulated, so anything goes.
One thing that’s clear is that old methods of anonymizing data are no longer adequate. According to a recent Stanford Law Review post, “Privacy in the Age of Big Data: A Time For Big Decisions,” anonymized data can, in fact, be re-identified and attributed to individuals. “The implications for government and businesses can be stark, given that de-identification has become a key component of numerous business models, most notably in the contexts of health data (regarding clinical trials, for example), online behavioral advertising, and cloud computing,” it notes.
So what is a big data executive to do? Boris Segalis, a partner with Information Law Group LLP, counsels corporations on data confidentiality, privacy, security and management issues, particularly with regard to big data and advertising. He told me that there are three “buckets” to consider when making data privacy decisions:
- Existing laws that apply to personal information. This is fairly straightforward. Industry regulations are generally known, and if not, corporate counsel can help determine which laws govern how information can be used.
- Guidance from the FTC, including the agency’s privacy enforcement precedent and the final report on protecting consumer privacy the FTC issued in March (available as a 112-page PDF). The report provides recommendations on data sharing.
- Consumer expectations. This is where things get dicey. There is always the potential for consumer backlash when those expectations are thwarted.
A good example of consumer expectations clashing with data mining, according to Segalis, is the New York Times Sunday Magazine article by Charles Duhigg last February that detailed, among other things, Target’s big data slice-and-dice methods aimed at marketing to pregnant women. Target’s real-world fallout: an expectant teenage mother was outed to her family by Target advertisements for cribs and baby clothes. Public opinion — including 570 comments and counting on the Times website — has not been fully supportive of Target’s big data promotional methods.
Segalis’s advice? He tells clients to first take the law into account. Consider, too, the contractual requirements attached to how data is acquired — privacy notices, partnership agreements — and comply with them. Then, he says, “do a gut check of consumer expectations.”