New applications of big data and digital technology are a form of “market manipulation,” contends new research from the University of Washington School of Law.
Social networking and digital advertising are colliding at a dizzying rate.
Facebook, which has over 1 billion users (and which bought Instagram last year), introduced News Feed and mobile ads in 2012. This year, it’s launching video ads. And Facebook is forecast to triple its mobile ad share this year, putting it second only to Google, still the “big kahuna” in the digital ad space.
Twitter, which has more than 200 million users, just bought MoPub, a digital advertising platform that essentially creates an ad space that is sold and delivered in milliseconds, every time a user views a page.
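The article doesn’t spell out MoPub’s mechanics, but the “sold and delivered in milliseconds” model it describes is real-time bidding: each page view triggers an instant auction among advertisers. A minimal sketch of one common clearing rule, the second-price auction, might look like this (all names and numbers are hypothetical, not MoPub’s actual API):

```python
# Toy sketch of a real-time bidding (RTB) auction: every page view
# triggers an auction among advertisers, resolved before the page
# finishes loading. Names and bids here are illustrative only.

def run_auction(bids):
    """Second-price auction: the highest bidder wins the impression
    but pays the second-highest bid, a common RTB clearing rule."""
    if not bids:
        return None
    ranked = sorted(bids, key=lambda b: b["bid"], reverse=True)
    winner = ranked[0]
    clearing_price = ranked[1]["bid"] if len(ranked) > 1 else winner["bid"]
    return {"advertiser": winner["advertiser"], "price": clearing_price}

# Hypothetical bids submitted for a single ad impression.
bids = [
    {"advertiser": "shoes_co", "bid": 2.40},
    {"advertiser": "travel_co", "bid": 1.90},
    {"advertiser": "games_co", "bid": 0.75},
]
result = run_auction(bids)
```

Here `shoes_co` wins the impression but pays only the runner-up’s bid of 1.90, which is what gives bidders an incentive to bid their true valuation.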
What does this all mean for the relationship between businesses and consumers? The short answer: the looming “sea change in the way companies use data to persuade” creates an ever-increasing opportunity to exploit users, according to new research out of the University of Washington School of Law.
In his paper, “Digital Market Manipulation,” professor M. Ryan Calo (who is also an affiliate scholar at The Center for Internet and Society at Stanford Law School) suggests that the concept of market manipulation, first floated by researchers in 1999, is outmoded. Calo updates that framework to include the realities of a marketplace that is “mediated by technology” — the laptops, tablets, smartphones and other devices consumers use to get online.
According to the study, advertisers collect data about consumers and, increasingly, use that data to personalize every aspect of their users’ experience. They not only can take advantage of a general understanding of cognitive behavior — and limitations — but can “uncover and trigger consumer frailty at an individual level.”
For the record, Calo and other researchers define market manipulation as essentially “nudging for profit.” They believe that companies will use what they know about human psychology to “set prices, draft contracts, minimize perceptions of danger or risk, and otherwise attempt to extract as much rent as possible from their consumers.”
Calo lays out three phenomena, all inextricably tied to data, that can increase the potential for market manipulation:
- The Mass Production of Bias: “Increasingly, firms are turning to big data to help them monetize the enormous volume of information their businesses collect, generate, or buy. And one of the datasets to which firms have access is consumer behavior.”
- Disclosure Ratcheting: There are many ways in which consumers reveal more information than they intend to, says Calo. The danger lies in companies’ ability to apply testing at massive scale to uncover consumers’ biases and beliefs, then use that information to gather still more information, and ultimately to manipulate consumers based on a detected bias.
- Means-based Targeting: Behavioral targeting is nothing new in digital advertising. Calo points out that companies are even moving into the offline world to, for example, track consumers through their mobile devices as they move through a mall — and target them accordingly. What’s different is that digital advertising now is about relevance: matching the right advertising pitch with the right person, based on the premise that people vary in their susceptibility to various forms of persuasion.
“The trouble comes when firms start looking for vulnerability. Emerging methods of big data present a new, vastly more efficient way to surface cognitive bias by trying to surface profitable anomalies… Note again that the algorithm that identifies the bias need not yield a theory as to why it is happening to be useful… The firm will not care, as long as the firm can exploit the deviation for its benefit.”
“A large company, meanwhile, is not limited by the forms and formalities of laboratories. Companies operate at scale. Companies can — through A/B testing — experiment on thousands of consumers at once… it could spell even greater data promiscuity than what we see today.”
“A firm with the resources and inclination will be in a position to surface and exploit how consumers tend to deviate from rational decision making on a previously unimaginable scale. Firms will increasingly be in the position to create suckers, rather than waiting for one to be born every minute.”
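The A/B experimentation Calo describes can be sketched in a few lines. This is a toy simulation under stated assumptions, not any company’s actual system: two variants of a pitch with different (unknown to the firm) response rates are shown to a large pool of users, and the firm simply adopts whichever variant converts better.

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Hypothetical underlying response rates the firm cannot observe
# directly; the experiment's job is to reveal which variant wins.
TRUE_RATES = {"A": 0.050, "B": 0.065}

def run_experiment(n_users_per_variant):
    """Show each variant to n users and return the observed
    conversion rate per variant."""
    observed = {}
    for variant, rate in TRUE_RATES.items():
        conversions = sum(random.random() < rate
                          for _ in range(n_users_per_variant))
        observed[variant] = conversions / n_users_per_variant
    return observed

rates = run_experiment(100_000)
winner = max(rates, key=rates.get)
```

The point of Calo’s observation is the scale: at 100,000 users per arm, even a 1.5-point difference in response rate is detectable with no theory of *why* variant B works better, which is exactly the atheoretical exploitation the quoted passage describes.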
The question is, when does personalization become an issue of consumer protection? When there is economic and privacy harm, says Calo.
“This project stems from a basic observation: when a firm has both detailed information about consumers, and the ability to design every aspect of the interaction, there is always the potential for abuse,” said Calo in an email. “We need to think about realigning the incentives of companies and consumers around data so that digital marketing does not evolve in unfortunate directions.”
His new theory of digital market manipulation pushes the limits of consumer protection law, says Calo. It also advances economic and privacy harms that regulators will find themselves hard-pressed to ignore.