Designing AI Systems That Customers Won’t Hate

Privacy concerns get most of the attention from tech skeptics, but powerful predictive algorithms can generate serious resistance by threatening consumer autonomy. Three safeguards can help.

The nexus of big data analytics, machine learning, and AI may be the brightest spot in the global economy right now. The McKinsey Global Institute estimates that the use of AI will add as much as $13 trillion to global GDP by 2030.1 The noneconomic benefits to humankind will be equally dramatic, leading to a world that is safer (by reducing destructive human error) and offers people a better quality of life (by reducing the time they spend on tedious tasks such as driving and shopping). Even if the coming automation-driven disruption of labor markets is as serious as many fear, we are still, on balance, likely to be better off than today.

But not everyone is convinced. Negative predictions center on two overarching concerns that are related yet distinct. First, there is the issue of data privacy. After all, AI runs on data, and people are understandably uneasy about the things that automation technologies are learning about them, and how their private information might be used. Privacy in the digital age has been extensively researched and written about, and companies are devoting increasing attention to allaying their customers’ fears.2

However, there is another consideration that many companies have yet to seriously think about: autonomy. Though autonomous technology has a large and growing range of potential applications, when taken too far, it may also threaten users' sense of autonomy and free will — their belief that they are free to decide how to live their lives. A recent study found that when customers believed their future choices could be predicted from their past choices, they chose less-preferred options. In other words, consumers violated their own preferences in order to reestablish their sense of autonomy by not choosing predictably.

The conflicting relationship between technology and free will is not new; it was described by Fyodor Dostoevsky’s Underground Man, the alienated protagonist of the 1864 novel Notes From Underground. Even in a utopian society, he argued, humans would rebel purely to prove “that men are still men and not the keys of a piano.”

Unfortunately, this perceived threat is likely to worsen as technological innovation accelerates and autonomous devices move into new areas of customers' lives. We recently reached this conclusion after surveying diverse perspectives from philosophy, marketing, economics, and other fields.

References

1. J. Bughin, J. Seong, J. Manyika, et al., “Notes From the AI Frontier: Modeling the Impact of AI on the World Economy,” McKinsey Global Institute, September 2018.

2. C. Tucker, “Privacy, Algorithms, and Artificial Intelligence,” in “The Economics of Artificial Intelligence: An Agenda,” eds. A. Agrawal, J. Gans, and A. Goldfarb (Chicago: University of Chicago Press, 2019): 423-437.

3. Q. André, Z. Carmon, K. Wertenbroch, et al., “Consumer Choice and Autonomy in the Age of Artificial Intelligence and Big Data,” Consumer Needs and Solutions 5, no. 1-2 (March 2018): 28-37.

4. C. Longoni, A. Bonezzi, and C.K. Morewedge, “Resistance to Medical Artificial Intelligence,” Journal of Consumer Research 46, no. 4 (December 2019): 629-650.

5. C. Welch, “Google Just Gave a Stunning Demo of Assistant Making an Actual Phone Call,” The Verge, May 8, 2018, www.theverge.com.

6. S. Lim, S.M.J. van Osselaer, C. Fuchs, et al., “Made for You: The Effect of Consumer Identification on Consumer Preference,” in “NA - Advances in Consumer Research Volume 44,” eds. P. Moreau and S. Puntoni (Duluth, Minnesota: Association for Consumer Research, 2016): 118-122.

7. H. Abdulhalim, P. Kireyev, G. Tomaino, et al., “Explaining Algorithmic Decisions to Customers,” research in progress.

8. G. Mortimer, F. Mathmann, and L. Grimmer, “How the ‘Ikea Effect’ Subtly Influences How You Spend,” BBC, April 22, 2019, www.bbc.com.

9. R. Brown, “Social Psychology” (New York: The Free Press, 1965).

10. K. Wertenbroch, “Consumption Self-Control by Rational Purchase Quantities of Vice and Virtue,” Marketing Science 17, no. 4 (November 1998): 317-337.

11. L. Van Boven, P.J. Ehret, and D.K. Sherman, “Psychological Barriers to Bipartisan Public Support for Climate Policy,” Perspectives on Psychological Science 13, no. 4 (July 2018): 492-507.

12. J. Valentino-DeVries, N. Singer, M.H. Keller, et al., “Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret,” New York Times, Dec. 10, 2018, www.nytimes.com.

13. E. Pan, J. Ren, M. Lindorfer, et al., “Panoptispy: Characterizing Audio and Video Exfiltration From Android Applications,” Proceedings on Privacy Enhancing Technologies 2018, no. 4 (October 2018): 33-50.

14. “YouTube Fined $170 Million in US Over Children’s Privacy Violation,” BBC, Sept. 4, 2019, www.bbc.com.

15. G. Brockell, “Dear Tech Companies, I Don’t Want to See Pregnancy Ads After My Child Was Stillborn,” Washington Post, Dec. 12, 2018, www.washingtonpost.com.

16. T. Romm, “Apple’s Tim Cook Blasts Silicon Valley Over Privacy Issues,” Washington Post, Oct. 24, 2018, www.washingtonpost.com.

Reprint #: 61315
