University of Chicago professor Berkeley Dietvorst explains why we can’t let go of human judgment — to our own detriment.

Even when faced with evidence that an algorithm will deliver better results than human judgment, we consistently choose to follow our own minds.

Why?

MIT Sloan Management Review editor in chief Paul Michelman sat down with Berkeley Dietvorst, assistant professor of marketing at the University of Chicago Booth School of Business, to discuss a phenomenon Dietvorst has studied in great detail. (See “Related Research.”) What follows is an edited and condensed version of their conversation.

MIT Sloan Management Review: What prompted you to investigate people's acceptance, or lack thereof, of algorithms in decision-making?

Dietvorst: When I was a Ph.D. student, some of my favorite papers were old works by [the late psychology scholar and behavioral decision research expert] Robyn Dawes showing that algorithms outperform human experts at making certain types of predictions. The algorithms that Dawes was using were very simple and oftentimes not even calibrated properly.

A lot of others followed up on Dawes's work and showed that algorithms beat humans in many domains — in fact, in most of the domains that have been tested. There's all this empirical work showing that algorithms are the best alternative, but people still aren't using them.

So we have this disconnect between what the evidence says people should do and what people are doing, and no one was researching why.

What’s an example of these simple algorithms that were already proving to be superior?

Dietvorst: One of the areas was predicting student performance during an admission review. Dawes built a simple model: Take four or five variables — GPA, test scores, etc. — assign them equal weight, average them on a numerical scale, and use that result as your prediction of how students will rank against each other in actual performance. That model — which doesn’t even try to determine the relative value of the different variables — significantly outperforms admissions experts in predicting a student’s performance.
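The equal-weight model Dietvorst describes can be sketched in a few lines of Python. This is an illustrative sketch only: the applicant data and variable choices are hypothetical, and a real application of Dawes's approach would standardize against a historical cohort rather than the applicant pool itself.

```python
def standardize(values):
    """Scale a list of scores to mean 0, standard deviation 1,
    so variables on different scales (GPA, test scores) are comparable."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd for v in values]

def equal_weight_scores(applicants):
    """applicants: {name: [var1, var2, ...]}, same variables in the same
    order for everyone. Returns each applicant's equal-weight average of
    the standardized variables -- no attempt to fit 'proper' weights."""
    names = list(applicants)
    columns = zip(*(applicants[n] for n in names))          # one tuple per variable
    z_columns = [standardize(list(col)) for col in columns] # standardize each variable
    rows = zip(*z_columns)                                  # back to per-applicant rows
    return {n: sum(row) / len(row) for n, row in zip(names, rows)}

# Hypothetical applicants: [GPA, test score]
applicants = {
    "A": [3.9, 1450],
    "B": [3.2, 1550],
    "C": [2.8, 1200],
}
scores = equal_weight_scores(applicants)
ranking = sorted(scores, key=scores.get, reverse=True)  # predicted rank order
```

The point of the sketch is how little the model does: it never estimates how much GPA should count relative to test scores, yet models of exactly this form outperformed expert judges in Dawes's studies.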

2 Comments On: When People Don’t Trust Algorithms

  • Stacy Shamberger | July 6, 2017

I’ve seen so many examples, especially in business, where an algorithm is not correct and, when unpacked, shows a variety of weaknesses, some intentional, from the calculations to the actual data. I’ve also witnessed instances where the data in the algorithm is skewed to put a spin on the results.

It’s not the algorithm that should be questioned, but the data and the structure. Questioning data seems to have become part of human nature in this day and age, and rightly so, as data can be spun in so many ways.

I love the innocence of academia around these types of things. It is refreshing, and I hope it maintains its perceived “clean” standards. But what goes on inside the halls of our educational institutions may not be what is happening with data in the outside world.

    Cheers!

  • Abhijit Bhattacharya | July 20, 2017

It’s probably human nature to believe in our ability to beat the odds. That is why many people still keep buying lottery tickets (in fact, lotteries are a good source of revenue for governments in many parts of the world), even though mathematically it never makes any sense.
