Back in 1633, the Italian mathematician, physicist and astronomer Galileo Galilei was sentenced by the Roman Inquisition to lifelong house arrest. His crime? Suggesting that the sun, not the earth, was the center of the universe.
Fast forward nearly 400 years and scientific opinions are again embroiled in controversy.
In October, a group of seven Italian scientists was convicted of manslaughter for failing to adequately communicate the potential threat of something they could not scientifically predict: the deadly 2009 earthquake in L’Aquila, Italy.
The fact that this group of highly regarded scientists has been sentenced to prison for providing what ABC News called “inaccurate, incomplete and contradictory information” — for giving people a false sense of security — raises a question: Could there be implications for organizations that use data science to guide decision making?
In other words, what happens when a strategy goes awry, a product flops or investors lose money? Can data analysts be held accountable?
The data mining newsletter KDnuggets asked its readers: Should data scientists and data miners be held responsible for their predictions? The results:
- 45% said no, data scientists/data miners should not be held responsible for their predictions;
- 37% said data scientists/data miners can be held financially responsible, if they also benefit from correct predictions;
- 5% said data scientists/data miners can be held criminally responsible for wrong predictions; and
- 13% weren’t sure which direction the wind was blowing.
Experts I spoke with agreed on two points. First, basing a decision on data is no guarantee that a strategy will work, though it typically means fewer failures. Second, basing a decision on data, where data is available, beats intuition alone.
Andrew McAfee, principal research scientist at MIT’s Center for Digital Business, points out that analytics-based decision makers shouldn’t be called out for a bad decision any more than their counterparts who do not use analytics. At the same time, says McAfee, “as more data becomes available, decision making should become more rigorous. If it does, there will be fewer bad ones.”
“A scientific approach is not a guarantee of success,” he continued. “But it is demonstrably better, in every sense of the word that we care about, than relying on human experts and human intuition.”
Anthony Goldbloom, CEO of Kaggle, uses a poker analogy to explain his thoughts on data scientist culpability. Kaggle hosts big data competitions in which thousands of data scientists from around the globe compete to build the best algorithm — everything from predicting customer churn to predicting molecular activity.
“You can play a very good hand and lose or you could play a very bad hand and win,” says Goldbloom. “Over the long run, if you’re a good poker player and you play clever poker, you’ll win more often than you’ll lose. But you are going to lose some hands. I think the analogy applies to data-driven decision-making, as well. If you’re basing your decision on hard data rather than intuition and gut feel, you should win and make good decisions more often.”
But here’s the other thing to keep in mind: A small team of data scientists can mean big bucks for an organization — win or lose.
“Because there’s so much leverage in some of these algorithms, the difference between a great data scientist working on an algorithm and a poor data scientist working on an algorithm can be in the order of hundreds of millions of dollars in ROI,” says Goldbloom. “So a great data scientist with a data set and an important problem can generate enormous value, whereas a poor data scientist with a data set trying to build a predictive modeling algorithm can cost a lot of money.”
As the dollar amounts tied to analytics get bigger and the stakes grow higher, will judgment revert to the Middle Ages?