The Four Traps of Predictive Analytics
Management consultant James Taylor explains how to avoid common mistakes in predictive analytics.
If the name James Taylor makes you think of “Fire and Rain,” Carly Simon and adult contemporary radio, you’re probably not into business analytics. On the other hand, if you are into business analytics, or more specifically predictive analytics, the name means something very, very different. The other James Taylor is British and the CEO of Decision Management Solutions, an analytics and management consultancy in Palo Alto, California — and, to the right audience, he’s a rock star.
Taylor was in Boston recently performing his “greatest flops” — a countdown of the ways companies fail when starting out with predictive analytics (drumroll, please):
The First Trap: Magical Thinking
Taylor said companies see analytics as a kind of magic: plug in some data and reap a profit windfall. The truth is, companies must understand what they want to know before they start analyzing things helter-skelter, especially when it comes to making predictions. He points out that there are really only four things businesses can use analytics to predict: risk, opportunity, fraud and demand.
Companies also can’t just build a model once and apply it everywhere. Each of the four areas will almost certainly need different models, and companies may find they need a different model for every question they ask, Taylor said.
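To make that concrete, here is a minimal sketch, not Taylor’s code, of what “a different model per question” looks like in practice. The datasets, column names, and model choices are hypothetical, but the point stands: the fraud question and the demand question get separate features, separate targets, and separate models.

```python
# Illustrative only: two different business questions, two separate models.
# Datasets and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

transactions = pd.read_csv("transactions.csv")  # hypothetical fraud data
orders = pd.read_csv("orders.csv")              # hypothetical demand data

# Question 1 (fraud): is this transaction fraudulent?
fraud_model = GradientBoostingClassifier().fit(
    transactions[["amount", "country_mismatch", "hour_of_day"]],
    transactions["is_fraud"],
)

# Question 2 (demand): how many units will this product sell next week?
demand_model = GradientBoostingRegressor().fit(
    orders[["price", "promo_flag", "week_of_year"]],
    orders["units_sold_next_week"],
)
```

Expecting either model to answer the other question is exactly the kind of magical thinking Taylor warns about.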
The Second Trap: Starting at the Top
Organizations often try to introduce predictive analytics at the top of the company to gain buy-in, Taylor said. But top executives make the kinds of decisions that don’t lend themselves to analytics, he argues. Predictive analytics works best on decisions that get made repeatedly, while top executives most often make strategic decisions, which, Taylor said, tend to be one-time situations. Other top-level decisions are tactical, and those, too, tend to be relatively complex and hard to formalize.
Operational decisions, by contrast, such as choosing a supplier or determining whether to extend credit, lend themselves well to predictive analytics. Companies need to recognize that this is where predictive analytics works best and start there, rather than at the executive level.
Companies also need to frame their predictive analytics around actions. “Don’t look at how good a customer is. Look at, what action should I offer to a customer? Should I change suppliers?” Taylor said.
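As a rough illustration of that action framing (the model, threshold, and actions below are assumptions, not anything Taylor prescribes), the output of the analytic step is a recommended action rather than a raw score:

```python
# A sketch of action-oriented framing: the function returns what to do,
# not how "good" the customer is. Model, threshold, and actions are assumptions.
def next_action(customer_features, churn_model, threshold=0.6):
    """Recommend an action for one customer instead of just reporting a score."""
    p_churn = churn_model.predict_proba([customer_features])[0][1]
    if p_churn >= threshold:
        return "offer_retention_discount"  # act on the prediction
    return "no_action"                     # leave the account alone
```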
Companies that don’t understand the kinds of decisions they want to make will struggle to get a return on their use of predictive analytics.
The Third Trap: Building Cottages, Not Factories
Focusing on decisions can help companies avoid another path to failure: creating analytic models that don’t scale. Analytics specialists are no more connected to the business than technology specialists, Taylor said; focusing on decisions can help bridge that gap. Otherwise, analytics specialists are prone to create the equivalent of a cottage industry, where the models built apply to only one thing, or are too complex and expensive to be reused easily.
He cited Netflix’s famous challenge, where it gave $1 million for a better algorithm to make movie recommendations. Its million-dollar model “was never deployed,” Taylor said. “They got a fabulous model, but ask them, and they will tell you that the resources weren’t available to use it. What they meant to fund was ‘a model that was more predictive that we can realistically deploy and run on our service in Earth time.’ They didn’t ask for that.”
“I advise against spending $1 million on a model that’s too hard to use,” Taylor said. To be fair, a complex, hard-to-implement program was not the sole reason Netflix did not deploy the model.
Still, he said, companies often want to just pilot a couple of predictive algorithms, but don’t plan for them to succeed. If the pilots weren’t built with scaling up in mind, it often means there is no good way to “industrialize” the models so they can be used broadly in an organization. Companies should test their models, and then learn from them and adapt them, looking for ways to scale the impact of their decisions.
The Fourth Trap: Seeking Purified Data
“Garbage in, garbage out” is the cliché of data-haters everywhere. “It is not true that companies need good data to use predictive analytics,” Taylor said. “The techniques can be robust in the face of terrible data, because they were invented by people who had terrible data,” he noted.
For instance, Thomas Bayes, of Bayesian probability fame, lived in the 1700s. His techniques were developed despite thin, poor data sets. Taylor said in a follow-up conversation that he sees people in business who are “paralyzed about their data; they say it’s not very good, it’s not very clean, therefore I can’t do advanced analytics.”
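A minimal sketch shows why thin data isn’t a dealbreaker for this kind of math. The numbers below are invented, but the Beta-Binomial update they walk through is the textbook conjugate form of Bayesian inference, and it runs happily on a handful of noisy observations:

```python
# A Beta-Binomial update on a thin, noisy dataset: seven observations of
# whether a customer converted (1) or not (0). All numbers are invented.
alpha, beta = 1.0, 1.0            # Beta(1, 1) prior: "no idea yet"
observations = [1, 0, 0, 1, 0, 1, 0]

for outcome in observations:
    alpha += outcome              # each success raises alpha
    beta += 1 - outcome           # each failure raises beta

posterior_mean = alpha / (alpha + beta)
print(f"Estimated conversion rate: {posterior_mean:.2f}")  # ~0.44
```

The estimate is uncertain, but it is an estimate, and it sharpens with every additional observation — which is the practical point Taylor is making.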
Taylor says good data is useful, of course, but companies should start with the business decision they want to make and then look for data that might help them predict outcomes. Remember that the needed data may come from outside corporate walls. And for many either/or business decisions, the data doesn’t have to be pristine.
Avoiding the Traps, Step By Step
To put a positive spin on the Four Traps, here are the Four Steps companies need to take to build the predictive enterprise:
- Focus on the decisions you want to improve.
- Make analytics as broadly usable as you can.
- Start with models you can scale.
- Land and expand — Taylor’s slang for continuous improvement in the models: use them repeatedly, make them better, and expand into new areas of the business.