The Four Traps of Predictive Analytics

Management consultant James Taylor explains how to avoid the most common mistakes in predictive analytics.

Reading Time: 4 min 

Topics

Competing With Data & Analytics

How does data inform business processes, offerings, and engagement with customers? This research looks at trends in the use of analytics, the evolution of analytics strategy, optimal team composition, and new opportunities for data-driven innovation.

If the name James Taylor makes you think of “Fire and Rain,” Carly Simon and adult contemporary radio, you’re probably not into business analytics. On the other hand, if you are into business analytics, or more specifically predictive analytics, the name means something very, very different. The other James Taylor is British and the CEO of Decision Management Solutions, an analytics and management consultancy in Palo Alto, California — and, to the right audience, he’s a rock star.

Taylor was in Boston recently performing his “greatest flops” — a countdown of the things companies get wrong when they start out with predictive analytics (drumroll, please):

The First Trap: Magical Thinking

Taylor said companies see analytics as a kind of magic — plug in some data and reap profit windfalls. The truth is, companies must understand what they want before they go analyzing things helter-skelter, especially when it comes to making predictions. He points out that there are really only four things businesses can use analytics to predict: risk, opportunity, fraud, and demand.

Companies also can’t just build a model once and apply it everywhere. Each of the four areas will almost certainly need different models, and companies may find they need a different model for every question they ask, Taylor said.

The Second Trap: Starting at the Top

Organizations often try to start using predictive analytics at the top of the organization to gain buy-in, Taylor said. But top executives make the kind of decisions that don’t lend themselves to analytics, he argues. Predictive analytics works best on decisions that get made repeatedly, but top executives most often make strategic decisions, which, Taylor said, tend to be one-time situations. Other top-level decisions are often tactical, which are also relatively complex and hard to formalize.

But operational decisions, such as choosing a supplier or deciding whether to extend credit, lend themselves well to predictive analytics. So companies should recognize that predictive analytics works best for informing operational decisions, rather than trying to introduce it at the executive level.

Companies also need to frame their predictive analytics around actions. “Don’t look at how good a customer is. Look at, what action should I offer to a customer? Should I change suppliers?” Taylor said.
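To make that idea concrete, here is a minimal sketch of what action-framing might look like in code. It is not something Taylor prescribes; the churn model, thresholds, customer values, and offers are hypothetical assumptions for illustration only. The point is simply that the output is an action, not a score.

```python
# Hypothetical sketch: frame the model output as an action, not a score.
# The thresholds, customer values, and offers are illustrative assumptions,
# not anything Taylor or Decision Management Solutions prescribes.

def choose_retention_action(churn_probability: float, customer_value: float) -> str:
    """Map a predicted churn risk and a customer's value to a concrete next action."""
    if churn_probability < 0.2:
        return "no action"                   # low risk: don't spend on retention
    if customer_value > 1_000:
        return "assign account manager"      # at-risk and high value: personal outreach
    if churn_probability > 0.6:
        return "offer renewal discount"      # high risk, modest value: automated offer
    return "send re-engagement email"        # moderate risk: low-cost nudge


if __name__ == "__main__":
    print(choose_retention_action(churn_probability=0.7, customer_value=250))
    # -> offer renewal discount
```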

Companies that don’t understand the kinds of decisions they want to make will struggle to get a return on their use of predictive analytics.

The Third Trap: Building Cottages, Not Factories

Focusing on decisions can help companies avoid another path to failure: creating analytic models that don’t scale. Analytics specialists are no more connected to the business than technology specialists, Taylor said; focusing on decisions can help bridge that gap. Otherwise, analytics specialists are prone to create the equivalent of a cottage industry, where the models built apply to only one thing, or are too complex and expensive to be reused easily.

He cited Netflix’s famous challenge, where it gave $1 million for a better algorithm to make movie recommendations. Its million-dollar model “was never deployed,” Taylor said. “They got a fabulous model, but ask them, and they will tell you that the resources weren’t available to use it. What they meant to fund was ‘a model that was more predictive that we can realistically deploy and run on our service in Earth time.’ They didn’t ask for that.”

“I advise against spending $1 million on a model that’s too hard to use,” Taylor said. To be fair, a complex, hard-to-implement program was not the sole reason Netflix did not deploy the model.

Still, he said, companies often want to just pilot a couple of predictive algorithms, but don’t plan for them to succeed. If the pilots weren’t built with scaling up in mind, it often means there is no good way to “industrialize” the models so they can be used broadly in an organization. Companies should test their models, and then learn from them and adapt them, looking for ways to scale the impact of their decisions.
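One common way to keep a pilot from staying a cottage model, sketched below under assumptions of our own rather than as Taylor's method, is to build even the pilot as a single pipeline artifact that bundles preprocessing with the model, so the same object can be versioned, handed to other teams, and run in production.

```python
# A sketch of building for the "factory" rather than the cottage: even a
# pilot model is packaged as one reusable pipeline artifact. The toy data,
# model choice, and file name are assumptions for illustration.

import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Toy data standing in for the pilot's training set.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Preprocessing travels with the model, so downstream users can't apply it inconsistently.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("model", LogisticRegression()),
])
pipeline.fit(X, y)

# Persist the whole artifact; any other system scores new data exactly as the pilot did.
joblib.dump(pipeline, "pilot_model.joblib")
```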

The Fourth Trap: Seeking Purified Data

“Garbage in, garbage out” is the cliché of data-haters everywhere. “It is not true that companies need good data to use predictive analytics,” Taylor said. “The techniques can be robust in the face of terrible data, because they were invented by people who had terrible data,” he noted.

For instance, Thomas Bayes, of Bayesian probability fame, lived in the 1700s. His techniques were developed despite thin, poor data sets. Taylor said in a follow-up conversation that he sees people in business who are “paralyzed about their data; they say it’s not very good, it’s not very clean, therefore I can’t do advanced analytics.”
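As a rough illustration of the point about thin data, the sketch below runs a Beta-Binomial Bayesian update on just seven hypothetical observations. The prior and the counts are invented for the example; the output is an uncertain but still usable estimate.

```python
# A made-up example of a Bayesian update on thin data: seven credit
# applications, two defaults, and a weak Beta(1, 1) prior. The numbers are
# invented; the point is that sparse, noisy data still yields a usable
# (if appropriately uncertain) estimate.

from scipy import stats

defaults, applications = 2, 7
prior_alpha, prior_beta = 1.0, 1.0  # uniform prior over the default rate

# Posterior is Beta(alpha + defaults, beta + non-defaults).
posterior = stats.beta(prior_alpha + defaults,
                       prior_beta + (applications - defaults))

print(f"Posterior mean default rate: {posterior.mean():.2f}")
low, high = posterior.interval(0.95)
print(f"95% credible interval: {low:.2f} to {high:.2f}")
```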

Taylor says good data is useful, of course, but companies should start with the business decision they want to make and then look for data that might help them predict outcomes. Remember that the needed data may come from outside corporate walls. For many either/or business cases, the data may not need to be pristine.

Avoiding the Traps, Step By Step

To put a positive spin on the Four Traps, here are the Four Steps companies need to take to build the predictive enterprise:

  1. Focus on the decisions you want to improve.
  2. Make analytics as broadly usable as you can.
  3. Start with models you can scale.
  4. Land and expand — Taylor’s slang for continuous improvement in the models: use them repeatedly, make them better, and expand into new areas of the business.

Comments (3)
HORACIO CARVAJAL
This article explains the main issues for an analytics project, but I would stress the importance of data. In my experience, data usually brings many unexpected findings, which may end up transforming the project's goal or the question it is meant to answer, or simply require additional time to create the right data set.
Henry Aguillon
The first and second traps as presented above have been far too common in my experience. A layperson's understanding of what the predictive focus should be is often applied to many different problems, and because the objectives were never properly researched, it gets miscommunicated to higher levels and mis-scoped in terms of capabilities and capacity. What you get are repetitive pitches telling stakeholders what they want to hear.

This in turn results in overreaching expectations and a lack of results, and of course a waste of time and resources.

I appreciate this article; great job exploring the tunnel vision in predictive analytics.
Praveen Kambhampati
A great article with a simple solution; it reflects the maturity, depth, and expertise gained from consulting assignments. Thanks for sharing.

The traps bear some similarity to ERP implementation assignments at manufacturing companies. Interestingly, IT-enabled companies today train and assign their finance people to predictive analysis and expect one silver bullet to undo tomorrow's investment mistakes: a desperate deployment for future-proofing based on one-time modeling.

As the article makes clear, predictive analysis is not a one-time task but the ongoing evolution of a decision support system. More realistic inputs make the mirror cleaner, giving a high-definition reflection of the organization's status as it stands. The refinement and data inputs need to come from more areas, and importantly from departments other than financial accounting and P&L, for predictive capability to support better decision making.

Also, organizations paranoid about data security may find it difficult to benefit from predictive analysis. They may have to apply the methods and models separately, multiple times, to gain a consolidated view of the analysis performed.