The Secrets to Managing Business Analytics Projects

Business analytics projects are often characterized by uncertain or changing requirements — and a high implementation risk. So it takes a special breed of project manager to execute and deliver them.


Managers have used business analytics to inform their decision making for years. Numerous studies have pointed to its growing importance, not only in analyzing past performance but also in identifying opportunities to improve future performance.1 As business environments become more complex and competitive, managers need to be able to detect or, even better, predict trends and respond to them early.2 Companies are giving business analytics increasingly high priority in hopes of gaining an edge on their competitors. Few companies would yet qualify as being what management innovation and strategy expert Thomas H. Davenport has dubbed “analytic competitors,” but more and more businesses are moving in that direction.3

Against this backdrop, we set out to examine what characterizes the most experienced project managers involved in business analytics projects. Which best practices do they employ, and how would they advise their less experienced peers? Our goal was to fill in gaps in management’s understanding of how project managers involved in analytics projects can contribute to the new intelligent enterprise. (See “About the Research.”) We found that project managers’ most important qualities can be sorted into five areas: (1) having a delivery orientation and a bias toward execution, (2) seeing value in use and value of learning, (3) working to gain commitment, (4) relying on intelligent experimentation and (5) promoting smart use of information technology.


1. Having a Delivery Orientation and a Bias Toward Execution

As a starting point, it’s important to understand what makes experienced business analytics project managers tick. The vast majority of our interviewees do not consider themselves different from other project managers. Like other focused project managers, they want to deliver their projects on time and on budget, and they have a strong delivery orientation.

But unlike many traditional project managers, they do not have a plan bias. Instead, they have a strong bias toward execution. (See “Learning From Experience.”) Although our interviewees don’t question the importance of initial planning, their focus is on project execution and delivery as opposed to adherence to the plan. In fact, they start with the assumption that the initial plan will have to change as the project progresses. This is what we mean by “a bias toward execution.”


Why do analytics project managers have this execution bias? Many say it is because of the inherent complexity of the projects themselves, and they cite three reasons. First, analytics projects are typically characterized by uncertain or changing requirements. Project sponsors and users will often have a vision of what they seek to accomplish with analytics — for example, to improve direct marketing response, reduce inventory or increase service quality and customer satisfaction while controlling costs. But how they will achieve those goals is often unclear and involves further exploration.

The Leading Question

How do experienced business analytics project managers approach their projects?

Findings
  • They start with the assumption that the initial plan will have to change as the project progresses.
  • They gain commitment by engaging stakeholders, explaining the analytics and managing expectations.
  • They rely on intelligent experimentation.

Second, the technology or models for meeting the uncertain requirements are often not known; they may be new to the team, or they may not even exist. This adds to the exploratory nature of analytics projects. Third, users of business analytics applications expect the applications to respond quickly to their interactions, yet the applications must also remain robust. The challenge, then, is to strike a balance between responsiveness and robustness.

Traditional project management methods tend to focus primarily on planning or a priori risk management (as opposed to managing and mitigating risk during execution). However, the uncertainty associated with analytics projects calls for a different approach.4 A growing body of literature on project management emphasizes the importance of adapting management and processes to the project characteristics. So while there may be a set of general-purpose tools for managing projects, different projects call for different managerial approaches. On the one hand, production-oriented and specifications-based approaches emphasize detailed early planning and requirements specification with minimal ongoing change and exploration. On the other, experimentation-based approaches emphasize less-specific early planning, good-enough requirements, and experimental and evolutionary design with significant ongoing learning and change.5 The latter, more adaptive approach, interviewees say, is better suited to analytics projects.

2. Seeing Value in Use and Value of Learning

There is increasing awareness in the project management community that sticking to the original plan does not necessarily provide value. Instead, value comes from a focus on execution and delivery. Experienced analytics project managers say they approach return on investment as a process rather than as a control metric. By focusing on execution, they seek to add value throughout the project’s life cycle, not just at the end of the project.

Our interviewees are guided by the concept of “value in use,” which measures value in terms of how a given asset provides benefits to a specific owner under a specific use.6 The idea is that the assets themselves have no inherent value; they generate value only when they offer specific benefits to their owners or users (for example, by allowing them to do their work differently). Consequently, only when an analytical model or application is actually used can its real benefits (and costs) be identified.

We found that project managers involved in analytics projects usually want to assess the value of the project quickly and accurately. Interviewees explained how they try to capture value both early in a project and throughout (for example, by using iterative feature-based delivery or rapid prototyping). Indeed, capturing value early and often can significantly improve a project’s ROI. For the assessment of value to be accurate, it needs to be carried out with a certain degree of rigor — which, as we have noted and will discuss later, is what our interviewees do.
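
To make the arithmetic behind capturing value early concrete, here is a minimal, hypothetical sketch in Python. It compares the net present value of two delivery plans with the same total benefit and cost: one that releases features (and their benefits) every iteration, and one that delivers everything at the end. The discount rate, costs and benefit figures are invented for illustration and are not drawn from our interviews.

```python
# Hypothetical illustration: net present value (NPV) of iterative, feature-based
# delivery versus a single "big bang" delivery at the end of the project.
# All figures (discount rate, per-period cost, benefits) are assumptions.

def npv(cash_flows, rate):
    """Net present value of per-period cash flows at a per-period discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

rate = 0.02              # assumed discount rate per iteration period
cost_per_period = -50    # assumed project cost per period (identical in both plans)
benefits = (10, 25, 40, 60, 80, 100)   # assumed benefit unlocked by each iteration

# Plan A: iterative delivery -- some benefit is realized every period.
iterative = [cost_per_period + b for b in benefits]

# Plan B: big-bang delivery -- the same total benefit, realized only in the last period.
big_bang = [cost_per_period] * (len(benefits) - 1) + [cost_per_period + sum(benefits)]

print(f"NPV, iterative delivery: {npv(iterative, rate):7.1f}")
print(f"NPV, big-bang delivery:  {npv(big_bang, rate):7.1f}")
```

Even with identical totals, the iterative plan comes out ahead, because benefits realized early are worth more than the same benefits realized late.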

Many project managers have learned through experience that they can’t expect to be right the first time. A bias toward execution is essential, interviewees report, because it is better to attempt to execute good ideas quickly than to attempt to impose the “perfect” plan. This implies that the focus is not on explaining discrepancies between the plan and actual results but on learning something new in the course of implementation that might justify altering the plan. Similarly, the iterative, incremental delivery described by interviewees assumes that each iteration provides learning inputs.

As a result, project benefits can be expressed in terms of “value in use” and “value of learning” that accrue during the project. Many analytics project managers have adopted project management approaches that tie in with the project management methods that are being developed to support highly complex projects. Adaptive methods assume there is a need to gather information and learn as you go along. These methods typically emphasize rapid delivery of prototypes and require that those involved be allowed to experiment during the project.7

The success of an analytics project is a function of the user’s acceptance of the model or the application. Our data make a convincing case for the value of continuous exposure to user feedback. As a project manager at a European financial services group explained, “Ideally, analysts and users are physically in the same room, or in close proximity.” The design environment and the operating environment should be closely linked, with the analytics project managers facilitating continuous interaction between them.

3. Working to Gain Commitment

Experienced project managers are unequivocal about the importance of engaging business users and other stakeholders as much as possible, as opposed to merely informing them after the fact. As an enterprise-business-intelligence architect at an international transport solutions provider put it: “We don’t want to develop a model just like that. If the business processes aren’t aligned with the model, or if the business doesn’t understand the definitions used in the model, then it simply won’t be used.”

The importance of explaining or clarifying the thinking behind a decision — or, in this case, the analytical model or application — cannot be overestimated. Indeed, interviewees say that one of the major risks of analytics projects is that the decision makers won’t be savvy enough to understand the analysis or the model’s underlying assumptions, and they will try to apply it where it isn’t applicable. Explanation is also crucial in gaining trust, as one project manager at a financial institution notes: “Gut feeling and intuition still take precedence over analytics. No matter how transparent analytical models are, they are inevitably statistically complex. That’s why users find it difficult to put their faith in quantitative data and methods.” And this is why analytics project managers should be pedagogical experts and help open up the black box of analytic models.

Ultimately, our interviewees agree that expectation management should not be overlooked. Setting the right expectations at the beginning (for example, regarding the quality of the data and the applicability of the models) and managing them as the project progresses increases both acceptance and the chances that the project will be successful.

The process described above bears a strong resemblance to what W. Chan Kim and Renée Mauborgne, recognized thought-leaders and authorities on business strategy, innovation and wealth creation, have described as “fair process.”8 Process fairness has proven its worth in diverse management contexts as a way to gain stakeholder commitment to decisions and change.

4. Relying on Intelligent Experimentation

A key element that emerges from the interview data is the importance of experimentation. Many of the analytics project managers we spoke to consider experimentation fundamental to the learning process. This is consistent with leading research by Harvard University professor and innovation management authority Stefan H. Thomke, who defines experimentation as a fundamental innovation-process activity, consisting of iterative trial and error and directed by insight.9 The execution of an experiment, then, follows a four-step cycle: design, build, run and analyze.

The quality of the experimentation process has a strong bearing on the extent to which the project succeeds. Interviewees tend to be strong advocates of “good experimentation,” which is consistent with the scientific method. Well-designed experiments need clear goals and objectives, which is why the first steps of the scientific method devote significant time and effort to observation, to specifying the questions the experiment is intended to answer and to background research. Project managers need to invest time upfront examining the analytics project and setting the objectives. Good experiments need measurable hypotheses about the expected outcomes and controlled testing of these hypotheses. Interviewees reported spending significant amounts of time setting up the experiments and analyzing the results. What they learn forms the basis for improvements and for the next batch of experiments.
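
A minimal sketch of what such a controlled test might look like in practice, written in Python and assuming a direct-marketing experiment with a treatment group (scored by a new analytical model) and a control group. The group sizes, response counts and the helper function are hypothetical; they only illustrate the idea of stating a measurable hypothesis and testing it against a control.

```python
# Hypothetical controlled experiment: does the model-driven campaign (treatment)
# lift response rates over business-as-usual (control)? All counts are invented.
from math import sqrt, erfc

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two response rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)       # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error under H0
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))                          # two-sided p-value
    return z, p_value

# Hypothesis: the treatment response rate exceeds the control response rate.
z, p = two_proportion_z_test(successes_a=260, n_a=5000,   # treatment: 5.2% respond
                             successes_b=200, n_b=5000)   # control:   4.0% respond
print(f"observed lift: {260/5000 - 200/5000:.1%}, z = {z:.2f}, p = {p:.4f}")
```

The point is not the particular statistic but the discipline: the hypothesis is stated before the campaign runs, the control group is defined up front, and the analysis of the results feeds the next batch of experiments.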

However, most interviewees recognize that a laboratory-style scientific approach is neither appropriate nor practical. They take a more pragmatic10 approach to experimentation. Still, their process has enough rigor to allow a sufficiently accurate assessment of a project’s benefits. Interviewees acknowledge that business analytics is still a relatively new area and that some companies are still learning to incorporate analytics into their business. As a business analytics coordinator at an international food retailer put it: “Our policy is clear: Not using a control group is no option. But the reality is sometimes different. Many of our people are still trying to get to grips with analytics. Conflicts won’t help our case.” Successful, experienced project managers try to move stakeholders up the learning curve and coach them through the cultural change.

5. Promoting Smart Use of Information Technology

So far, information technology has been conspicuously absent from this discussion. Yet intelligent use of IT can allow for frequent and faster iterations between the design and operating environments, and this can improve experimentation efficiency. MIT professor and global IT expert Erik Brynjolfsson, who coined the phrase “IT productivity paradox,”11 has noted that leading companies leverage IT to revolutionize the way in which they innovate by playing on four dimensions simultaneously: measurement, experimentation, sharing and replication.12 The big advantage of IT-based experimentation, he argues, is that it can trace causality in a way that would be impossible with pure measurement and observation.

IT capabilities are key to helping companies to explore as well as exploit their full potential in turbulent markets. But while business has embraced IT capabilities, it is often far less positive about the IT department. Many of the analytics project managers we interviewed identify with the business side of their organizations even if they report to the IT department. They often trace their frustration with IT to negative experiences. As one explained: “Whenever IT is involved, business analytics projects cost more and take more time than planned. They are hideously inflexible. It’s virtually impossible to go ahead with anything at short notice. And not only that, they just speak another language. They really lack the sense of urgency and pragmatism you find on the business side.”

IT departments that ignore complaints from the business side risk being circumvented. In fact, some of the analytics teams we encountered built valuable models and applications independently from the IT department. Still, bypassing the IT department altogether can be counterproductive, especially when the focus is on delivering enterprise value rather than locally optimized solutions and functional value. When enterprise value is the goal, the most appropriate modus operandi is to approach analytics projects as partnerships between the business side and IT.13

Research suggests that best-in-class CIOs have realized that IT and business need to find better ways to work together.14 By proposing pragmatic solutions and pointing out the consequences of infrastructural decisions, CIOs can become constructive partners, enabling their businesses to make smarter choices. This means that IT should, where possible, pursue opportunities to deliver faster implementation cycles, maintaining just enough process and architectural hygiene to ensure quality and professional support.

But what is just enough process and infrastructure? Enterprise infrastructure remains a long-term investment. The big challenge is to develop a process that provides for flexible infrastructure even as the process itself — the way applications and infrastructure are built and modified — remains stable. This will require infrastructure architects with mind-sets much like those of the analytics project managers we interviewed. Indeed, these architects will need to have a strong bias toward execution, so that IT solutions and infrastructure are rooted in the present without mortgaging the future.

Two things are certain. First, the boundaries between functional domains are blurring within organizations, requiring cross-functional collaboration. Second, it will take experience-based negotiation, not theoretical design, to create just enough process and infrastructure. This is a vital area where experienced analytics project managers can put their interpersonal skills to good use.


References

1. S. LaValle, E. Lesser, R. Shockley, M.S. Hopkins and N. Kruschwitz, “Big Data, Analytics and the Path From Insights to Value,” MIT Sloan Management Review 52, no. 2 (2011): 21-32.

2. See, for example, G. Schreyögg and M. Kliesch-Eberl, “How Dynamic Can Organizational Capabilities Be? Towards a Dual-Process Model of Capability Dynamization,” Strategic Management Journal 28, no. 9 (2007): 913-933; and O.A. El Sawy and P.A. Pavlou, “IT-Enabled Business Capabilities for Turbulent Environments,” MIS Quarterly Executive 7, no. 3 (2008): 139-150.

3. T.H. Davenport and J.G. Harris, “Competing on Analytics: The New Science of Winning” (Boston: Harvard Business Press, 2007); and T.H. Davenport, J.G. Harris and R. Morison, “Analytics at Work: Smarter Decisions, Better Results” (Boston: Harvard Business Press, 2010).

4. See, for example, D. Howell, C. Windahl and R. Seidel, “A Project Contingency Framework Based on Uncertainty and Its Consequences,” International Journal of Project Management 28, no. 3 (2010): 256-264; and A. Gemino, B.H. Reich and C. Sauer, “A Temporal Model of Information Technology Project Performance,” Journal of Management Information Systems 24, no. 3 (2008): 9-44.

5. J. Highsmith, “Agile Project Management: Creating Innovative Products,” 2nd ed. (Boston: Addison-Wesley Professional, 2009).

6. The notion of “value in use” was introduced by Adam Smith in 1776. See, for example, D. Walters, “Operations Strategy: A Value Chain Approach” (Basingstoke, United Kingdom: Palgrave Macmillan, 2002).

7. See, for example, Highsmith, “Agile Project Management”; L.M. Applegate, R.D. Austin and D.L. Soule, “Corporate Information Strategy and Management,” 8th ed. (New York: McGraw-Hill Professional, 2008), 592-596; and R. Austin and L. Devin, “Artful Making: What Managers Need to Know About How Artists Work” (Upper Saddle River, New Jersey: FT Press, 2003).

8. See, for example, W.C. Kim and R. Mauborgne, “Fair Process: Managing in the Knowledge Economy,” Harvard Business Review 81, no. 1 (2003): 127-136.

9. S.H. Thomke, “Managing Experimentation in the Design of New Products,” Management Science 44, no. 6 (1998): 743-762; and S.H. Thomke, “Experimentation Matters: Unlocking the Potential of New Technologies for Innovation” (Boston: Harvard Business Press, 2003).

10. “Pragmatic” should not be confused with “unprofessional.” We use the term “pragmatic” to describe an approach that is guided by experience and observation rather than by dogma.

11. The “IT productivity paradox” implies that despite massive investment and resourcing by companies and organizations worldwide, when it comes to the value of IT there seems to be little payoff. See E. Brynjolfsson, “The Productivity Paradox of Information Technology: Review and Assessment,” Communications of the ACM 36, no. 12 (1993): 67-77; and E. Brynjolfsson and L. Hitt, “Paradox Lost? Firm-Level Evidence on the Returns to Information Systems Spending,” Management Science 42, no. 4 (1996): 541-558.

12. M.S. Hopkins, “The Four Ways IT Is Revolutionizing Innovation,” MIT Sloan Management Review 51, no. 3 (2010): 51-56.

13. See, for example, S. Viaene, “Linking Business Intelligence Into Your Business,” IT Professional 10, no. 6 (November/December 2008): 28-34; and S. Viaene, S. De Hertogh and L. Lutin, “Shadow or Not? A Business Intelligence Tale at KBC Bank,” Case Folio (January 2009): 19-29.

14. S. Viaene, S. De Hertogh and O. Jolyon, “Engaging in Turbulent Times: Direction Setting for Business and IT Alignment,” International Journal of IT/Business Alignment and Governance 2, no. 1 (2011): 1-15.

Reprint #:

53113


Comments (3)
Tim Constantine
The article is called, "The Secrets to Managing Business Analytics Projects" but in it, the best-kept secret in all of Information Technology is revealed: "The idea is that the assets themselves have no inherent value; they generate value only when they offer specific benefits to their owners or users (for example, by allowing them to do their work differently). Consequently, only when an analytical model or application [or any other technology] is actually used can its real benefits (and costs) be identified."
stijn.viaene
Here's an article I recently published in IEEE IT Professional that nicely complements this SMR article: “Data scientists aren’t domain experts”.  You can access it here: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6674007

The IEEE article documents a data science benefits realization process composed of four main stages, i.e. (1) modelling the business, (2) data discovery, (3) operationalizing insight, and (4) cultivation of knowledge.

In selecting our interviewees for the SMR article we made sure that they all had the necessary experience with business analytics projects that included the four main stages of this benefits realization process.
jewellbruce
Excellent article; it immediately brought to mind two keys to the "test"/analytics parts. First, my experience has been that, as you noted, test planning should be as detailed as is reasonable. The problem I have seen in this area is that the people who are testing like to continue testing and need a definitive stopping point (the point is that good program management has to be reasonably objective about getting data and completing the analysis). Second, as a program manager, it has been my experience that the analytics part is typically more expensive than anyone thinks, and it too suffers from inertia (bodies in motion want to stay in motion). The most difficult part is to be flexible while knowing that some discipline will be required; timely recognition and application of that discipline is key.