The Secrets to Managing Business Analytics Projects

Business analytics projects are often characterized by uncertain or changing requirements — and a high implementation risk. So it takes a special breed of project manager to execute and deliver them.




Smart use of information technology can allow for frequent and faster iterations between the design and operating environments, improving experimentation efficiency.


Managers have used business analytics to inform their decision making for years. Numerous studies have pointed to its growing importance, not only in analyzing past performance but also in identifying opportunities to improve future performance.1 As business environments become more complex and competitive, managers need to be able to detect or, even better, predict trends and respond to them early.2 Companies are giving business analytics increasingly high priority in hopes of gaining an edge on their competitors. Few companies would yet qualify as being what management innovation and strategy expert Thomas H. Davenport has dubbed “analytic competitors,” but more and more businesses are moving in that direction.3

Against this backdrop, we set out to examine what characterizes the most experienced project managers involved in business analytics projects. Which best practices do they employ, and how would they advise their less experienced peers? Our goal was to fill in gaps in management’s understanding of how project managers involved in analytics projects can contribute to the new intelligent enterprise. (See “About the Research.”) We found that project managers’ most important qualities can be sorted into five areas: (1) having a delivery orientation and a bias toward execution, (2) seeing value in use and value of learning, (3) working to gain commitment, (4) relying on intelligent experimentation and (5) promoting smart use of information technology.




1. S. LaValle, E. Lesser, R. Shockley, M.S. Hopkins and N. Kruschwitz, “Big Data, Analytics and the Path From Insights to Value,” MIT Sloan Management Review 52, no. 2 (2011): 21-32.

2. See, for example, G. Schreyögg and M. Kliesch-Eberl, “How Dynamic Can Organizational Capabilities Be? Towards a Dual-Process Model of Capability Dynamization,” Strategic Management Journal 28, no. 9 (2007): 913-933; and O.A. El Sawy and P.A. Pavlou, “IT-Enabled Business Capabilities for Turbulent Environments,” MIS Quarterly Executive 7, no. 3 (2008): 139-150.

3. T.H. Davenport and J.G. Harris, “Competing on Analytics: The New Science of Winning” (Boston: Harvard Business Press, 2007); and T.H. Davenport, J.G. Harris and R. Morison, “Analytics at Work: Smarter Decisions, Better Results” (Boston: Harvard Business Press, 2010).

4. See, for example, D. Howell, C. Windahl and R. Seidel, “A Project Contingency Framework Based on Uncertainty and Its Consequences,” International Journal of Project Management 28, no. 3 (2010): 256-264; and A. Gemino, B.H. Reich and C. Sauer, “A Temporal Model of Information Technology Project Performance,” Journal of Management Information Systems 24, no. 3 (2008): 9-44.

5. J. Highsmith, “Agile Project Management: Creating Innovative Products,” 2nd ed. (Boston: Addison-Wesley Professional, 2009).

6. The notion of “value in use” was introduced by Adam Smith in 1776. See, for example, D. Walters, “Operations Strategy: A Value Chain Approach” (Basingstoke, United Kingdom: Palgrave Macmillan, 2002).

7. See, for example, Highsmith, “Agile Project Management”; L.M. Applegate, R.D. Austin and D.L. Soule, “Corporate Information Strategy and Management,” 8th ed. (New York: McGraw-Hill Professional, 2008), 592-596; and R. Austin and L. Devin, “Artful Making: What Managers Need to Know About How Artists Work” (Upper Saddle River, New Jersey: FT Press, 2003).

8. See, for example, W.C. Kim and R. Mauborgne, “Fair Process: Managing in the Knowledge Economy,” Harvard Business Review 81, no. 1 (2003): 127-136.

9. S.H. Thomke, “Managing Experimentation in the Design of New Products,” Management Science 44, no. 6 (1998): 743-762; and S.H. Thomke, “Experimentation Matters: Unlocking the Potential of New Technologies for Innovation” (Boston: Harvard Business Press, 2003).

10. “Pragmatic” should not be confused with “unprofessional.” We use the term “pragmatic” to describe an approach that is guided by experience and observation rather than by dogma.

11. The “IT productivity paradox” implies that despite massive investment and resourcing by companies and organizations worldwide, when it comes to the value of IT there seems to be little payoff. See E. Brynjolfsson, “The Productivity Paradox of Information Technology: Review and Assessment,” Communications of the ACM 36, no. 12 (1993): 67-77; and E. Brynjolfsson and L. Hitt, “Paradox Lost? Firm-Level Evidence on the Returns to Information Systems Spending,” Management Science 42, no. 4 (1996): 541-558.

12. M.S. Hopkins, “The Four Ways IT Is Revolutionizing Innovation,” MIT Sloan Management Review 51, no. 3 (2010): 51-56.

13. See, for example, S. Viaene, “Linking Business Intelligence Into Your Business,” IT Professional 10, no. 6 (November/December 2008): 28-34; and S. Viaene, S. De Hertogh and L. Lutin, “Shadow or Not? A Business Intelligence Tale at KBC Bank,” Case Folio (January 2009): 19-29.

14. S. Viaene, S. De Hertogh and O. Jolyon, “Engaging in Turbulent Times: Direction Setting for Business and IT Alignment,” International Journal of IT/Business Alignment and Governance 2, no. 1 (2011): 1-15.





Comments (3)
Tim Constantine
The article is called "The Secrets to Managing Business Analytics Projects," but in it, the best-kept secret in all of Information Technology is revealed: "The idea is that the assets themselves have no inherent value; they generate value only when they offer specific benefits to their owners or users (for example, by allowing them to do their work differently). Consequently, only when an analytical model or application [or any other technology] is actually used can its real benefits (and costs) be identified."
Here's an article I recently published in IEEE IT Professional that nicely complements this SMR article: "Data scientists aren't domain experts."

The IEEE article documents a data science benefits realization process composed of four main stages: (1) modelling the business, (2) data discovery, (3) operationalizing insight and (4) cultivating knowledge.

In selecting our interviewees for the SMR article we made sure that they all had the necessary experience with business analytics projects that included the four main stages of this benefits realization process.
Excellent article; it immediately brought to mind two keys to the "test"/analytics parts. First, my experience has been that, as you noted, test planning should be as detailed as reasonable. A problem I have noted in this area is that the people who are testing like to continue to test and need a definitive stopping point (the point is that good program management has to be reasonably objective about getting data and completing analysis). Second, as a program manager it has been my experience that the analytics part is typically more expensive than anyone thinks, and it too suffers from inertia (bodies in motion want to stay in motion). The most difficult part is to be flexible while knowing that some discipline will be required; timely recognition and use of that discipline is key.