Implement First, Ask Questions Later (or Not at All)

Companies used to spend years clarifying business requirements before they would even think of launching new software. Today, cheaper cloud-based apps mean that implementation decisions are made on the fly — and there’s no going back.


Facebook Inc. founder Mark Zuckerberg nicely summarized a modern philosophy about technology innovation when he spoke about the need to “move fast and break things.” Increasingly, that same mindset appears to drive how companies implement new technologies as well. And this phenomenon stretches beyond Silicon Valley.

For decades, companies required their IT teams to identify, model, and validate business requirements before writing a line of code or adopting a new technology platform, product, or service. Today, that approach seems almost quaint. Companies no longer build giant flowcharts, analyze tasks, or model business requirements in advance of deploying new technology. They just pilot and adopt — often before they have a clear idea of the business problem they’re trying to solve. Once, this launch-first mentality would have been considered heresy. Yet it has become the norm, driven by the accelerating pace of technology change, the fear of losing market share to disruptive new players, and the ease with which new technologies can be implemented through cloud-based delivery. This is a challenging environment, particularly for tradition-bound organizations. But it’s the new reality, and CIOs must adapt or risk permanently falling behind the competition.

As part of a larger study on changes in technology implementation, my team spent two years collecting survey and interview data about the evolving relationship between business and technology. We talked to people in business roles and technology roles at companies across a range of industries. The most significant finding was the rapid death of detailed requirements analysis and modeling. Among survey respondents, 71% believed that technology can be deployed without a specific problem in mind. Just one-third said they have a clearly defined process for the adoption of emerging technology. Perhaps most surprising, half of the respondents described their pilot initiatives — small-scale, low-cost, rapid testing of new technology — as “purely experimental,” with no requirements analysis at all.

We heard a consistent theme. As one business process manager at a Fortune 100 pharmaceutical company put it, “We’ve abandoned the strict ‘requirements-first, technology-second’ adoption process, whatever that really means. Why? Because we want to stay agile and competitive and want to leverage new technologies. Gathering requirements takes forever and hasn’t made our past projects more successful.”

Different Software, Different Approach

The very idea that technologies would be acquired and deployed without documented, validated requirements flies in the face of what technology and business professionals were taught for most of the 20th century. It was often the business side that insisted upon elaborate requirements gathering and validation. Executives frequently complained about the rush to deploy untested technologies or — worse — technologies with unverified total-cost-of-ownership (TCO) or return-on-investment (ROI) models.

Today’s adoption models assume that emerging technologies drive requirements, not the other way around — which is why many tech solutions get discovered as part of the implementation process rather than in advance of it. Put differently, many companies have no clear idea what they will do with a specific technology. They simply believe it holds huge potential that will become clear over time, and that they have no choice but to adopt it quickly if they want to digitally transform and remain competitive.

This approach is possible because of the way software itself has changed. Rather than massive, enterprise-wide systems that cost millions and take years to implement, software today is cloud-based and relatively inexpensive. It often addresses highly specific problems, sometimes limited to a single business unit or department. And technology is evolving continually. As a result, companies feel they need to move fast, try a lot of things, and accept the inevitable failures. If something doesn’t work, the stakes are a lot lower — costs are measured in tens of thousands of dollars rather than millions, and timelines are a few months rather than a few years.

“We’ve piloted new devices and applications — especially mobile applications — at a quick pace,” the technology manager at an insurance company told us. “The good news is that failures happen fast and are usually cheap because of cloud delivery. The cloud changes the way we think about pilots. It makes it easy for us to ‘fail fast and fail cheap’ — something everyone likes, especially the CFO.”

This approach isn’t 100% new, of course. So-called shadow IT — in which business units go rogue and create their own work-arounds, implementing technology without the knowledge or permission of the CIO — has long plagued many companies. In the past, those efforts could have major ramifications, breaking security protocols and contaminating data sets. Today, shadow IT has essentially won. Technology at many companies is now highly decentralized — it happens at the level of individual business units, and the heads of those units have wide latitude to launch pilot tests when they spot something that might work.

As we heard from the business unit vice president at a media company, “Shadow IT short-circuits requirements analysis — which isn’t all bad, right? The business units will do what they need to do to make money, and sometimes that means they’ll adopt technology immediately if they think it might solve some problems. … There’s no way I can shut it down even if I wanted to, which I don’t.”

Little Analysis of Pilot Tests

Perhaps the most surprising finding from our analysis was that most of the companies piloting new technologies fail to quantitatively measure the impact of the pilots in terms of ROI or TCO. This is another major departure from best practices of the past, when companies had elaborate metrics in place to measure the returns on these investments. Today, the embrace of new technology can be driven by fear as much as a quest for improved performance. Companies are moving so fast that they don’t have time to gauge results.

Indeed, when we asked survey participants about the factors behind rapid technology adoption, the answers were relatively consistent across industries. (See “The Drivers of Rapid Technology Adoption.”)

Reducing costs was a big factor for companies, along with the opportunity to digitally transform themselves and roll out new business models. Yet competitive fear was the third most common factor. Companies face such a broad range of threats and disruptions, including new market entrants from a wide variety of directions, that they feel they have no choice but to jump into new technology headfirst.

Under this mindset, formal after-the-fact analyses of pilot tests miss the point, and there’s little time for them anyway. Business leaders don’t have the luxury of debriefing after a pilot to ask, “How well is this working?” If it works, they’ll know. Besides, the thinking goes, the ROI just isn’t as important when the “I” — the actual investment in new technology — is so low.

Notably, our findings show that the pressure to move fast in technology adoption is not coming from the C-suite or senior management but from business units closer to the action. The technology is changing so quickly — and affecting operational functions several layers below them in the org chart — that most senior leaders can’t keep up with recent advances, let alone develop a strategic approach to their deployment.

New Best Practices

It would be hard to find a CIO from the 1990s who would have predicted the death of formal, validated business requirements and the rise of a technology-first adoption process. Even today, this philosophy will undoubtedly anger and confuse traditional corporate budgeteers who crave precision. But we live in a different world in which speed matters more than precision, and there’s no going back.

In this world, the new best practices are to move fast, adopt early, and experiment widely. Companies should identify a specific transformation target, like supply chain planning, manufacturing operations, or customer relationship management. They should also select a few technologies, such as analytics, artificial intelligence, or location-based services. And then they should start launching pilot tests to see what works, with the goal of rapidly scaling up winning initiatives.

Business requirements may be unknowable until companies can try out the new technologies, and many of those pilots will fail. But the alternative — trying to move slowly and deliberately, with business requirements clearly spelled out in advance — is no longer an option. Companies should expect to discover solutions through the implementation process rather than in advance of it. They’ll break things, undoubtedly. But they’ll also stay ahead of the competition.




Comments (2)
Adrian Wake
Another thought-provoking read, but it feels like a view for cash-rich, larger enterprises. I agree that the formal business requirements gathering process does not necessarily lead to better outcomes, but an observation I've made is that while many businesses believe they have completed a thorough needs analysis, they often lack alignment and focus on the customer and business value they are trying to increase. While this process could definitely be streamlined regardless, the poor start inevitably leads to disappointment post-implementation, particularly after the usual benefit erosion that takes place during the design and build phases. Traditional issues aside, what I'd be interested in hearing about the survey findings is: 1) How successful have these businesses been against the key drivers (cost, competitiveness, etc.)? 2) What are the factors the successful businesses identify as must-have capabilities? I'm assuming there is an inherent set of skills or cultural traits, and that the switch to this approach has challenges at all levels. Thanks for the article.
Geuko Bosker
Interesting read that raises two separate thoughts. First, how and where was the survey performed? Background: In daily life I see a lot of differences across the globe in the way companies react to emerging technologies and adopt them in their organisations. And this is not only between regions or continents but even between neighbouring countries. I would expect a story like this to be true for North America but almost the opposite in countries like Russia, Germany, or Japan. Second, this approach would almost immediately result in siloed data unless well governed. And siloed data is a killer for data management, especially if you look at privacy and data protection regulation like the GDPR. It will be interesting to see how organisations handle this. It may not impact your ROI, but it will surely impact your TCO if not managed properly, let alone the brand damage it might result in. Again, a topic with a big cultural aspect, as this is looked at differently across countries.