The 11 Sources of Disruption Every Company Must Monitor

Think you’re aware of the forces that might disrupt your company? Your lens may be far too narrow.



Recently I advised a large telecommunications company on its long-term strategy for wireless communications. The company was understandably concerned about its future. A half-dozen new streaming TV services were in the process of being launched, and bandwidth-hungry online gaming platforms were quickly attracting scores of new players. Possible regulatory actions seemed to be lurking around the corner, too.

Changes like these meant disruptions to the company’s existing business models, which hadn’t materially evolved since the dawn of the internet age. As a result, the company worried that it might be facing an existential crisis. To get in front of the risk, its senior leaders wanted to dispatch a cross-functional team to produce a three-year outlook analyzing which disruptive forces would affect the company and to what degree. It was no simple effort. First, the leaders had to galvanize internal support. At this company, any change to standard operations required lots of meetings, presentation decks, and explanations of concrete deliverables. Once they had buy-in and the cross-functional team was in place, they spent months researching the company’s competitive set, building financial models, and diving deeper into consumer electronics trends.

Finally, the team delivered on its mandate. A detailed, comprehensive three-year plan projected that new streaming platforms and online gaming would cause a drastic increase in bandwidth consumption, while newer connected gadgets — smartphones, watches, home exercise equipment, security cameras — would see greater market penetration. It was a narrow vision that would take the company down a singular path focused only on streaming and consumer gadgets without considering other disruptive forces on the horizon.

The findings were hardly revelatory. Streaming platforms, gaming, and gadgets were a given. But what about all the other adjacent areas of innovation? In my experience, companies often focus on the familiar threats because they have systems in place to monitor and measure known risks. This adds very little value to long-term planning, and, worse, it can lead to organizations having to make quick decisions under duress. It’s rarer for companies to investigate unfamiliar disruptive forces in advance and to incorporate that research into strategy.

I was curious to know how the company had initially framed its project. The objective was to investigate all of the disruptive forces that could affect telecommunications in the future, yet it had really focused only on the usual known threats.

There were plenty of outside developments worth the company’s attention. For example, some clever entrepreneurs had already deployed new systems to share the computer processing power sitting dormant in our connected devices. Using a simple app, consumers were selling remote access to their mobile phones in exchange for credits or money that could be spent on exchanges. (This literally allows consumers to earn money while they sleep.) Because the systems are distributed and decentralized, private data is safeguarded. On these new platforms, anyone can rent out their spare computing resources for a fee.

What’s most interesting about distributed computing platforms is that they can also harness the power of other devices, like connected microwaves and washing machines, smart fire alarms, and voice-controlled speakers. If distributed computing platforms moved from the fringe to the mainstream, the shift would have a seismic impact on the telecommunications company’s financial projections. While the team was accustomed to calculating the cost per megabit for streaming and the cost to maintain its networks, it didn’t have formulas to calculate the financial impact of billions of connected devices that could soon be part of giant distributed computing platforms.

Looking at the future of telecommunications through the lens of distributed computing, I had a lot of follow-up questions: How should existing bandwidth models and projections be revised to account for all of these devices? Would customer plans still earn the same margins with all these new use cases for existing bandwidth? Would the company mine all of the device data for business intelligence? If so, what would data governance need to look like?

I also asked the team to think about the future of telecommunications through another adjacent lens: climate change. Existing data centers, like all buildings, were developed using guidelines, architectural plans, and building codes that will likely need to change in response to severe weather events. Data centers must be housed in environments where the temperature never deviates from a controlled range. Heat waves, flash floods, hail, high winds, and wildfires have become more common — and harder to predict. This poses a threat to critical infrastructure.

While the team could build predictive models to anticipate bandwidth spikes, predicting extreme weather events would be far more difficult. How was the team tracking weather and climate? Had they built uncertainty into their financial projections to account for extreme weather events? Was there a crisis plan ready to implement if the power got knocked out? What if a long stretch of exceptionally hot days strained the air conditioners? Did it make sense for the company to continue building and maintaining data centers? Was there a case to be made for adding a small team of climate scientists to the company’s existing data science unit?

I could see from everyone’s reactions that this expansive line of questioning was beyond the typical scope of their research. The reason the company had not considered these and other areas of potential disruption had to do with its entrenched habits and cherished beliefs. The team was accustomed to a rigorous — but narrow — approach to planning. They built financial projections, tracked their immediate competitors, and followed R&D within their industry sector. That was it.

What I observed is hardly unique. When faced with deep uncertainty, teams often develop a habit of controlling for internal, known variables and fail to track external factors as potential disrupters. Tracking known variables fits into an existing business culture because it’s an activity that can be measured quantitatively. This practice lures decision makers into a false sense of security, and it unfortunately results in a narrow framing of the future, making even the most successful organizations vulnerable to disruptive forces that appear to come out of nowhere. Failing to account for change outside those known variables is how even the biggest and most respected companies get disrupted out of the market.

Futurists call these external factors weak signals, and they are important indicators of change. Some leadership teams lean into uncertainty by seeking out weak signals. They use a proven framework, are open to alternative visions of the future, and challenge themselves to see their companies and industries through outside perspectives. Companies that do not formalize a process to continually look for weak signals typically find themselves rattled by disruptive forces.

As a quantitative futurist, my job is to investigate the future, and that process is anchored in intentionally confronting uncertainties both internal and external to an organization. I do this using what I call the future forces theory, which explains how disruption usually stems from influential sources of macro change. These sources represent external uncertainties — factors that broadly affect business, governing, and society. They can skew positive, neutral, or negative.

I use a simple tool to apply the future forces theory to organizations as they are developing strategic thinking. It lists 11 sources of macro change that are typically outside a leader’s control. (See “The 11 Macro Sources of Disruption.”) In 15 years of quantitative foresight research, I have discovered that all change is the result of disruption in one or more of these 11 sources. Organizations must pay attention to all 11 — and they should look for areas of convergence, inflections, and contradictions. Emerging patterns are especially important because they signal transformation of some kind. Leaders must connect the dots back to their industries and companies and position teams to take incremental actions.

The 11 sources of change might seem onerous at first, but consider the benefit of a broader viewpoint: A big agricultural company tracking infrastructure changes could be a first mover into new or emerging markets, while a big box retailer monitoring 5G technology and artificial intelligence could be better positioned to compete against the big tech platforms.

Sources of macro change encompass the following:

1. Wealth distribution: the distribution of income across a population’s households, the concentration of assets in various communities, the ability for individuals to move up from their existing financial circumstances, and the gap between the top and bottom brackets within an economy.

2. Education: access to and quality of primary, secondary, and postsecondary education; workforce training; trade apprenticeships; certification programs; the ways in which people are learning and the tools they’re using; what people are interested in studying.

3. Infrastructure: physical, organizational, and digital structures needed for society to operate (bridges, power grids, roads, Wi-Fi towers, closed-circuit security cameras); the ways in which the infrastructure of one city, state, or country might affect another’s.

4. Government: local, state, national, and international governing bodies, their planning cycles, their elections, and the regulatory decisions they make.

5. Geopolitics: the relationships between the leaders, militaries, and governments of different countries; the risk faced by investors, companies, and elected leaders in response to regulatory, economic, or military actions.

6. Economy: shifts in standard macroeconomic and microeconomic factors.

7. Public health: changes occurring in the health and behavior of a community’s population in response to lifestyles, popular culture, disease, government regulation, warfare or conflict, and religious beliefs.

8. Demographics: how birth and death rates, income, population density, human migration, disease, and other dynamics are leading to shifts in communities.

9. Environment: changes to the natural world or specific geographic areas, including extreme weather events, climate fluctuations, rising sea levels, drought, high or low temperatures, and more. Agricultural production is included in this category.

10. Media and telecommunications: all of the ways in which we send and receive information and learn about the world, including social networks, news organizations, digital platforms, video streaming services, gaming and e-sports systems, 5G, and the boundless other ways in which we connect with each other.

11. Technology: not as an isolated source of macro change, but as the connective tissue linking business, government, and society. We always look for emerging tech developments as well as tech signals within the other sources of change.

This may seem an unreasonably broad list of signals to track in preparation for the future, but in my experience, ignoring these potential sources of change leaves organizations vulnerable to disruption. My favorite example of what happens when companies ignore these signals dates to 2004, when a number of emerging weak signals pointed to a drastic shift in how people communicated. Two senior leadership teams had access to the same information. One actively looked for external factors, while the other simply used trends within its industry to make incremental improvements to its existing suite of products. Those decisions would result in the end of one of the world’s most loved and respected companies and the rise of an unlikely competitor that no one saw coming. The signals included the following developments:

  • New software made it easy for anyone to rip content from CDs and DVDs.
  • Peer-to-peer file-sharing services and sites such as BitTorrent, isoHunt, The Pirate Bay, and LimeWire, once used mainly by hackers, had become popular with ordinary people, who were sharing music and movies widely.
  • Demand for digital content was growing fast; sales of physical media were starting to decline.
  • Game developers were experimenting with haptic technology that responded to pressure and touch. In a combat game, for instance, when a player got hit by enemy fire, they’d feel the controller buzz. Developers were also building haptics into early touch screens: Players could simply touch an icon to advance, move back, turn, or stop.
  • In Korea and Japan, consumer gadgets were being built with dual functions: There were digital cameras with MP3 players; cellphones had retractable metal antennas to receive broadcast TV signals.

One of the senior leadership teams connected those signals with its existing work and foresaw a world in which all of our existing devices converged into just one mobile phone powerful enough to record videos, play games, check email, manage calendars, show interactive maps with directions, and much more. That team had no cherished beliefs about the existing form factor of our mobile phones and was willing to accept alternative ideas for how a computer-phone could work. That team worked at Apple, and in 2007, a product that baked all of those weak signals into its design went on sale: the first iPhone. By the end of the decade, a company that had once been known mostly for its sleek desktop computers had forced the entire mobile device market to bend to its vision of the future.

By contrast, these very same weak signals never caught the attention of Research in Motion (RIM), which at the time made one of the world’s most popular smartphones, the BlackBerry. (In fact, we loved their phones so much we called them crackberries and were proud of our digital addictions.) It was among the first devices that allowed us to stay truly connected to the office. Perhaps most important, it had a full physical keyboard. Most other phones at that point had only numbered buttons; typing a letter meant pressing a key repeatedly to cycle through the three or four letters assigned to each number. Before the BlackBerry, a simple three-line text message could take several minutes to type.

Because of the BlackBerry’s enormous popularity, RIM had become one of the most valuable technology companies in the world, valued at $26 billion. It controlled an estimated 70% of the smartphone market and counted 7 million BlackBerry users. With its great run of success, the organization’s culture did not allow for alternative versions of the future, and internally, there was an aversion to contradicting cherished beliefs. Managers who did connect those weak signals to the BlackBerry didn’t have credibility outside their departments. As a result, all of the disruptive external forces Apple was actively tracking never broke through to RIM’s senior leadership team. RIM continued innovating narrowly, selling a smaller BlackBerry Pearl with a tiny, pearl-shaped trackball embedded in the keyboard and releasing BlackBerries in new colors. It was, in hindsight, the defensive strategy that Clayton Christensen explained in his theory of disruptive innovation. Threatened by a disruption, incumbents retreat to what Christensen called sustaining innovations — new bells and whistles that allow the incumbent to keep its customer base and, more important, its profit margin. But such innovations virtually ignore the disruptions breaking into the incumbent’s market.

Once the iPhone launched, Apple kept listening for signals while RIM never recalibrated its strategy. Rather than quickly adapting its beloved product for a new generation of mobile users, RIM continued tweaking and incrementally improving its existing BlackBerries and its operating system. That first iPhone was in many ways a red herring. As is so often true with successful disrupters, the first product to break through is low quality, barely “good enough” for consumers. That’s what enables incumbents to justify ignoring it. But the ascent to quality is rapid. Apple swiftly improved the phone and its operating system. Soon it became clear that the iPhone was never intended to compete against the BlackBerry. Apple had an entirely different vision for the future of smartphones — it saw the trend toward single devices for all of life, not just business — and it would leapfrog RIM as a result.

The ways in which RIM and Apple planned their futures are what sealed their fates, and what happened to RIM is a warning that applies to every organization. Senior leaders can choose to lean into uncertainty and methodically track disruptive forces early, or they can choose to innovate narrowly and reinforce established practices and beliefs.

Many companies around the world use the future forces theory to help them make sense of deep uncertainty and break free from the tyranny of narrow innovation. Some use it at the start of a strategic project, while others use it as a guiding principle throughout their work streams, processes, and planning. The key is to make a connection between each source of change and the company, and to ask questions like these: Who is funding new developments and experimentation in this source of change? Which populations will be directly or indirectly affected by shifts in this area? Could any changes in this source lead to future regulatory actions? How might a shift in this area lead to shifts in other sectors? Who would benefit if an advancement in this source of change winds up causing harm?

I have seen the most success in teams that use the macro change tool not just for a specific deliverable but to encourage ongoing signal scanning. One multinational company took the idea to a wonderful extreme: It built cross-functional cohorts made up of senior leaders and managers from every part of the organization, all around the world. Each cohort has 10 people, and each person is assigned one of the sources of macro change, along with a few more specific technology topics and topics related to their individual jobs. Cohort members are responsible for keeping up on their assigned coverage areas. A few times a month, each cohort has a 60-minute strategic conversation to share knowledge and talk about the implications of the weak signals they’re uncovering. Not only is this a great way to build internal muscle for signal tracking, but it has also fostered better communication throughout the entire organization.

It might go against the established culture of your organization, but embracing uncertainty is the best way to confront external forces outside your control. Seeking out weak signals by intentionally looking through the lenses of macro change is the best possible way to make sure your organization stays ahead of the next wave of disruption. Better yet, it’s how your team could find itself on the edge of that wave, leading your entire industry into the future.
