For the past few years there has been an ongoing debate about whether technological innovation has plateaued. While some say that we’re still in a golden age of innovation, a 2016 headline in The Wall Street Journal declared, “The Economy’s Hidden Problem: We’re Out of Big Ideas.” The article cited slower gains in science, medicine, and technology that hold back economic growth and posited that U.S. businesses may be too risk averse.
At an MIT symposium in November 2017, Erik Brynjolfsson, director of the MIT Initiative on the Digital Economy, answered the question, “Are we running out of inventions?” with a definitive “no.” He pointed to improvements in machine learning, from neural networks to voice recognition, and noted that there has been “a flood of research” in artificial intelligence in recent years that will likely lead to new breakthroughs.
What’s going on? Are we so jaded by technological breakthroughs that incremental innovation is discounted? Or are we facing a more serious problem?
Our latest research shows encouraging signs that new concepts have not been depleted. However, unique, original, and untapped ideas are getting more expensive to find — and that’s a problem.
A Research Productivity Gap
A range of evidence from various industries, products, and companies shows that U.S. research efforts are rising substantially while research productivity is sharply declining. Optimists hope for a fourth industrial revolution that will raise the bar again, while pessimists lament that most potential productivity growth has already occurred.
We believe that these differing views revolve around resource allocation. To maintain a given rate of economic growth, resources devoted to research must increase over time — but in many areas that’s not happening fast enough. Aggregate evidence as well as measures of research and development (R&D) productivity in specific industries — especially computers, agriculture, and medicine — illustrate this.
Take Moore’s Law, for example, which Wikipedia defines as “the observation that the number of transistors in a dense integrated circuit doubles approximately every two years.” That doubling translates into an impressive rate of technical progress of about 35% per year. Yet our research finds that the effective number of semiconductor researchers has increased roughly 18-fold since 1971, which implies that research productivity on computer chips has in fact declined at an average annual rate of 6.8%.
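The arithmetic behind that decline can be checked on the back of an envelope: if technical progress holds steady at roughly 35% per year while research effort grows 18-fold, productivity per researcher must fall at about the rate effort grows. The sketch below illustrates this, assuming (our assumption, since the article gives only “since 1971”) a measurement window ending around 2014; the exact end year moves the answer by a few tenths of a percent.

```python
# Back-of-the-envelope check of the semiconductor research-productivity
# decline. The ~2014 end year is an assumption for illustration.
years = 2014 - 1971          # assumed measurement window (43 years)
researcher_growth = 18       # effective researchers rose ~18x since 1971

# Annualized growth rate of research effort: 18^(1/43) - 1
effort_growth = researcher_growth ** (1 / years) - 1

# With output growth (technical progress) roughly constant at ~35%/yr,
# research productivity = progress per researcher, so it declines at
# approximately the rate that effort grows.
print(f"annual growth in research effort: {effort_growth:.1%}")
print(f"implied annual productivity decline: ~{effort_growth:.1%}")
```

The computed rate lands near 7% per year, in the same ballpark as the article’s 6.8% figure; the small gap reflects the assumed window and rounding of the 18× estimate.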