In 1965, Gordon Moore, who would go on to cofound Intel, published a remarkably prescient paper predicting that the number of transistors on a chip would double about every two years. For half a century, that doubling has proved so consistent that today it is commonly known as Moore’s Law, and it has driven the digital revolution.
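To get a feel for what that doubling compounds to, here is a quick arithmetic sketch in Python. It is a stylized calculation, not a measurement of any particular chip: fifty years at one doubling every two years is twenty-five doublings.

```python
# Stylized sketch of Moore's Law compounding (illustrative arithmetic only):
# 50 years at one doubling every two years is 25 doublings.

years = 50
doublings = years / 2
growth_factor = 2 ** doublings

print(f"{doublings:.0f} doublings -> roughly {growth_factor:,.0f}x improvement")
# -> roughly 33,554,432x
```

That factor of tens of millions is the difference between the chips of the 1960s and the ones in a modern phone, and it is why we have come to treat relentless improvement as a given.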
In fact, we’ve become so used to the idea that our technology gets more powerful and cheaper that we scarcely stop and think about how unprecedented it is. Certainly, we did not expect horses or plows — or even steam engines, automobiles or airplanes — to double their efficiency at a continuous rate.
Nevertheless, modern organizations have come to rely on continuous improvement to such an extent that people rarely think about what it means, and with Moore’s Law about to end, that’s going to be a problem. In the decades to come, we’re going to have to learn to live without the certainty of Moore’s Law and operate in a new era of innovation that will be profoundly different.
The Von Neumann Bottleneck
Because of the power and consistency of Moore’s Law, we’ve come to associate technological advancement with processor speeds. Yet that is only one dimension of performance; there are many ways to get our machines to do more at lower cost besides simply speeding them up.
A primary example is the von Neumann bottleneck, named after John von Neumann, the mathematical genius responsible for the architecture in which our computers store programs and data in one place and perform calculations in another. In the 1940s, when this idea emerged, it was a major breakthrough, but today it is becoming something of a problem.
The issue is that, because of Moore’s Law, our chips now run so fast that much of their time is spent waiting for data to shuttle back and forth between the processor and memory, even though those signals travel at close to the speed of light. Ironically, as chip speeds continue to improve, the problem will only get worse.
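A rough back-of-envelope sketch shows the scale of the problem. The figures below are illustrative assumptions (a 3 GHz clock and a ~100 ns round trip to main memory), not specifications of any particular chip:

```python
# Back-of-envelope sketch of the von Neumann bottleneck.
# Illustrative figures, not vendor specs.

SPEED_OF_LIGHT_M_PER_S = 3.0e8   # upper bound; signals in wires are slower
CLOCK_HZ = 3.0e9                 # assume a 3 GHz processor
MEMORY_LATENCY_S = 100e-9        # assume ~100 ns round trip to main memory

cycle_time_s = 1 / CLOCK_HZ
distance_per_cycle_m = SPEED_OF_LIGHT_M_PER_S * cycle_time_s
stalled_cycles = MEMORY_LATENCY_S / cycle_time_s

print(f"One clock cycle lasts about {cycle_time_s * 1e9:.2f} ns")
print(f"Light covers only ~{distance_per_cycle_m * 100:.0f} cm in that time")
print(f"A single trip to memory costs ~{stalled_cycles:.0f} cycles of waiting")
```

Under those assumptions, a signal can cover only about ten centimeters per clock cycle, and one trip to memory costs hundreds of cycles in which the processor does no useful work. Faster clocks make the waiting proportionally worse.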
The solution is simple in concept but elusive in practice. Just as we integrated many transistors onto a single piece of silicon to create the modern chip, we can integrate different chips through a method called 3D stacking. If we can make this work, we can increase performance for a few more generations.
Optimized Computing
Today we use our computers for a wide variety of tasks. We write documents, watch videos, run analyses, play games and do many other things, all on the same device using the same chip architecture. We are able to do this because the chips our computers use are designed as a general-purpose technology.
That makes computers convenient and useful, but it is terribly inefficient for computationally intensive tasks. There have long been technologies designed for more specific workloads, such as ASICs (application-specific integrated circuits) and FPGAs (field-programmable gate arrays), and more recently GPUs have become popular for graphics and artificial intelligence.
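A loose software analogy gives a sense of the trade-off between general-purpose and specialized computation. The sketch below runs the same matrix multiplication first as a plain interpreted loop and then through an optimized numerical routine; the matrix size and timing method are arbitrary choices for illustration, not a hardware benchmark.

```python
# Software analogy for general-purpose vs. specialized computation.
# Not a hardware benchmark; sizes and timings are illustrative.
import time
import numpy as np

n = 150
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# "General purpose": a straightforward triple loop in the interpreter
# (takes a couple of seconds).
start = time.perf_counter()
c_slow = [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
          for i in range(n)]
loop_s = time.perf_counter() - start

# "Specialized": the same math handed to an optimized linear algebra routine.
start = time.perf_counter()
c_fast = a @ b
fast_s = time.perf_counter() - start

print(f"interpreted loop: {loop_s:.2f} s, optimized routine: {fast_s:.5f} s")
```

The gap of several orders of magnitude comes from doing one narrowly defined job with machinery built for exactly that job, which is the same logic that makes purpose-built silicon attractive.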
As artificial intelligence has risen to the fore, some firms, such as Google and Microsoft, have begun designing chips specifically engineered to run their own deep learning tools. This greatly improves performance, but you need to make a lot of chips for the economics to work, so this approach is out of reach for most companies.
The truth is that all of these strategies are merely stopgaps. They will help us continue to advance over the next decade or so, but with Moore’s Law ending, the real challenge is to come up with some fundamentally new ideas for computing.
Profoundly New Architectures
Over the last half century, Moore’s Law has become synonymous with computing, but we built calculating machines long before the first microchip was invented. In the early 20th century, IBM pioneered electromechanical tabulating machines; then came vacuum tubes and transistors, before integrated circuits were invented in the late 1950s.
Today, two new architectures are emerging that will be commercialized within the next five years. The first is quantum computing, which has the potential to be thousands, if not millions, of times more powerful than current technology. Both IBM and Google have built working prototypes, and Intel, Microsoft and others have active development programs.
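One hedged way to see where those estimates come from: the state of n qubits takes 2^n complex amplitudes to describe classically, so the space a quantum machine works in grows exponentially with its size. The short sketch below simply tabulates that growth; the 16 bytes per amplitude reflects an assumed double-precision simulation, and real-world speedups depend heavily on the algorithm.

```python
# Why small quantum machines get big fast: n qubits need 2**n complex
# amplitudes to simulate classically (assuming 16 bytes per amplitude).

for n_qubits in (10, 20, 30, 50):
    amplitudes = 2 ** n_qubits
    memory_gb = amplitudes * 16 / 1e9
    print(f"{n_qubits:>2} qubits -> {amplitudes:>20,} amplitudes "
          f"(~{memory_gb:.2e} GB to simulate classically)")
```

By 50 qubits the classical simulation already needs tens of petabytes of memory, which is why even modest quantum prototypes attract so much attention.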
The second major approach is neuromorphic computing: chips modeled on the design of the human brain. These excel at the pattern recognition tasks that conventional chips struggle with. They are also thousands of times more efficient than current technology, and they scale from a single tiny core with just a few hundred “neurons” up to enormous arrays with millions.
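As a loose illustration of the brain-inspired idea, the sketch below implements a leaky integrate-and-fire neuron in plain Python, the kind of simple spiking unit neuromorphic designs are typically built around. The parameters and input values here are made up for the example; real neuromorphic hardware realizes this behavior directly and far more efficiently in silicon.

```python
# Minimal software sketch of a leaky integrate-and-fire neuron:
# it accumulates input, leaks charge over time, and emits a spike
# when its potential crosses a threshold. Parameters are illustrative.

def leaky_integrate_and_fire(inputs, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current   # leak, then integrate input
        if potential >= threshold:               # fire and reset
            spikes.append(t)
            potential = 0.0
    return spikes

# A sustained burst of input drives the neuron past threshold; sparse input does not.
print(leaky_integrate_and_fire([0.3, 0.4, 0.5, 0.0, 0.1, 0.6, 0.6]))  # -> [2, 6]
```

Because such units only do work when spikes arrive, large arrays of them can sit mostly idle, which is where much of the claimed efficiency comes from.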
Yet both of these architectures have their drawbacks. Quantum computers need to be cooled to near absolute zero, which limits their use. Both require profoundly different logic than conventional computers and will need new programming languages. The transition will not be seamless.
A New Era Of Innovation
For the past 20 or 30 years, innovation, especially in the digital space, has been fairly straightforward. We could rely on technology to improve at a predictable pace, which allowed us to anticipate, with a high degree of certainty, what would be possible in the years to come.
That led most innovation efforts to be focused on applications, with a heavy emphasis on the end user. Startups that were able to design an experience, test it, adapt and iterate quickly could outperform big firms that had far more resources and technological sophistication. That made agility a defining competitive attribute.
In the years to come, the pendulum is likely to swing from applications back to the fundamental technologies that make them possible. Rather than being able to rely on trusty old paradigms, we’ll largely be operating in the realm of the unknown. In many ways, we’ll be starting over again, and innovation will look more like it did in the 1950s and 1960s.
Computing is just one area reaching its theoretical limits. We also need next-generation batteries to power our devices, electric cars and the grid. At the same time, new technologies such as genomics, nanotechnology and robotics are becoming ascendant, and even the scientific method is being called into question.
So we’re now entering a new era of innovation, and the organizations that compete most effectively will not be the ones with a capacity to disrupt, but those that are willing to tackle grand challenges and probe new horizons.