Moore's Law

Back in 1965, before the industry had any reliable way to measure its rate of progress, much less to predict the pace of future innovation, Gordon Moore (who would later co-found Intel) made an observation that revolutionized the technology industry and the way we think about building on today’s technology to invent tomorrow’s.

As the director of research and development at Fairchild Semiconductor, Moore was asked to speculate on how the semiconductor components industry might develop over the next 10 years. This industry was responsible for turning silicon ingots into the intricately designed, wafer-thin discs from which solid-state components and integrated circuits are made. Each of these integrated circuits comprises millions of tiny transistors, resistors, diodes, and capacitors, all carefully interlaced to form the inner workings of our modern computing technology.

Moore noted that as the manufacturing tools and processes miniaturized over time, we were able to pack more and more of these tiny components into the same amount of space. Specifically, he observed that the number of transistors that could fit on a chip roughly doubled every one to two years.

While he originally predicted only that this rate of progress would last for the next decade, incredibly, it has held true for the last 50 years! Due to its long-lasting accuracy, Moore’s observation that the density of transistors doubles at a predictable rate has more commonly been dubbed “Moore’s Law.”

Predicting the Future

"The best way to predict the future is to invent it.”— Alan Kay

Looking back, it is clear that the unreasonable problems of yesterday have become achievable today. Most software developed today simply would not run effectively on computer hardware from just 10 years earlier. Using Moore’s Law, we can see that just 10 years ago, computer hardware was effectively 100 times slower than it is today!
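To see where that factor of 100 comes from, here is a minimal back-of-the-envelope sketch. It assumes one doubling of performance roughly every 18 months (the same assumption used below; the constant and function name are illustrative, not from the original): 10 years is about 6.7 doubling periods, and 2 raised to the 6.7 is roughly 100.

    # Back-of-the-envelope estimate of hardware speedup over a span of years,
    # assuming performance doubles once every 18 months (an illustrative
    # assumption; Moore's original observation concerned transistor counts).
    DOUBLING_PERIOD_YEARS = 1.5

    def speedup_after(years):
        """Estimated speedup factor after the given number of years."""
        return 2 ** (years / DOUBLING_PERIOD_YEARS)

    print(round(speedup_after(10)))  # prints 102, i.e. roughly 100x in a decade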

If we can confirm the trends of the past using Moore’s Law, can we also use it to predict the future? How do computer scientists and engineers actually use Moore’s Law?

Imagine you have an idea for a new form of technology, but after careful design and analysis, you’ve estimated that today’s technology is an order of magnitude (i.e., a factor of 10) too slow to handle the massive amount of real-time computation your invention will need.

Using Moore’s Law, and assuming that speed doubles approximately every 18 months, you can reasonably predict that computers will achieve the tenfold speed increase you need in only about five years. You can then plan the research and development of your invention over those five years, knowing that by the time you need the technology to perform at the level you require, it will.
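The same arithmetic can be run in reverse to estimate the waiting time. Here is a minimal sketch, again assuming one doubling every 18 months (the function name and constant are illustrative):

    import math

    DOUBLING_PERIOD_YEARS = 1.5  # assumed doubling period from the text

    def years_until_speedup(factor):
        """Estimated years until hardware is `factor` times faster."""
        return DOUBLING_PERIOD_YEARS * math.log2(factor)

    print(round(years_until_speedup(10), 1))  # prints 5.0: a tenfold speedup in about five years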

This is exactly how large tech manufacturers operate. At any given time, their engineers are busy working on technologies that are not yet feasible, but that will become a reality by the time they are ready to bring them to market.