Was Moore’s Law really meant to be a law or more of a guideline?

Before I joined the semiconductor industry, I was completely unaware of this guy Gordon Moore, whose observations and predictions set in motion the course of computing hardware history, and with it the existence of all the consumer electronics that have become essential to everyday life. What I find most fascinating about this is the profound effect this theory has had on an entire world market. As it’s always referred to as Moore’s Law, I never thought to question its meaning until last week, when I read two separate articles that examined its existence and future purpose.

To be defined as a law leaves little room for interpretation; it implies a theory that’s been proven. But upon digging into the background, I discovered that what Moore proposed was not an absolute. In his initial statement, written in 1965, Moore merely observed that the number of transistors placed on a chip had doubled each year, and predicted it would continue at this rate for an undetermined amount of time, but certainly for at least 10 years. In 1975, Moore actually amended his original statement to say the number of transistors would double every two years. When you think about it, Moore’s theory was more of a challenge to the industry, lending itself to a “Let’s see if we can do it” approach – comparable to stuffing as many co-eds into a Volkswagen Beetle as you can.
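To see how much that 1975 revision changed the prediction, here’s a quick back-of-the-envelope sketch (my own illustration, using a hypothetical starting count of 1,000 transistors) comparing doubling every year with doubling every two years:

```python
# Hypothetical numbers, just to show how the two doubling rates diverge.
def transistors(start, years, doubling_period):
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

print(transistors(1000, 10, 1))  # 1965 rate: 1,024,000 after 10 years
print(transistors(1000, 10, 2))  # 1975 rate: 32,000 after 10 years
```

Over just a decade, the original yearly-doubling prediction overshoots the revised one by a factor of 32.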

In a blog post I read on the Low Power Design Community, Mythbusters: Moore’s Law, Low Power and the Future of Chip Design, the blogger suggests that it’s not Moore’s Law itself that is on its way out, just the traditional methods of achieving it. He notes that companies like Intel and IBM are investigating ways to perpetuate Moore’s Law through nanotechnology-based scaling, 3D transistors, and even 3D packaging.

What was more of a concern, however, was the Newsweek article that suggests Moore’s Law doesn’t even MATTER, because even if it can be achieved indefinitely, the operating systems, programming languages and programmers don’t have the wherewithal to create parallel programs that capitalize on the multi-core capability of these chips. In essence, what’s the point of striving to perpetuate Moore’s Law if the resulting device’s full potential can’t be achieved?
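The point about parallel programs is worth making concrete. A minimal sketch (my own example, not from either article) of what that software effort looks like: the same CPU-bound work done serially, then split explicitly into chunks so separate processes can run on separate cores. The extra cores do nothing for you unless the code is restructured this way.

```python
# Illustration only: counting primes by trial division, serially vs. in parallel.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def serial(limit):
    # One process, one core, no matter how many the chip has.
    return count_primes((0, limit))

def parallel(limit, workers=4):
    # The programmer must split the work; each chunk can then use its own core.
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # absorb any remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    # Same answer either way; only the parallel version can exploit multiple cores.
    print(serial(10_000), parallel(10_000))
```

Nothing here is automatic: the chunking, the worker pool, and the recombination of results are all on the software side, which is exactly the wherewithal the article says is in short supply.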

Which brings us back to the VW analogy; sure, you can stuff 34 people into an old VW Beetle (27 in a new one), but does it really matter if you can’t get anywhere? The ball is clearly now in the court of the software guys. ~ F.v.T.