This article, A deeper law than Moore’s?, in The Economist, 10 October 2011, reports on an analysis that shows the computing power available for a fixed amount of energy has been approximately doubling every 1.6 years since the mid-1940s.
The full technical paper is here: Implications of Historical Trends in the Electrical Efficiency of Computing.
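As a rough sanity check on that figure, here is a short Python sketch. The 1.6-year doubling time comes from the article; the 1946 start year and the resulting improvement factor are my own back-of-the-envelope assumptions for illustration, not numbers quoted from the paper:

```python
# Rough illustration of Koomey's trend: computations per unit of energy
# doubling roughly every 1.6 years since the mid-1940s.
doubling_time_years = 1.6          # doubling period reported in the article
start_year, end_year = 1946, 2011  # assumed span: ENIAC era to the article's date

doublings = (end_year - start_year) / doubling_time_years
improvement = 2 ** doublings
print(f"{doublings:.1f} doublings -> roughly a {improvement:.1e}x "
      "improvement in computations per unit of energy")
# ~40.6 doublings, i.e. on the order of a trillion-fold improvement
```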
When you first introduced Moore’s Law in one of our earlier classes, you pointed out one of its very real limiting factors: given today’s computing power, battery life (energy efficiency) is what holds Moore’s Law back. In other words, you said that because our battery capabilities lag behind our computational progress, computing power will no longer keep doubling every two years.
The research in this new article by Jonathan Koomey seems to challenge the idea that we are limited by our battery capabilities. If this article is right, and energy efficiency is improving roughly in step with computing power (each doubling every two-ish years), will we ever see computer evolution slow down? And if not, will there be a point where too much is just… too much, and anything extra becomes superfluous?
This is a good point. Over the long history of technology, it seems that whenever the physical limits of one mechanism are reached, a new paradigm is discovered and progress resumes. For silicon semiconductor technology, we seem to be near the physical limits of what can be done. Processor clock speeds have not increased over the last five years; all the improvements have come from more cores, more energy efficiency, and, mostly, better software that uses the cores more efficiently.
There is nothing guaranteed about these trends, but the pattern of continuous improvement in technology is very strong, and humanity is driven to keep it going. I expect that as we run out of opportunities to further improve electronic computing, other mechanisms (molecular ones like computing with DNA seem the most likely, but there are many other possibilities) will eventually become practical and replace the current technologies.
I don’t think there is any limit on the demand for computing power. We’re just not very good at imagining what to do with 1 million times the power we have today, just like most people 30 years ago weren’t able to envision the PC revolution, and most people 15 years ago weren’t able to envision smartphones.
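To put a rough timescale on that “1 million times” figure, here is a small back-of-the-envelope calculation. It simply assumes the historical doubling rates mentioned above continue unchanged, which, as noted, is not guaranteed:

```python
import math

# How long would a million-fold increase in computing power take
# if performance kept doubling at a fixed rate?
target_factor = 1_000_000
doublings_needed = math.log2(target_factor)   # about 19.9 doublings

for doubling_time in (1.6, 2.0):              # Koomey-style vs. classic Moore-style pace
    years = doublings_needed * doubling_time
    print(f"At one doubling every {doubling_time} years: ~{years:.0f} years")
# Roughly 32 years at a 1.6-year doubling time, about 40 years at 2 years.
```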