As a follow-up to my “Fifty year intervals” post: will changes in computing technology over the next fifty years really be as radical as, or even more radical than, the “punch cards to smartphones and Google’s entire product line” transition of the past fifty years?
A simple argument for the answer being “no” is that improvements in computer hardware have generally depended on shrinking transistor sizes. In fact, the strict interpretation of Moore’s law is that it refers to transistors specifically, not computing power in the abstract. And not many people know this, but we’re rapidly approaching the point where it will be physically impossible to shrink transistors any further, because their size will be measured in atoms.
How rapidly? In a 2001 article, Ray Kurzweil (whom, if anything, you’d expect to be over-optimistic on this issue) said that “Moore’s Law will die a dignified death no later than the year 2019.” Somewhat more recently, Sandberg and Bostrom cite an industry report projecting current trends out to 2022. So some time around then, or not too long after, we can expect to lose the ability to shrink transistors any further.
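As a rough sanity check on these dates, here is a toy calculation of when literal one-atom features would arrive. Every number in it is an illustrative assumption, not a sourced figure: a 22 nm starting feature size, a 2012 reference year, a ~0.2 nm silicon atom, and the naive premise that each Moore’s-law doubling comes entirely from shrinking (so linear size falls by a factor of √2 every two years).

```python
import math

# All figures below are illustrative assumptions, not sourced data.
start_year = 2012        # assumed reference year
feature_nm = 22.0        # assumed leading-edge feature size then, in nm
atom_nm = 0.2            # rough diameter of a silicon atom, in nm
doubling_years = 2.0     # classic Moore's-law doubling period

# If transistor density doubles purely by shrinking, linear feature
# size falls by sqrt(2) per doubling. Count the doublings needed to
# reach one atom across:
doublings = math.log(feature_nm / atom_nm) / math.log(math.sqrt(2))
year_limit = start_year + doublings * doubling_years

print(round(doublings, 1), round(year_limit))  # → 13.6 2039
```

Under these toy assumptions the hard atomic wall sits a couple of decades out, later than the 2019 and 2022 dates above, which makes sense: those projections fold in economic and engineering limits, not just the physics of atom-sized transistors.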
However, there are proposals to squeeze more computing power out of every dollar, every pound, what have you, that don’t involve shrinking transistors. Both of the above links discuss a number of them. And there’s at least some reason to be optimistic about them; I’ve researched this issue as part of my work for MIRI, and everyone who’s looked at it seems to agree that the current exponential growth in computer power predates the integrated circuit, though there’s disagreement about whether it goes back to the early 20th century or just to WWII.
Sandberg and Bostrom write:
Pessimistic claims are often made to the effect that limitations of current technology are forever unsurpassable, or that theoretically possible technological systems such as the above will be too costly and practically difficult to ever become feasible. Still, given that computer technology has developed in a relatively stable manner despite several changes of basic principles (e.g. from flip‐flops via core memory to several semiconductor generations, from vacuum tubes to transistors to integrated circuits etc) there is no strong reason to assume they will break because current technology will eventually be replaced by other technologies. Large vested interests in continuing the growth are willing to spend considerable resources on closing technology gaps. A more likely end of growth scenario is that the feedback producing exponential growth is weakened by changes in the marketplace such as lowered demand, lowered expectations, rising production facility costs, long development times or perhaps more efficient software.
I’m curious to know what other people think of this. I can’t claim to know much about how plausible the alternatives to shrinking transistors are as strategies for making more powerful hardware. Still, when I ask myself what I really think of this issue, I feel fairly confident that computing power will continue to increase fairly rapidly, if perhaps not as rapidly as before, far into the foreseeable future.
Why do I feel so confident? I’m not entirely sure. Maybe I’m being tricked by the fact that I’m still young and Moore’s law has been in effect for my entire lifetime, so I just can’t imagine life without it. On the other hand, if enough people can’t imagine life without it, and there’s any physical possibility of pushing forward past the limit of our ability to shrink transistors, then maybe it will become a self-fulfilling prophecy.
What do you think?