Theoretical physicist Michio Kaku is calling it: Moore’s Law ain’t what it used to be. The principle, which holds that computing power doubles roughly every 18 months, is reaching its limits. In fact, in this recent Big Think video, he gives it about a decade. Nor does he hedge: “In about ten years or so, we will see the collapse of Moore’s Law. In fact, already, already we see a slowing down of Moore’s Law,” he says.
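To get a feel for what that 18-month doubling implies, here is a back-of-the-envelope sketch (the function name and numbers are illustrative, not from the video):

```python
def moore_factor(years, doubling_period_years=1.5):
    """Growth factor implied by doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Doubling every 18 months compounds to roughly a 100x gain per decade:
print(round(moore_factor(10)))  # ≈ 102
```

That hundredfold-per-decade pace is exactly what makes any slowdown so noticeable.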
What’s the issue? Eventually, around the time transistors shrink to five nanometers, heat and electron leakage will render Pentium chips useless. Hence Intel’s shift to three-dimensional transistor technology for its next-generation Ivy Bridge chips, which will allow the company to maintain the pace for a bit longer. But Mr. Kaku holds that the end of the line is inevitable:
If I were to put money on the table, I would say that in the next ten years we’ll simply tweak Moore’s Law a bit with chip-like computers in three dimensions, but beyond that we may have to go to molecular computers and perhaps late in the 21st century quantum computers.
To us, that sounds more like the end of silicon than the end of Moore’s Law. It’s also worth mentioning that Intel is a little sunnier on the subject. MIT Technology Review recently spoke to Intel’s Mark Bohr, who heads up efforts to translate chip innovations into manufacturable reality, and he sounded relatively upbeat: “It’s becoming more challenging, but I don’t see the end [to Moore’s Law].”
Anyone waiting around for the Singularity might want to take Mr. Kaku’s projections into account, however.