Life after Moore's Law


— 9:20 AM on September 9, 2011

Recently, I've been turning the pages of Michio Kaku's new book, Physics of the Future. While much of its content echoes what was said in his earlier work, Physics of the Impossible, one section in particular grabbed my attention: Kaku's discussion regarding the end of Moore's Law.

This topic has received no shortage of attention. Pundits have predicted the end of Moore's Law ever since its inception some 46 years ago. Even so, few people today seem to agree on a precise day of reckoning. Kaku believes that by 2020, or shortly thereafter, transistors will run up against their atomic size limits and Moore's Law will break down. Years ago, Intel predicted this event would occur at a 16-nm process node with 5-nm gates, yet it has plans on the table for 15-nm and 11-nm process nodes going forward. When wires and gates get too small (about five atoms thick), electrons begin to stray from their dictated paths and short-circuit the chip. That leakage makes shrinking transistors any further a futile endeavor.
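As a sanity check on that 2020 figure, here's a quick back-of-envelope projection in Python. The numbers are my own illustrative assumptions, not Kaku's or Intel's: start from the 22-nm generation ramping in 2011, shrink the minimum feature size by roughly 0.7x (half the area) every two years, and see when it approaches the ~5-nm gate size cited above.

```python
# A rough, illustrative projection (assumed figures, not from the book):
# minimum feature size shrinks by ~0.7x every two years, starting from the
# 22-nm generation in 2011, until it nears the ~5-nm gate limit cited above.
feature_nm, year = 22.0, 2011
while feature_nm > 5.0:
    feature_nm *= 0.7          # ~1/sqrt(2) linear shrink per two-year cycle
    year += 2
    print(f"{year}: ~{feature_nm:.1f} nm")
# Crosses the ~5-nm mark around 2019-2021, roughly in line with Kaku's estimate.
```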

When transistors can no longer be made smaller, the only way to continue doubling the transistor count every two years is to build upward or outward. Stacking dies poses challenging heat dissipation and interconnect problems, while making larger dies and linking them together in a single package is only sustainable up to a point. Indeed, with 22-nm production ramping up, it seems that the zenith of silicon-based IC design is finally a legitimate point on the horizon.

The paranoid among us may see the recent netbook mantra of "good enough computing" as a ploy by CPU manufacturers attempting to acclimate users to an impending period of diminishing performance returns. More interesting to me than when we break the law, though, is what the consequences will be as advancements in cheap computational power begin to level off. Will research and advancement in other areas, such as genomics, decline at the same rate as commodity computing power? Will longer silicon refresh cycles pour salt in the wound of an already ailing world economy? Will people even notice or care?

I believe those of us in the enthusiast realm will indeed notice when transistor counts begin to level off, but for the vast majority of everyday users, it probably won't matter. As we near the end of the decade, the bulk of our daily computing burden will likely be lifted from our desks and placed on the backs of "cloud" companies that handle the heavy lifting. This arrangement will provide a comfortable layer of insulation between the masses and the semiconductor manufacturers: scaling issues will be dealt with quietly in the background, and users will get an end product that "just works." You'll still be able to find laptop-toting hipsters at your local coffee spot, and locally installed software will still be commonplace. But by the end of the decade, the size of your Internet pipe may be more important than the speed of your processor.

As transistor counts stagnate, a combination of clever parallel programming techniques and engineering tricks at the silicon level will become even more important than they are today. These tweaks will be required to keep the industry moving forward until a post-silicon computing era can take root. There are several prospects on the radar to replace traditional silicon chips, including graphene, light-based logic circuits, and quantum processors. The next big thing beyond silicon is still anybody's guess, though.
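To make the parallel-programming point concrete, here's a minimal sketch. It's purely illustrative and not tied to any particular chip or toolkit: the same CPU-heavy work spread across however many cores a machine happens to have, which is the kind of restructuring more software will need once single-thread gains slow down.

```python
# Illustrative only: spreading an embarrassingly parallel workload across all
# available cores with Python's standard library. When per-core speed stops
# improving, this kind of restructuring keeps software getting faster.
from concurrent.futures import ProcessPoolExecutor
import os

def simulate(seed: int) -> int:
    """Stand-in for a CPU-heavy task (one frame, one genome chunk, etc.)."""
    total = 0
    for i in range(1_000_000):
        total += (seed * i) % 7
    return total

if __name__ == "__main__":
    jobs = range(32)
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(simulate, jobs))   # jobs run on separate cores
    print(f"{len(results)} jobs done across {os.cpu_count()} cores")
```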

The biggest questions in my mind, however, revolve around the world economy and its reaction to silicon scaling issues of the future. Will the mantra of "good enough," coupled with incremental improvements over time, be sufficient to stave off a meltdown in the tech sector? Will device upgrade cycles lengthen, or will users continue to purchase new toys at the same rate, even though they aren't much faster than their predecessors? Will software begin to outpace hardware, creating enough computational scarcity that market forces drive efforts to advance computing to the next level?

There are a lot of unknown variables at play here, and the capital required to research and develop silicon's successor is staggering. Further damage to the financial sector could potentially slow down progress toward the post-silicon era if R&D funding dries up. Similarly, the fallout from various government debt crises could limit future investments in technology, despite immense interest in quantum computing for cryptography.

Before this starts sounding too much like a doom-and-gloom, fear-mongering editorial, it should be understood that the end of Moore's Law does not spell out the end of all advancement. It merely suggests that the gains we're used to seeing arrive every two years will come on a slower schedule. Progress will march on, and there will be ample time to develop new technologies to pick up where silicon leaves off. Of all the items on the post-silicon wish list, the technology that makes me feel the most warm and fuzzy inside is superconductors—specifically, the potential discovery of a room-temperature superconductor.

A superconductor is a material that loses all electrical resistance when cooled below a certain temperature. In theory, assuming an absence of outside forces, electrons could zip around a superconducting ring forever with no loss of energy. Materials have already been discovered that lose all resistance at temperatures easily attainable using cheap liquid nitrogen. However, finding a material that does so at room temperature would represent the holy grail for scientists and electrical engineers. Imagine a processor whose interconnects and transistors were crafted from a superconducting material. Such a beast would be able to operate with next to no electrical leakage and minimal heat generation despite running at insane clock speeds. There are many other novel and mind-boggling uses for superconductors, particularly in the transportation sector, but such a discovery would be a huge boon to the technology world. Nothing out there proves room-temperature superconductors actually exist, but the end of Moore's Law could serve as incentive to ramp up the search efforts.
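To put a rough number on why zero resistance matters, here's a tiny illustrative calculation using made-up, round figures of my own. Resistive (Joule) heating follows P = I²R, so whatever heat an ordinary interconnect dissipates that way, a superconducting one dissipates none of it. (This only covers the resistive piece; it says nothing about other sources of power draw, such as switching losses.)

```python
# Illustrative only: resistive (Joule) heating is P = I^2 * R, so it vanishes
# when R = 0. The figures below are round, assumed numbers for a single trace.
def joule_heating_watts(current_amps: float, resistance_ohms: float) -> float:
    return current_amps ** 2 * resistance_ohms

copper_like = joule_heating_watts(current_amps=0.5, resistance_ohms=2.0)
superconducting = joule_heating_watts(current_amps=0.5, resistance_ohms=0.0)
print(f"copper-like trace: {copper_like:.2f} W, superconducting: {superconducting:.2f} W")
```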

Looking ahead to a time when chip makers are no longer able to significantly shrink transistors, how do you think the world will cope? Personally, I predict things will be business as usual in a post-Moore world. In the worst-case scenario, compute power will expand to meet demand inside large, cloud-based server farms, where performance can grow by adding more CPUs to the mix and use is metered and billed like a utility. In the best-case scenario, we'll all eventually be playing holographic Crysis on personal desktop quantum computers or shattering clock speed records with cool-running, superconducting CPUs. For those interested in computing and the physics behind it, the next couple of decades should provide quite a show. Who else wants some popcorn?
