Among the digerati, one of the leading sources of anxiety is the perennial prediction that Moore's Law is destined for a head-on collision with the laws of physics. Such predictions are bear bait on Wall Street, where the premise of Moore's Law, the doubling of computer power every 18 to 24 months, is widely accepted as a prescription for uninterrupted market growth and steady improvement in worker productivity.
The doomsayers point to the fact that further reductions in the size of silicon transistors, the key factor in perpetuating Moore's Law over the last four decades, will hit hard physical limits around 2015. But transistor density and the physical properties of silicon are only two of the many variables determining how much power can be packed into a given space or delivered for a given number of dollars.
"There is life after silicon," said Philip Wong, senior manager for exploratory devices at IBM Research, in Yorktown Heights, N.Y. "On the other hand, the life of silicon itself may be a lot longer than what people have been led to believe."
So, what are we trying to describe when we invoke Moore's Law?
If the issue is narrowly defined as transistor density on a silicon chip, no one denies that fabrication will soon reach limits at which gate structures only a few atoms thick, probably just under 9 nanometers, can no longer control the flow of electrons, negating the semiconducting properties of silicon.
But silicon transistors are only one type of logic gate, the on-off triggers that do the actual processing. More often than not, Moore's Law is invoked in a larger sense to describe exponential growth in raw computational power, irrespective of its underlying technology. In that sense, innovations in basic chip architecture and three-dimensional circuit design, as well as new paradigms such as molecular transistors, carbon nanotube gates and quantum computing, will probably continue for many decades to produce growth at a rate roughly equal to, or even greater than, the exponential curve described by Moore's Law.
In fact, inventor and author Ray Kurzweil said he believes we can expect the process to accelerate at a double exponential rate.
"Moore's Law was not the first but the fifth paradigm to provide exponential growth to computing," said Kurzweil, in Cambridge, Mass., who has calculated the rise in computing power since 1900 for his forthcoming book titled "The Singularity Is Near." The first four paradigms, he said, were electromechanical, punch-card-based calculators used in the 1890 census; relay-based computers, most notably Alan Turing's machine for cracking the Nazi Enigma code; vacuum-tube computers commercialized in the early 1950s; and discrete transistor-based machines such as the computers used in the first NASA launches.
The resulting curve, Kurzweil said, suggests an exponential continuum along which Moore's Law accounts for a relatively small stretch of intellectual real estate. This larger continuum, which is coming to be known in some futurist circles as Kurzweil's Law of Accelerating Intelligence, or simply Kurzweil's Law, foresees faster growth in computational power over the next several decades than Moore's Law predicts.
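The difference between plain exponential and double exponential growth can be sketched with a toy calculation. The model below is purely illustrative (the 5 percent annual shrink rate and the starting doubling period are assumptions for the sketch, not Kurzweil's actual parameters): ordinary exponential growth doubles at a fixed interval, while accelerating growth has a doubling interval that itself shrinks over time.

```python
# Illustrative sketch, not Kurzweil's model: fixed doubling period
# (Moore's Law style) vs. a doubling period that shrinks each year.

def exponential(t, doubling_years=2.0):
    """Relative power after t years with a fixed doubling period."""
    return 2 ** (t / doubling_years)

def double_exponential(t, initial_doubling=2.0, shrink_rate=0.05):
    """Relative power after t years when the doubling period shrinks
    by a fixed fraction each year -- a simple stand-in for growth
    whose rate is itself accelerating."""
    power, doubling, years = 1.0, initial_doubling, 0.0
    while years < t:
        step = min(1.0, t - years)        # advance one year at a time
        power *= 2 ** (step / doubling)   # grow at the current rate
        doubling *= (1 - shrink_rate) ** step  # then speed up
        years += step
    return power

print(exponential(30))         # 32768.0 -- 15 fixed doublings
print(double_exponential(30))  # far larger: the curve keeps steepening
```

Over short spans the two curves look similar; the divergence compounds over decades, which is why the choice of model matters so much for long-range forecasts.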
"The next paradigm, the sixth, will be three-dimensional molecular computing," Kurzweil said. "In the past year, there have been major strides, for example, in creating three-dimensional carbon nanotube-based electronic circuits."
Leading-edge research supports the uninterrupted continuum described by Kurzweil's Law. For example, last October, Lucent Technologies Inc.'s Bell Labs, in Murray Hill, N.J., where the transistor was invented in 1947, announced that two of its scientists, Hendrik Schön and Zhenan Bao, had succeeded in fabricating molecular-scale organic transistors.
And at IBM Research and other labs, Wong said, researchers "have demonstrated that you can make a transistor out of these materials and yet have a performance that is projected to be very competitive with where we expect silicon technology will be in about 10 years."
Not surprisingly, most of the excitement generated by that announcement focused on the new life such transistors, as small as 1 nm, could give Moore's Law. But a largely overlooked aspect of the breakthrough was its impact on what is often referred to as Moore's Second Law: Intel Corp. founder Gordon Moore's observation in 1995 that while the power of chips doubles every two years, the cost of fabricating them doubles every three years.
This aspect of chip evolution is often overlooked because it's assumed that the ratio of performance to cost will ensure a healthy market as long as gate density keeps expanding faster than fabrication costs. So far, that logic has held. A discrete transistor cost between $5 and $45 to make in the 1950s. Today's IC (integrated circuit) transistors cost less than a hundred-thousandth of a cent. For all practical purposes, they are free, or at least incidental to the cost of a device.
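The arithmetic behind that logic can be sketched directly. Using only the two rates named above (power doubling every two years, fabrication cost doubling every three), density per fabrication dollar still doubles roughly every six years; the baseline values in the sketch are arbitrary placeholders, not figures from the article:

```python
# Toy model of the two curves: transistor count doubling every 2 years
# (Moore's Law) vs. fab cost doubling every 3 years (Moore's Second Law).
# Baselines are normalized to 1.0 purely for illustration.

def transistor_count(years, base=1.0, doubling_period=2.0):
    """Relative transistor count after `years`."""
    return base * 2 ** (years / doubling_period)

def fab_cost(years, base=1.0, doubling_period=3.0):
    """Relative fabrication cost after `years`."""
    return base * 2 ** (years / doubling_period)

for y in (0, 6, 12):
    ratio = transistor_count(y) / fab_cost(y)
    print(f"year {y:2d}: density x{transistor_count(y):4.0f}, "
          f"cost x{fab_cost(y):3.0f}, density-per-dollar x{ratio:.0f}")

# The ratio grows as 2**(y/2 - y/3) = 2**(y/6), i.e. it doubles every
# six years -- performance per dollar keeps improving even while the
# absolute cost of a fab explodes.
```

The catch, as the next paragraph notes, is that the ratio improving on paper does not help if the absolute capital outlay for a fab exceeds what any one company can raise.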
Even so, chip makers could reach their financial limits before they reach the physical limits of silicon. If fabrication costs continue rising at the current rate, they could outstrip the capital resources of even the largest chip makers by the middle of the next decade, especially as competitive pricing squeezes margins.
This is where molecular transistors may have their biggest advantage. By fusing physics and chemistry in a process known as chemical self-assembly, critical portions of a chip can essentially be fabricated in a beaker, doing away with the need for the increasingly exotic lithography and clean rooms that have driven the costs of chip production into the stratosphere.
Assuming that researchers hit no major barriers in constructing circuits from molecular transistors, in the next 15 years we could expect to see density increases in the neighborhood of 10^6 (a million) times today's most advanced silicon chips, a threshold of computing power that could support speech, sensory and decision-making functions approximating human intelligence. And it probably could be done with a significant collateral decrease in cost per transistor.
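A quick back-of-the-envelope check shows how aggressive that projection is: a millionfold increase in 15 years implies roughly 20 doublings (2^20 is about 10^6), or one doubling every nine months or so, well ahead of the classic 18-to-24-month Moore's Law pace and consistent with the accelerated curve described earlier.

```python
import math

# How fast must density double to grow a millionfold in 15 years?
factor = 1e6
years = 15
doublings = math.log2(factor)                  # 2**20 ~ 10**6, so ~19.9
months_per_doubling = years * 12 / doublings
print(f"{doublings:.1f} doublings -> one every "
      f"{months_per_doubling:.1f} months")
# -> 19.9 doublings -> one every 9.0 months
```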
The resulting impact on device development and IT operations is expected to be enormous, but in ways that may not be obvious today and that are likely to greatly expand IT's responsibilities beyond today's traditional server/client application focus.
"Sometimes, you hear the feeling expressed that computational power has gotten ahead of user applications and software usability," IBM's Wong said. "On the other hand, if you look at the things that people desire to do with technology, there are many things we could do with increased computing that people want to do today. For example, on-the-fly language translation and instant communication in a small form factor, like computation-type watches. ... Looking into the future, you will be able to march down the Moore's Law curve and do all these kinds of devices while reducing their form factor and the power consumption."
As Wong uses the term, Moore's Law is a technology-agnostic predictor of logic gate densities that morphs smoothly into the larger continuum of Kurzweil's Law as silicon gives way to the nanotube, which in turn gives way to quantum or other computing paradigms that might seem like science fiction today.
So, increasingly, prophecies about Moore's Law are just semantic exercises. As widely used today, the term is an inflated label for what was originally nothing more than a casual observation Moore made in 1965 to suggest healthy growth over a single decade for the then-nascent silicon chip industry. That was three years before Moore cofounded Intel.
The world's most powerful IC in 1965 was made by Moore's employer at the time, Fairchild Semiconductor International, of South Portland, Maine. It contained 64 transistors, and his observation was based on a mere three cycles of growth: from a single transistor in 1959 to 32 transistors on a chip in 1964 to 64 transistors in 1965.
Yet, over almost four decades, his observation has proved to be surprisingly, and profitably, accurate. By 2000, Moore's 64-gate chip had evolved into the Pentium 4 processor's 42 million transistors. More recently, Intel, in Santa Clara, Calif., demonstrated the world's smallest silicon transistor, with a gate length of 15 nm.
That kind of uninterrupted progress has further cemented Moore's cycle of expanding computer power as a cornerstone of our culture and economy, something akin to a constitution for the information age. Much of the world has come to believe that Moore's Law has driven IT's impact on the economy and on productivity in the last decade. In July last year, the Joint Economic Committee of Congress reported that the nation's productivity had almost doubled from 1995 through 2000 and concluded that "at least half of the recent increase in labor productivity ... is attributable to IT."
In fact, the assumption of exponential increases in raw processing power has become so key to anticipating growth in productivity that the committee felt compelled to warn, "The efficiency gains in IT production, particularly semiconductors, will eventually run into physical constraints; Moores Law cannot hold indefinitely."
Fortunately, that prediction, when extrapolated beyond conventional semiconductors, probably won't be tested for at least several decades and possibly for as long as there are human beings. While no one has ever seriously claimed that Moore's Law is a basic rule of nature, Kurzweil argues that the continuum of exponential growth in machine intelligence is fundamental to human evolution.
"Exponential growth and acceleration are inherent in any evolutionary process," Kurzweil said, "and they characterize both biological and technological evolution."