IBM to Spend $3 Billion to Research the Future of Chips, Systems

The industry can continue to shrink conventional silicon chips to 7nm over the next several years, IBM says, but what comes after that is unclear.


IBM will spend $3 billion over the next five years on projects that will not only help shrink current processor architectures to at least 7 nanometers, but also fund research into what will replace traditional silicon chip architecture when it reaches its physical limits.

For years, chip makers like IBM and Intel have been shrinking the circuitry on silicon processors as they've looked to improve performance, increase energy efficiency and reduce the size of the systems they power. For example, Intel currently offers chips at 22nm, with plans for 14nm next year. IBM officials said that over the next few years, the miniaturization will continue to 10nm and then 7nm.

However, what's next beyond that is unclear. Eventually the circuitry will shrink to the point where "you can't build a reliable device," Bernie Meyerson, IBM Fellow and vice president of Innovation at the company, told eWEEK. "You can't make it work."

Given that, IBM is working on what might come next and will lead to new system architectures, Meyerson said. Such possibilities run from quantum computing and neurosynaptic computing to carbon nanotubes and graphene.

What IBM is doing now is something the company has done throughout its history: using its researchers to anticipate major changes in the industry and create solutions to challenges that may be five or 10 years down the road, he said. Meyerson pointed to the shift to CMOS technology in the late 1980s and early 1990s. He also cited the end of the 1990s, when company researchers decided that increasing chip frequency was no longer the best way to improve performance and shifted to using multiple cores in processors, releasing the first of its multi-core Power processors at the end of 2001.

"If you have a research division looking at the horizon all the time, this shouldn't come as a big surprise," he said. "We have pretty good headlights. Our headlights go a long way."

The transitions to CMOS and multiple cores were moments when IBM "needed to bet the farm," Meyerson said. This latest move represents a similar moment. "You have to make a bet," he said.

The industry for decades has worked to keep up with Moore's Law, the idea voiced by Intel co-founder Gordon Moore that the number of transistors on chips would double every 18 to 24 months. While many in the chip industry say they are keeping up with Moore's Law, doing so through the shrinking of transistors and circuitry is increasingly difficult. Meyerson, for his part, argued that it has "been dead for a decade or so."
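The exponential growth Moore described is easy to illustrate with a few lines of arithmetic. The sketch below is a simple back-of-the-envelope model, not an IBM or Intel projection; the 1971 starting figure (roughly 2,300 transistors on Intel's 4004) is a commonly cited historical data point, and the two-year doubling period is an assumption at the optimistic end of Moore's 18-to-24-month range.

```python
# Idealized Moore's Law: transistor counts double every `doubling_years`.
# Baseline: Intel's 4004 (1971), roughly 2,300 transistors.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Estimate transistor count per chip under an idealized Moore's Law."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2011):
    # 20 years = 10 doublings = a ~1,000x jump per two decades
    print(y, f"{transistors(y):,.0f}")
```

The point of the exercise is the compounding: each two-decade span multiplies the count by about a thousand, which is why even a small slowdown in shrinking transistors puts the trend in jeopardy.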

The challenges are many. Cloud and big data applications are putting pressure on systems for better performance, greater bandwidth capacity and more memory, while businesses also are demanding computers that consume less power. Conventional chip designs will be able to get to 7nm and maybe a little smaller, but beyond that, the challenges and complexities become daunting.

That's where new technologies and materials like carbon nanotubes, graphene and silicon photonics, as well as new computing models like quantum and neuromorphic computing, come into play.

"We needed to make a major investment here to look at the next step, a Plan B," Meyerson said, adding that IBM already has made progress in many of these areas.

Quantum computing would enable systems to process millions of calculations at the same time rather than one at a time. In traditional computing, bits can only hold values of "1" or "0." However, quantum bits, or "qubits," can hold values of 1, 0, or both at the same time, opening up the possibility of systems running through millions of calculations simultaneously. Meyerson compared it to communications between humans.

"It would be frustrating to have a conversation where you could only say 'yes' or 'no,'" he said. "What if you could say 'maybe'?"
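The "maybe" in Meyerson's analogy can be made concrete with a little math. The sketch below is a minimal illustration, not IBM's quantum hardware or software: it models a single qubit as a pair of amplitudes. A classical bit is exactly "0" or "1"; a qubit in superposition carries weight on both outcomes, and only a measurement forces it to one or the other, with probabilities given by the squared amplitudes.

```python
import math

def measure_probabilities(alpha, beta):
    """Probabilities of reading 0 or 1 from the qubit state alpha|0> + beta|1>.

    A classical bit corresponds to (alpha, beta) = (1, 0) or (0, 1);
    any other normalized pair is a superposition of both values.
    """
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition: the qubit is "both at the same time" until measured.
h = 1 / math.sqrt(2)
p0, p1 = measure_probabilities(h, h)
print(round(p0, 3), round(p1, 3))  # each outcome is equally likely: 0.5 0.5
```

A classical bit in this model always yields one outcome with certainty; the superposition state splits the odds evenly, which is the single-qubit version of "maybe."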