It’s no longer just Advanced Micro Devices that Intel has in its sights.
When Intel executives sat down recently to outline the upcoming spring Developer Forum, which starts April 2 in China, Pat Gelsinger, senior vice president of the Digital Enterprise Group, spent a significant amount of time discussing the graphics capabilities of both “Nehalem,” the company’s next chip microarchitecture set for release later this year, and “Larrabee,” a multicore processor that will contain an integrated GPU (graphics processing unit).
These disclosures from Intel seemed to solidify what many have suspected for some time: The chip giant is prepared to move deeper into the graphics market, challenging Nvidia, long known for its graphics chips for mainstream computers and gaming desktops.
While it’s not primarily known for its graphics technology, Intel commanded 43.5 percent of all GPU shipments in the fourth quarter of 2007, while Nvidia controlled 33.6 percent, according to a January report from Jon Peddie Research.
It’s more than just the graphics that go into PCs that Nvidia and Intel are likely to clash over in the coming months. Both companies also see GPU and graphics technologies as a way to push deeper into the HPC (high-performance computing) market, a lucrative field where graphics processors could increase overall compute performance.
“All three companies, AMD, Intel and Nvidia, are going after this emerging market for high-performance computing that includes applications for oil and gas, health care and medical imaging,” said John Spooner, an analyst with Technology Business Research. “These chip companies are looking to put supercomputer-like capabilities inside x86 boxes to run applications like crash test simulations and mechanical design. It’s a market where the customer cannot get enough performance per dollar.”
That’s not to say that this is the only market where Intel and Nvidia will clash. The gaming market is another area where the two will meet with competing platforms to offer the most realistic gaming environments.
While the Larrabee chip will likely appear sometime in late 2009 or 2010 and has some mainstream applications, its initial appearance will likely land in the HPC field, said Jim McGregor, an analyst with the InStat Group.
“The type of chip that Intel described [Larrabee] is primarily for the ultra high end of the market and high-performance computing, especially if you look at the diagrams and see that they are offering an x86 core and a graphics processor that you could program like an x86 core and has the ability to perform floating point calculations,” McGregor said.
Nvidia does not want to be left out.
Intel’s Looming Battle with Nvidia
In June 2007, Nvidia signaled that it was ready to move out of the traditional graphics market by using its own GPU technology within HPC. The result is called Tesla, which offers 128 processing cores that work in parallel and provide more than 500 gigaflops (500 billion floating point calculations per second) of performance.
Since most developers do not create applications that work exclusively with a GPU, Nvidia also developed CUDA (Compute Unified Device Architecture), a programming language that allows the GPU to be programmed like an x86 CPU. This seems to mean that Nvidia will not need to develop its own CPU or buy a company such as Via, which makes low-watt x86 processors.
Andy Keane, general manager of GPU computing at Nvidia, said the fact that both Intel and AMD are working toward integrating the GPU onto the silicon shows that the CPU has reached the limits of Moore’s Law, the observation that the number of transistors on a chip doubles about every two years. Intel and AMD are trying to add performance by incorporating the graphics onto the silicon itself, Keane said.
In Nvidia’s reasoning, the GPU, not a traditional x86 chip with more processing cores, is the key to moving computing forward. With a GPU, Keane said, Nvidia can keep expanding the die size (Intel, by contrast, is moving to shrink its silicon), which allows the company to add more features onto the die, increasing performance and securing its place in HPC as well as a host of other fields.
“People will very quickly figure out that a separate GPU, a GPU that is not on the die [with the CPU], provides a better experience in both lifestyle and graphics applications than a free GPU that’s been integrated onto the CPU,” he said.
While this sort of offering can address issues within standard PCs (discrete versus integrated graphics), Nvidia is also betting that the GPU alone can meet the needs of the HPC market.
What Nvidia is doing with Tesla is twofold.
First, it’s using the technology to increase performance within the data center by outfitting servers with a much faster processing engine. The second goal is to bring HPC to workstation PCs, moving complex scientific applications out of the data center and onto the desktop.
“How do you give scientists and engineers the ability to run a good portion of their applications at the desktop?” Keane asked. “That’s the exciting thing with HPC. All of those scientists that have had to use shared resources have watched their applications slow down …. Giving a person more and more compute power is much better than concentrating it in the backroom. The GPU is something that can do that because we are already in your PC.”
McGregor said the HPC area remains wide open with enough room for Nvidia, AMD and Intel to offer a number of competing products that look to solve problems within this field. While Nvidia focuses on the GPU, McGregor said developments such as Intel’s 80-core terascale processor and its low-watt Silverthorne core will help that company’s efforts to move further into the HPC field.
Another advantage Intel has is its willingness to invest in and work with the developer community.
Along with Microsoft, Intel donated $20 million to fund research into developing a new generation of developers who work with multicore, multithreaded processors, Spooner said.
“Nvidia has CUDA and Intel has their development tools and they are both trying to make it as easy as possible in order to win over developers,” Spooner said.