Parallel Computing's Already Here

Microsoft's Craig Mundie is right. Parallel computing is going to change everything about computing, but his timing is off. Thanks to Linux, it's been here for years.

The future is today. When Microsoft's chief research and strategy officer, Craig Mundie, looked into his crystal ball and predicted that parallel computing would be the next revolution in computing, he was quite correct. It's just that he was looking into the past. Parallel computing has been here and changing the world for years.

The problem with parallel computing today is that we don't tend to see it. Just as the Internet had been around for decades before the parallel rise of the CIX (Commercial Internet Exchange) and the Web in the early '90s transformed the world, most of us are blind to the changes that today's multicore processors and MPP (massively parallel processing) have already made.

CIX made commercial use of the Internet possible, letting people make money from the Net. The Web was its killer application.

Today, multicore processors and MPP are already in use, just as ftp, e-mail and gopher were in use on the pre-Web Internet. Some applications, such as video-editing programs like Sony Vegas Movie Studio and Nero, are already making use of dual-core processors and parallel processing today.

It's not just video, though. Supercomputers are made up of MPP arrays of hundreds or even thousands of ordinary Intel or AMD chips running Linux. The current top supercomputer, the IBM Blue Gene/L system at Lawrence Livermore National Laboratory, can hit 478.2 teraflops (trillion floating-point operations per second) with its tens of thousands of PowerPC processors running Linux. Indeed, 85.2 percent of the fastest 500 supercomputers in the world run Linux.

Besides doing things you won't notice in your day-to-day life, like breaking the human genome down into smaller, more comprehensible pieces and making weather prediction closer to a science than an educated guess, almost every time you watch a movie with digital special effects you're seeing Linux, multicore processors and parallel processing at work.

Autodesk's Linux-based digital visual effects program Flame, for example, has been used in movies from the children's movie "Charlotte's Web" to the re-launch of James Bond in "Casino Royale" to swashbuckling in "Pirates of the Caribbean: At World's End" and, of course, science fiction thrillers like "Fantastic Four: Rise of the Silver Surfer." At the lower end, programs like Autodesk Maya are also available for Windows and Mac OS X.