This week's announcement of the Power Mac G5 workstation at the Worldwide Developers Conference in San Francisco had many Mac enthusiasts jumping for joy. But Steve Jobs' claim that the Power Mac G5 is "the world's fastest personal computer" created much skepticism and heated discussion on the Net.
Apple claimed that the latest Power Mac will break Intel's chokehold on the high-end workstation market in terms of price and performance, and released "proof": several benchmark charts showcasing the higher performance the G5 delivers compared with Xeon-powered workstations.
The G5 pricing will be extremely aggressive, with the entry 1.6GHz model starting at less than $2,000. And if the specifications released by Apple this week are any indication, the Power Mac G5 (to be released in August) is indeed impressive. The G5 uses the latest 64-bit PowerPC G5 processor, jointly developed by Apple and IBM, at speeds of up to 2GHz with 512KB of Level 2 cache. (IBM is also working on a 3GHz processor that will be available within 12 months.)
The 64-bit Power Mac G5 supports dual-processor symmetric multiprocessing with a 1GHz front-side bus, and can natively run both 32-bit and 64-bit applications. It can handle as much as 8GB of memory, twice the 4GB limit of conventional 32-bit computers.
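The 8GB figure follows directly from address-space arithmetic. A quick back-of-the-envelope sketch (mine, not Apple's) shows where the 32-bit ceiling comes from:

```python
# A 32-bit pointer can name 2**32 distinct byte addresses, which caps
# directly addressable memory at 4 GiB; 64-bit addressing lifts that cap.
GIB = 2 ** 30  # bytes in one gibibyte

limit_32bit_gib = 2 ** 32 // GIB  # 4 GiB ceiling for 32-bit addressing
g5_memory_gib = 8                 # RAM the Power Mac G5 accepts

print(limit_32bit_gib)                   # 4
print(g5_memory_gib // limit_32bit_gib)  # 2 -- "twice the limit"
```

A 64-bit address space (2**64 bytes) is so large that in practice the memory ceiling becomes whatever the chipset and DIMM slots allow, not the pointer width.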
The G5 also uses the latest in I/O and interconnect technologies, including support for PCI-X, 8x AGP, USB 2.0, FireWire 400 and 800, WiFi/Bluetooth, and Serial ATA.
Unfortunately, these attributes might get lost in all the noise that resulted from Apple's benchmark results.
Apple hired VeriTest, a third-party testing and consulting firm, to run the SPEC CPU benchmarks that pit the G5 against Xeon workstations. The tests show the G5 performing significantly better than dual-processor Xeon workstations.
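For readers unfamiliar with how SPEC CPU 2000 arrives at a single number: each component test's ratio is the reference machine's time divided by the measured time, scaled by 100, and the composite score is the geometric mean of those ratios. A sketch with made-up timings (not VeriTest's data) shows the mechanics, and hints at why compiler and tuning choices on individual tests can move the overall score:

```python
# Illustrative SPEC CPU 2000-style scoring. Benchmark names are real
# SPEC components, but all timings here are hypothetical.
import math

reference_times = {"gzip": 1400, "gcc": 1100, "mcf": 1800}  # seconds (hypothetical)
measured_times = {"gzip": 160, "gcc": 135, "mcf": 210}      # seconds (hypothetical)

# Per-test ratio: 100 * reference time / measured time.
ratios = [100 * reference_times[t] / measured_times[t] for t in reference_times]

# Composite score: geometric mean of the per-test ratios.
score = math.exp(sum(math.log(r) for r in ratios) / len(ratios))

print([round(r, 1) for r in ratios])
print(round(score, 1))  # always lies between the smallest and largest ratio
```

Because the geometric mean rewards balanced gains, a compiler that speeds up even a few component tests disproportionately can shift the composite, which is exactly the kind of tuning the skeptics alleged.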
Hours after Apple posted the G5 SPEC CPU 2000 benchmark numbers on its Web site, skeptics flooded Internet forums questioning Apple's test methodology.
Some claimed that Apple used "tweaks" to gain the upper hand, such as choosing compilers that favored the G5 and turning off Hyper-Threading to depress the Xeon numbers, among other allegations. Slashdot and ExtremeTech both address the issues superbly.
Apple's response from Vice President Greg Joswiak, posted on Slashdot, assures the public that Apple's goal with the tests was to provide a fair comparison with full disclosure of testing parameters and methodologies.
This latest test-result brouhaha is more proof (if anyone needed it) that benchmarks don't tell the whole story, and that different benchmarks can be used to show that one platform is better than another for running a given application.
It would be very difficult to create benchmarks that are not biased toward any system design, but I found Apple's claims presumptuous because we have not yet seen performance numbers from 64-bit workstations built on Intel's Itanium 2 and AMD's Athlon 64. In the end, the benchmark results probably won't sway enterprise opinion: Mac enthusiasts will stay loyal to Apple, while Wintel professionals will likely stick with the x86 platform.
Well-informed IT managers and developers will base their purchasing decisions not on performance numbers alone, but also on a wide range of factors—including what each platform can offer in terms of application support, scalability and overall value.
Let the buyer beware, read the fine print, and don't be misled by glossy performance charts.
Do performance numbers tell the whole story? Share your thoughts at francis_chu@ziffdavis.com.