Some claimed that Apple used "tweaks" to get the upper hand, such as using more favorable compilers, turning off hyperthreading to depress Xeon numbers, and many more allegations. Slashdot and ExtremeTech both address the issues superbly. Apple's response from Vice President Greg Joswiak, posted on Slashdot, assures the public that Apple's goal with the tests is to provide a fair comparison with full disclosure of testing parameters and methodologies.

It would be very difficult to create benchmarks that are not biased toward any system design, but I found Apple's claims presumptuous because we have not yet seen performance numbers from 64-bit workstations built on Intel's Itanium 2 and AMD's Athlon 64.

In the end, the benchmark results probably won't sway enterprise opinion: Mac enthusiasts will stay loyal to Apple, while Wintel professionals will likely stick with the x86 platform. Well-informed IT managers and developers will base their purchasing decisions not on performance numbers alone, but also on a wide range of factors, including what each platform can offer in terms of application support, scalability and overall value. Let the buyer beware, read the fine print, and don't be misled by glossy performance charts.

Do performance numbers tell the whole story? Share your thoughts at firstname.lastname@example.org.
This latest test-result brouhaha is more proof (if anyone needed it) that benchmarks don't tell the whole story, and that different benchmarks can be used to show that one platform is better than another for running a given application.