While the HPCC benchmark is more evolved, the Top500's Strohmaier said that the Application Performance Characterization-Memory Access Probe (APEX-MAP) test is set to play a larger role. The single test measures memory access performance, both random and regular access, most often the limiting factor in supercomputer clusters as each node tries to access memory across the shared interface. The test was developed by Strohmaier and Hongzhang Shan, also a researcher at LBNL.

"We're trying to emulate the performance of different applications," Strohmaier said. "It's been a research project, but it's pretty much at the stage that we want to put it out there." The two researchers have begun to collect results and post them on the project's Web page.

The real test, of course, is whether the industry will adopt the new benchmarks. "The big question is that we've called the party; now can we get anyone to come?" LBNL's Bailey said. "If the computer vendors see flaws in the benchmark, or if it's too much work, the whole HPC benchmarking activity may just die right now."
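APEX-MAP itself parameterizes spatial and temporal locality in a more sophisticated way, but the core idea that random versus regular access dominates performance can be illustrated with a minimal sketch. The following Python snippet (not APEX-MAP's actual code, just a hypothetical illustration) times the same array traversed in order and in shuffled order:

```python
import random
import time

def access_time(indices, data):
    """Time one pass over `data` in the order given by `indices`."""
    start = time.perf_counter()
    total = 0
    for i in indices:
        total += data[i]          # each iteration touches one element
    return time.perf_counter() - start, total

N = 1_000_000
data = list(range(N))

seq = list(range(N))              # regular (sequential) access pattern
rnd = seq[:]
random.shuffle(rnd)               # random access pattern defeats prefetching

t_seq, _ = access_time(seq, data)
t_rnd, _ = access_time(rnd, data)
print(f"sequential: {t_seq:.3f}s  random: {t_rnd:.3f}s")
```

In a compiled language the gap between the two traversals is far larger, since hardware caches and prefetchers reward regular access; Python's interpreter overhead mutes the effect, but the measurement structure is the same.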
But scientists said that while industry-standard benchmarks are useful tools, each agency relies on its own collection of diagnostics to evaluate vendors. If the HPCC and other benchmarks become more widely accepted, the agencies may reduce, but not eliminate, their own tests, Bailey said. In addition to other tests, NASA's Brooks said his agency uses a parallelization routine based on actual computational fluid dynamics research, or "ocean code."