Benchmarks Wanted

By Timothy Dyck | Posted 2003-05-12

Dyck: Full disclosure is the single most important component of a good benchmark.

"Men wanted for hazardous journey. Low wages, bitter cold, long hours of complete darkness. Safe return doubtful. Honour and recognition in event of success."

That famous advertisement for Antarctic exploration is attributed to Ernest Shackleton, but it could just as well appeal to those embarking on a benchmarking program. I've spent my share of hours in cold, dim server rooms trying to warm my hands over the flicker of drive array LEDs. Benchmarking is very hard but, if successful, worthwhile. That's why we do it, and it's why I'm glad to see a significant effort righting itself.

Java training and consultancy company The Middleware Company first set sail for parts unknown in October when it published the results of a benchmark comparing two J2EE (Java 2 Platform, Enterprise Edition) application servers with Microsoft's .Net 1.0 and .Net 1.1 release candidate run-times. When the results—with the .Net platform on top—came out, an uproar in the Java community ensued. The fact that Microsoft had funded the test didn't help.

Now The Middleware Company has set out on a second effort designed to be more rigorous, accurate and fair than the first, which did many things right but also had notable problems.

Two problems stood out. First, the .Net version did more data caching than the J2EE version. Second, the J2EE version was based on an old code base that still used bean-managed persistence rather than the more modern container-managed persistence.
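To make the second problem concrete, here is a minimal sketch of the two EJB 2.x persistence styles in question. The Item bean, its fields and the item table are hypothetical illustrations, not code from the benchmark itself.

    import javax.ejb.EJBException;
    import javax.ejb.EntityBean;
    import javax.ejb.EntityContext;
    import javax.sql.DataSource;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Bean-managed persistence (BMP), the style of the old code base:
    // the bean hand-writes JDBC for every load, so the container gets
    // no chance to optimize, batch or cache the database access.
    abstract class ItemBmpBean implements EntityBean {
        protected EntityContext ctx; // set in setEntityContext(), omitted here
        protected String itemId;
        protected double price;

        public void ejbLoad() {
            itemId = (String) ctx.getPrimaryKey();
            try {
                Connection con = getDataSource().getConnection();
                try {
                    PreparedStatement ps = con.prepareStatement(
                            "SELECT price FROM item WHERE item_id = ?");
                    ps.setString(1, itemId);
                    ResultSet rs = ps.executeQuery();
                    if (rs.next()) {
                        price = rs.getDouble(1);
                    }
                } finally {
                    con.close();
                }
            } catch (Exception e) {
                throw new EJBException(e);
            }
        }

        // ejbStore() and the remaining lifecycle callbacks are omitted.
        protected abstract DataSource getDataSource();
    }

    // Container-managed persistence (CMP 2.0), the more modern style:
    // the bean declares abstract accessors, the deployment descriptor
    // maps them to columns, and the container generates the SQL.
    abstract class ItemCmpBean implements EntityBean {
        public abstract String getItemId();
        public abstract void setItemId(String id);
        public abstract double getPrice();
        public abstract void setPrice(double price);
        // No JDBC here; persistence logic lives in the container.
    }

Whether the bean or the container owns the SQL is exactly the kind of moving part that determines how much caching and batching an application server can legitimately do on the J2EE side.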

Last month, Salil Deshpande, president of The Middleware Company, contacted me with the specifics of his company's retest plan, which fixes both these technical problems and makes two major procedural improvements as well.

First, this round is a collaborative process that incorporates feedback from a group of outside experts who have been working together on this since February. Second, the benchmark is now based on a formal specification that defines exactly how the test application needs to behave.

"We created an expert group in February and invited these experts to review a functional spec," said Deshpande. "We started with [Suns sample application] Pet Store but nailed down enough things—database tables, relationships, what data can be cached, for how long and where, what pages can look like. We fixed all the moving parts."

Timothy Dyck is a Senior Analyst with eWEEK Labs. He has been testing and reviewing application server, database and middleware products and technologies for eWEEK since 1996. Prior to joining eWEEK, he worked at the LAN and WAN network operations center for a large telecommunications firm, in operating systems and development tools technical marketing for a large software company and in the IT department at a government agency. He has an honors bachelor of mathematics degree in computer science from the University of Waterloo in Waterloo, Ontario, Canada, and a master of arts degree in journalism from the University of Western Ontario in London, Ontario, Canada.