Standard benchmarks are one important way customers make buying decisions, and the J2EE market, in which many vendors implement the same specification, is an ideal level playing field for benchmarking. After all, where features are comparable, we need ways to discover which implementations are best.
The Transaction Processing Performance Council has TPC-W, a transactional Web e-commerce benchmark. It's an online bookstore Web application, and like other TPC benchmarks, it has been carefully designed. Although the TPC-W specification is precise in its functional requirements, it allows a lot of freedom in the choice of technologies used in tests. Indeed, the specification even states (in Clause 1.2.10): “The use of commercially available products is encouraged” for the application program layer.
Yet every one of the 14 posted TPC-W results (from several vendors, not just Microsoft Corp.) uses what I'd call a benchmark special: Web application logic written in C++ using Internet Server API (ISAPI) DLLs. HTML is even hard-coded into C++ printf statements.
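To see why that disqualifies these results as application server tests, consider what such a benchmark special looks like. The sketch below is my own illustration, not code from any posted TPC-W kit; the handler and the data it prints are hypothetical. It's an ISAPI extension whose "page" is nothing but a C format string:

    // Illustrative only -- not from any posted TPC-W result kit.
    // A minimal ISAPI extension that serves a hard-coded HTML page.
    #include <windows.h>
    #include <httpext.h>
    #include <cstdio>

    BOOL WINAPI GetExtensionVersion(HSE_VERSION_INFO* pVer)
    {
        pVer->dwExtensionVersion = MAKELONG(HSE_VERSION_MINOR, HSE_VERSION_MAJOR);
        lstrcpynA(pVer->lpszExtensionDesc, "Hard-coded storefront page",
                  HSE_MAX_EXT_DLL_NAME_LEN);
        return TRUE;
    }

    DWORD WINAPI HttpExtensionProc(LPEXTENSION_CONTROL_BLOCK pECB)
    {
        // Send the status line and response headers.
        char headers[] = "Content-Type: text/html\r\n\r\n";
        pECB->ServerSupportFunction(pECB->ConnID, HSE_REQ_SEND_RESPONSE_HEADER,
                                    (LPVOID)"200 OK", NULL, (LPDWORD)headers);

        // The HTML "template" lives in a printf-style format string;
        // no servlet, JSP, or ASP layer is involved. (A real kit would
        // fill in values fetched from the database.)
        char page[512];
        int n = sprintf(page,
            "<html><body><h1>Best Sellers</h1>"
            "<p>Item %d: %s</p></body></html>",
            42, "Sample Book");

        DWORD len = (DWORD)n;
        pECB->WriteClient(pECB->ConnID, page, &len, 0);
        return HSE_STATUS_SUCCESS;
    }

Code like this is fast precisely because it bypasses the entire application tier, which is why results built on it say a lot about the hardware and the database and nothing about any application server.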
This practice makes TPC-W a fine hardware and database benchmark but a useless application server benchmark. There's no reason it couldn't be a good one if vendors would only have the guts to use their own application server products in their TPC-W benchmarks (which is how they would advise a real customer to deploy a Web storefront).
Standard Performance Evaluation Corp.'s application server benchmark is SPECjAppServer, a renamed, updated version of the ECperf benchmark originally developed under the Java Community Process. SPECjAppServer is also a good benchmark, but it is specific to J2EE (Java 2 Platform, Enterprise Edition) and much narrower in focus than TPC-W because vendors must use the exact code SPEC provides. That makes for an excellent J2EE apples-to-apples test when application server vendors use comparable hardware, but it excludes non-J2EE vendors.
That leaves us with a number of one-off efforts, the most recent of which is a Microsoft .Net-versus-J2EE benchmark published by The Middleware Co. in October and based on the J2EE sample application Pet Store. (Visit www.middleware-company.com/j2eedotnetbench to download the code and configurations and to read the book-length message threads on the benchmark and responses from the company.)
Microsoft's .Net Framework came out faster in the test, and the message threads are filled with the usual personal attacks, as well as suggestions on how the Java code could have been made faster. (Two frequent suggestions are to use container-managed persistence instead of bean-managed persistence for the entity beans and to avoid Enterprise JavaBeans entirely.)
I think this was a very worthwhile benchmark project, and I applaud the effort and tuning care taken by Middleware staff. Sure, there are things that could be improved, and I hope a later benchmark will improve them. But this is still a respectable step forward, especially because J2EE vendors seem to be studiously avoiding benchmarks in which they can be compared with competing application server technologies.
Any benchmark (by anyone) for which the source code and configuration files are published is worthwhile in my view.