Use Caution With Vendor-Oriented Tools

 
 
By Jennifer Sutherland  |  Posted 2008-04-14
Vendor-Oriented Benchmarking

A number of vendor-oriented benchmarks are application- or component-specific. Customers can typically obtain this benchmark software to test the performance of specific components. Use caution with vendor-oriented tools, however: they may be biased toward areas where the vendor's products are strongest, and they may not support testing across a spectrum of hardware and software configurations. Examples of these benchmarks include EMC's iorate utility for testing EMC storage arrays, the Oracle Applications Standard Benchmark and Oracle's Orion.

 

Customer-Sponsored Benchmarking

Customer-sponsored benchmarking involves testing with a customer's own workload, or with an independent or vendor workload, at the customer's site. This yields the most relevant information, because the most accurate way to determine how a configuration will perform under a particular workload is to test with that workload in that environment. Such testing is typically done with commercial tools from vendors such as HP (Mercury Interactive) or Rational, or with open-source tools such as The Grinder.
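To give a concrete sense of what workload-driven testing measures, here is a minimal sketch in Python (not from the article; `sample_workload` is a hypothetical stand-in for replaying real customer transactions against the system under test). It runs a workload repeatedly and reports basic latency statistics, the kind of numbers a load-testing tool would aggregate:

```python
import time
import statistics

def run_benchmark(workload, iterations=50):
    """Run a workload repeatedly and collect per-iteration latencies in seconds."""
    latencies = []
    for _ in range(iterations):
        start = time.perf_counter()
        workload()
        latencies.append(time.perf_counter() - start)
    latencies.sort()
    return {
        "iterations": iterations,
        "mean": statistics.mean(latencies),
        "median": statistics.median(latencies),
        # 95th-percentile latency: a common service-level metric
        "p95": latencies[int(0.95 * (iterations - 1))],
    }

# Hypothetical stand-in workload; a real customer-sponsored benchmark
# would drive actual transactions instead of this CPU-bound loop.
def sample_workload():
    sum(i * i for i in range(10_000))

results = run_benchmark(sample_workload)
print(results)
```

Real tools add load generation from many concurrent users, think times and pacing, but the core idea is the same: drive the representative workload and measure the latency distribution, not just an average.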

No single benchmark or metric can be used for all systems or applications. A computer system is typically designed for one or more primary uses and may perform poorly at other tasks. For example, the scientific community has developed benchmarks that measure system performance on numeric computations, and these are not suitable for evaluating business applications or database systems. Business and database software is typically dominated by the performance of software algorithms rather than by raw hardware speed.
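The point that algorithms can dominate raw hardware speed is easy to demonstrate. This illustrative Python sketch (not from the article) counts the comparisons needed to find the last element of a million-item sorted list by linear scan versus binary search; no hardware upgrade closes a gap this large:

```python
def linear_search_steps(data, target):
    """Count comparisons a linear scan makes before finding target."""
    for steps, value in enumerate(data, start=1):
        if value == target:
            return steps
    return len(data)

def binary_search_steps(data, target):
    """Count comparisons a binary search makes on sorted data."""
    lo, hi, steps = 0, len(data) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            return steps
        elif data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1_000_000))
target = 999_999
print(linear_search_steps(data, target))  # 1,000,000 comparisons
print(binary_search_steps(data, target))  # roughly 20 comparisons
```

The binary search needs on the order of log2(1,000,000), about 20, comparisons, while the linear scan needs a million. A faster CPU speeds both up proportionally; choosing the better algorithm changes the result by a factor of fifty thousand.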

Benchmarking is a key step to understanding the trade-off between cost and level of service. It is a core competency of Computer Measurement Group members (note that the M stands for Measurement).

 Benchmarking 101 was written by Jennifer Sutherland and James Yaple. Ms. Sutherland's research was done while she worked as a capacity planner for the Wisconsin Department of Health and Family Services. She can be reached at jennifere.sutherland@wisconsin.gov.

 Mr. Yaple is the Chief Technology Officer for Corporate Data Center Operations (CDCO) of the U.S. Department of Veterans Affairs. He can be reached at James.Yaple@va.gov.

Benchmarking 101 is based on a paper written by a member of The Computer Measurement Group (CMG), a not-for-profit, worldwide organization of performance and capacity management professionals committed to ensuring the quality of IT service delivery to the business. These individuals publish and present more than 100 papers a year on this and similar topics, all devoted to measuring, analyzing and predicting computer performance. The complete paper is available in the CMG repository under Benchmarking 101.

Recently, papers from past CMG conferences have been made available to the public. These include papers on platform and application measurement and management, from distributed systems to mainframes, as well as on specific technologies such as server virtualization, Java application servers, emerging server and storage technologies, and operating systems including zSeries, Unix, Linux and Windows.

The opinions and views expressed in this article are solely those of the reviewer/writer and do not necessarily represent the opinions and views of the U.S. Department of Veterans Affairs or of the State of Wisconsin.



 
 
 
 