Performance testing should be objective, but when vested interests design the setting, beware of latent bias. eWeek Labs spent an extra week testing Cisco Systems Inc.'s Catalyst 2950T-24 and 3550-12T switches to figure out why our performance tests showed slightly different results (more packet loss across a wider range of packet sizes) than those obtained at Cisco's labs. Using the same equipment, the two labs were getting different numbers.
In the end, it turned out that we were doing a more rigorous test, one that we believe is more reflective of the real world. To run our Layer 2 tests, we used Spirent Communications Inc.'s SmartBits 6000B performance analysis system, equipped with 12 Gigabit-over-copper ports, in conjunction with the company's AstII traffic generation and measurement software.
Using a "full-mesh" testone with all ports talking to all other portswe found a very small but measurable amount of packet loss.
Although Cisco advised us before testing that, because of engineering design decisions, the Catalyst 3550-12T forwarded 64-byte packets (the smallest valid IP packet size) at around 93 percent of wire speed, we also found that bigger packets faced a tiny bit of trouble. We emphasize tiny because although our full-mesh tests showed some packet loss at the upper end of the scale (packet sizes greater than 1,400 bytes), all loss was less than 0.75 percent.
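For context on what "93 percent of wire speed" means here, this back-of-the-envelope calculation (our own arithmetic, not a figure from Cisco or Spirent) derives the theoretical maximum frame rate for 64-byte frames on Gigabit Ethernet. Each frame also occupies an 8-byte preamble and a 12-byte inter-frame gap on the wire:

```python
LINE_RATE_BPS = 1_000_000_000  # Gigabit Ethernet line rate
PREAMBLE_BYTES = 8             # preamble + start-of-frame delimiter
INTERFRAME_GAP_BYTES = 12      # mandatory gap between frames

def wire_speed_fps(frame_bytes, line_rate_bps=LINE_RATE_BPS):
    """Theoretical maximum frames per second at a given frame size."""
    bits_on_wire = (frame_bytes + PREAMBLE_BYTES + INTERFRAME_GAP_BYTES) * 8
    return line_rate_bps / bits_on_wire

max_fps = wire_speed_fps(64)    # ~1,488,095 frames/sec at wire speed
at_93_pct = 0.93 * max_fps      # ~1,383,929 frames/sec actually forwarded
print(round(max_fps), round(at_93_pct))
```

In other words, at 93 percent of wire speed the switch is still forwarding roughly 1.38 million 64-byte frames per second per gigabit port.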
Using Spirent's AstII, we set up static port-pair tests, where port 1 sent and received only from port 2, port 3 from port 4, and so forth. These tests produced wire-speed results for all packet sizes except 64 bytes, which jibed with Cisco's results.
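The static pairing above can be sketched the same way (again, our own illustration): each odd-numbered port exchanges traffic only with its even-numbered neighbor, so every flow crosses the switch in isolation rather than contending with traffic to every other port:

```python
def port_pair_flows(num_ports):
    """Return bidirectional (source, destination) pairs for adjacent ports:
    1 <-> 2, 3 <-> 4, and so on."""
    flows = []
    for p in range(1, num_ports + 1, 2):
        flows.append((p, p + 1))
        flows.append((p + 1, p))
    return flows

# A 12-port switch carries only 12 flows under static pairing,
# versus 132 under a full mesh.
print(port_pair_flows(12))
```

The contrast with the full-mesh count explains why the two labs saw different numbers: the static-pair configuration simply asks far less of the switch fabric.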
Although the static-port tests that Cisco conducted are a valid measure of performance, those test conditions are unlikely to be seen in a real networking environment.
Cameron Sturdevant is the executive editor of Enterprise Networking Planet. Prior to ENP, Cameron was a technical analyst at PCWeek Labs, starting in 1997, and finished up as the eWEEK Labs Technical Director in 2012. Before his extensive labs tenure, Cameron paid his IT dues working in technical support and sales engineering at a software publishing firm. Cameron also spent two years with a database development firm, integrating applications with mainframe legacy programs. Cameron's areas of expertise include virtual and physical IT infrastructure, cloud computing, enterprise networking and mobility. In addition to reviews, Cameron has covered monolithic enterprise management systems throughout their lifecycles, providing the eWEEK reader with all-important history and context. Cameron takes special care in cultivating his IT manager contacts, to ensure that his analysis is grounded in real-world concerns. Follow Cameron on Twitter at csturdevant, or reach him by email at firstname.lastname@example.org.