eWEEK Labs iSCSI vs. Fibre Channel Test Methodology

eWEEK Labs recently ran head-to-head benchmark tests comparing Fibre Channel to iSCSI, to see how the latter is shaping up in terms of serving as a data transport for SANs.

E-mail is an important market for iSCSI, and it is also an application whose performance depends on high-quality storage. For this reason, we chose to focus our efforts on detecting performance differences between iSCSI and Fibre Channel storage systems when hosting e-mail storage.

Click here to read eWEEK Labs' analysis of its iSCSI and Fibre Channel tests.

We used Microsoft LoadSim 2003, a well-known platform for Exchange server testing and one that is fairly easy to configure. Instead of tweaking our benchmark setup for maximum performance, we configured the storage units and our servers under test with production-level settings.

To get apples-to-apples results, we used Dell/EMC AX100 and AX100i units that were identical in every way except for their storage connectivity. (The AX100 had 2Gbps Fibre Channel ports, while the AX100i had iSCSI ports.)
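As a rough back-of-the-envelope illustration of the connectivity difference (our own sketch, not part of the test results), the theoretical payload bandwidth of the two links can be compared; the overhead figure for iSCSI is an assumption, not a measurement:

```python
# Rough theoretical payload-bandwidth comparison: 2Gbps Fibre Channel
# vs. iSCSI over Gigabit Ethernet. Overhead figures are illustrative
# assumptions, not measured values.

def fc_payload_mbps(line_rate_gbps=2.0):
    # 2Gb Fibre Channel uses 8b/10b encoding: 10 line bits carry one
    # data byte, so 2Gbps on the wire yields about 200MB/sec of payload.
    return line_rate_gbps * 1e9 / 10 / 1e6

def iscsi_payload_mbps(data_rate_gbps=1.0, overhead=0.10):
    # Gigabit Ethernet's quoted 1Gbps is the data rate; assume roughly
    # 10 percent lost to TCP/IP and iSCSI headers (an assumption).
    return data_rate_gbps * 1e9 / 8 * (1 - overhead) / 1e6

print(f"2Gb Fibre Channel: ~{fc_payload_mbps():.0f} MB/sec")
print(f"1GbE iSCSI:        ~{iscsi_payload_mbps():.0f} MB/sec")
```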

The storage units both came loaded with 12 250GB SATA hard drives and had two controllers each for performance and redundancy.

We ran EMC's SAFE (Storage Administrator for Exchange) to optimize the storage we used for our mail stores and transaction files. SAFE configured each of our storage arrays with a six-drive disk pool and a five-drive disk pool, each running RAID 5 for redundancy.

We could have gotten better performance by going with RAID 0 and adding the hot-spare drive to the working set, but that scenario would have left us with a single point of failure at the drive level, and it's one that would never be used in a real-world setting.
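The capacity cost of that redundancy is easy to work out from the layout above. A minimal sketch, using the drive counts SAFE chose (the helper function is ours, for illustration):

```python
# Usable capacity of the SAFE-configured layout: a six-drive and a
# five-drive RAID 5 pool, plus one hot spare, from twelve 250GB drives.

DRIVE_GB = 250

def raid5_usable_gb(drives, drive_gb=DRIVE_GB):
    # RAID 5 stripes one drive's worth of parity across the set,
    # so usable space is (n - 1) drives.
    return (drives - 1) * drive_gb

pools = [6, 5]                                   # as configured by SAFE
usable = sum(raid5_usable_gb(n) for n in pools)  # hot spare holds no data
raw = 12 * DRIVE_GB
print(f"usable: {usable} GB of {raw} GB raw")
```

Going to RAID 0 across all 12 drives would have exposed the full 3,000GB raw capacity, which is why it tempts on paper but fails the single-drive-failure test.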

We also ran Microsoft's Exchange Server Best Practices Analyzer to ensure that our Exchange implementation was configured cleanly. Our SUT, or server under test, was a Dell PowerEdge 2850 with twin 3.6GHz Xeon processors and 4GB of RAM. The SUT came with two integrated Gigabit Ethernet ports. We configured the first port to communicate on our Exchange domain with our domain controller and the client load generator.

The second Gigabit Ethernet adapter acted as our iSCSI HBA, and we configured that port to connect to our iSCSI subnet with the AX100i storage unit.

For iSCSI connectivity, we used Microsoft's iSCSI Initiator 2.0 software.

On the Fibre Channel side, we installed twin QLogic QLA200 FC2 host bus adapters in our SUT, which communicated with our AX100 unit through a Brocade SilkWorm 3250 eight-port Fibre Channel switch. We ran the enterprise editions of Windows Server 2003 and Exchange Server 2003 on our SUT.


When running our tests, we wanted to make sure that the SUT was not a bottleneck, so we used PERFMON (Microsoft's Performance Monitor) to keep tabs on memory usage, CPU utilization and a number of other Exchange-specific performance metrics.
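The bottleneck check amounts to screening the exported counter logs for headroom. A hypothetical sketch of that post-processing step, assuming the PERFMON data has been exported to CSV (the column name and the 90 percent ceiling are our own assumptions, not part of the test spec):

```python
import csv
import io

# Hypothetical sketch: screen a PERFMON counter log (exported to CSV)
# to confirm the server under test never became a bottleneck. The
# column name and 90 percent threshold are illustrative assumptions.

SAMPLE_LOG = """timestamp,% Processor Time
08:00,41.2
09:00,55.0
10:00,48.7
"""

def cpu_headroom_ok(csv_text, threshold=90.0):
    # True if every sample stayed below the ceiling.
    rows = csv.DictReader(io.StringIO(csv_text))
    return all(float(row["% Processor Time"]) < threshold for row in rows)

print(cpu_headroom_ok(SAMPLE_LOG))
```

The same pattern extends to memory and the Exchange-specific counters: one pass per counter, flag any run where a sample breaches the ceiling.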

We used a Hewlett-Packard ProLiant DL360 with four 2.0GHz Xeon processors and 1GB of RAM to run LoadSim 2003 and generate client load. As with the SUT, we made sure to keep track of PERFMON performance logs during the tests to make sure that the HP server was not a bottleneck.

The LoadSim 2003 test tool connects with the SUT to create virtual Exchange inboxes, mail messages and distribution lists. When running our tests, we used the MMB3 (MAPI Messaging Benchmark 3) work profile, which simulates typical user activities throughout a work day.

Each of our LoadSim runs simulated an 8-hour work day, and we removed the first and last hours when tallying our results. (We did this to get steady state results, outside of ramp-up and ramp-down.)
