Lies, Damned Lies and Research Projects: Microsoft & Linux
People love winners. That's why we love research reports: They give us winners. Moreover, that's why companies will lie, cheat and steal to win in research reports. And when they can't do that, they'll hire someone to write reports in which they win.
Lately, Microsoft Corp. has taken to this last course like a duck to water with a report from Forrester Research claiming that it's cheaper to build enterprise applications with Windows and .Net than it is with Linux and J2EE. Who would have thought that a report paid for by Microsoft would say anything else?
Forrester, realizing that this kind of "research" project was doing its credibility no favors, has since backed away from vendor-sponsored studies.
That hasn't stopped Microsoft from continuing on its course of commissioning favorable comparisons. Its latest, conducted by Veritest, pits Windows Small Business Server (SBS) 2003 against Red Hat Enterprise Linux (RHEL) ES 2.1.
Most people out there will only read the headlines and think that Windows has been proven better than Linux for enterprise program development or small offices. Anyone who pays closer attention knows better. If you have to pay someone to say that your product is better than the other guy's, doesn't that mean something?
Microsoft (and many other companies that sponsor this kind of research) will say they didn't dictate the results or direct the research to reach a given goal. They will say they merely supplied the money to accomplish a study. Nonsense!
This is not to say that all research is crooked. It's not. Most magazine and other independent research is done with a desire to uncover the facts, nothing less and nothing more.
I'm also not saying that if Microsoft went in and demanded a study showing that Windows for Pizza 1.0 delivered 12 percent more pepperoni per pie that it would get a report saying it had 12 percent more pepperoni. I am saying that when vendors sponsor a report, they always get to put a thumb on the scales. It may be subtle; it may not. But the thumb is usually there.
Believe it or not, vendors do run competitive benchmarks and allow the results to speak for themselves. They want to know that when they say, "We deliver pizza and CPUs faster than Joe's Pizza and PCs," there's a reasonable chance that no one will laugh in their faces.
However, when they find out that their server can't deliver the pizzas and Pentiums faster than Joe's, you will never see those results. Paid-for, published results are always, always good for the company that bought the research. Now, sometimes the results you see out of sponsored research are fair, but in my experience, the good results usually come from apples-and-oranges comparisons or tweaked testing.
Let's start at the top: SBS 2003 is designed to be a one-server operation for small businesses with no more than 50 users. Microsoft chose to compare it with the entry-level departmental server RHEL ES. Both come loaded with additional applications, but SBS is set up for fast back-office deployment by novice administrators of small groups. RHEL ES delivers much the same goods, but it is more suitable for experienced administrators and can easily be used to build larger networks.
SBS 2003 does have a better upgrade path than its SBS predecessors, but RHEL ES doesn't need an upgrade path: It's already able to handle bigger loads. Looking at nothing but these factors, a decisive issue could be whether the buyer thinks the company will grow during the four to five years it'll be running this server. Microsoft, however, focuses on ease of deployment, not on upgrade paths. Thus, from the start, Microsoft is comparing apples and oranges.
Now let's see if we can find a tweak ... Here's a good one: One of the requirements on which the operating systems are judged is configuring "an external hardware firewall/router device providing Internet connectivity." Windows users might think this comparison is fair, since Server 2003 doesn't have a decent internal firewall. Linux users know that Red Hat (like most Linux distributions) already has excellent internal firewall functionality. Here's a good example of the report being tweaked in SBS's favor. But how many people know both operating systems well enough to understand that? Answer: Not many.
In this report, Microsoft (via Veritest) chooses to focus on the surface. The whole report is about how easy it is to install things. There's no mention of how well these things actually work, and that's always a serious concern. Moreover, SBS is meant for small shops with little in the way of IT. In addition, SBS 2003 was in beta when these comparisons were made, and major components (including SharePoint and Exchange 2003) were also in beta. By comparison, RHEL ES is tried-and-true, while its comparable components, PHP-Nuke and Sendmail, are battle-tested by years of use.
SBS 2003 may indeed be easier for a new administrator ... so long as there's no trouble. If there are problems (and which new programs don't have problems?), there's a lot more support already available for the Linux/open-source products.
Another problem with reports like this: The buyer gets to choose what points get played up. For example, the report praises RHEL: "The Red Hat Enterprise Linux ES 2.1 deployments did have two advantages compared to Windows SBS 2003. First, the Linux operating system and utilities, as well as third-party applications, include source code through the open-source development model. This provides support for customizing and tailoring the operating system to meet specific deployment needs. Windows SBS 2003 does not include source code. Second, even though downloading and installing third-party applications for the Red Hat Enterprise Linux ES 2.1 deployments required additional time and steps, it also provided deployment flexibility not found in Windows SBS 2003."
Sounds good, doesn't it? But those lines are buried at the bottom of Page Four of the executive summary. For many administrators, those comments alone make RHEL more compelling than SBS 2003, but they're not the ones most readers will see.
I could go on, but you get the point. The report is fatally flawed for any kind of objective comparison of the two packages.
This issue isn't specific to Microsoft, though. Many companies do it. A while back, I looked at a J2EE benchmark fight between Microsoft and Oracle. Along the way, I found that BEA and Macromedia had run similar tests. Guess what? All four companies' researchers declared that their own company was victorious.
Need I say more?
eWEEK.com Linux & Open Source Center Editor Steven J. Vaughan-Nichols has been using and writing about Unix and Linux since the late '80s and thinks he may just have learned something about them along the way.