Where's the Methodology?

By Larry Seltzer  |  Posted 2008-08-16

The last four of the story's 10 pages are a review of security suites, covering both paid products and a "free suite" which, as I have already explained, no longer exists. In any good review, at least one that makes quantitative claims such as "Performance," you should expect a statement describing the methodology. None is included in the CR review. I was told by a vendor who received the methodology after the fact that it had to sign a non-disclosure agreement in order to see it. The rest of us have to make do with the "Guide to the Ratings" statement below the ratings chart.

The actual ratings of the security suites put BitDefender on top, which is plausible on its face. But BitDefender is the only product to receive an Excellent score for Antivirus Performance, which doesn't seem right to me (the scale runs Excellent, Very Good, Good, Fair and Poor); most of the other big-name products get Very Good. One would think that "Antivirus Performance" would reflect the quality of antivirus protection, and perhaps something about system performance while the product is running, but what does CR mean by it? They say: "Antivirus performance mainly reflects detection of current and new viruses, with features, ease of use, and scan speed also considered."

That's a lot of important factors to lump into one score, especially since anyone who doesn't read the Guide to the Ratings will assume it measures malware detection. For the most part the review gives only the barest details of the actual malware testing, although at one point they describe how they take old malware, make small modifications to it, and check whether the modified versions are detected. This seems interesting to me, but only in an academic sense; a real test focuses on real malware. They say they do test some, but they don't say how much, and they don't say how much each product detected. Here's where I'll jump to Symantec's defense: I have seen many large tests of real-world malware samples, and Symantec's engines are always among the very best. Did its "performance" suffer in CR's review because of ease of use? Only CR knows for sure. Other performance ratings, for anti-spam for example, go similarly unexplained.
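
To see why this modified-sample approach mostly probes detection beyond exact signature matching, here is a minimal, hypothetical sketch in Python (the "signature database" and sample bytes are invented for illustration, not anything CR or any vendor actually uses): a naive hash-based detector catches a known sample but misses it after a one-byte change, so any detection of a modified sample has to come from heuristics or behavioral analysis.

    import hashlib

    # Hypothetical signature "database": SHA-256 hashes of known samples.
    # Real antivirus engines layer heuristics and behavioral analysis on
    # top of exact matching, which is what a modified-sample test exercises.
    KNOWN_SIGNATURES = {
        # SHA-256 of our stand-in "known sample" (the empty byte string)
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def naive_hash_detector(data: bytes) -> bool:
        # Flag a file only if its exact SHA-256 hash is in the database.
        return hashlib.sha256(data).hexdigest() in KNOWN_SIGNATURES

    original = b""                 # stand-in for a known sample
    modified = original + b"\x00"  # a trivial one-byte modification

    print(naive_hash_detector(original))  # True: exact signature match
    print(naive_hash_detector(modified))  # False: hash no longer matches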

I asked Consumer Reports for comment, although only just recently. If they decide to respond to this column, I will add their response to it and note the update in a blog.

Congrats to BitDefender, generally recognized as a good product, although in my last personal experience with it I lost patience over a false positive. It's hard to test security products on a limited schedule with a limited budget. Some people are better at it than others; some are much worse.

Consumer Reports responded after this story was published. This statement is from Jeff Fox, Technology editor, Consumer Reports:

At Consumer Reports, we have always believed that scientific testing is the best way to evaluate products. We also use a statistically-valid survey methodology to measure consumer experiences. In preparing our September security reports, we employed both methods as we have for many decades. Some additional notes on this column:
  • The story was not, as you state, "filled with data sourced to eMarketer." That service provided just two pieces of data, namely the current number of Internet- and broadband-using U.S. households.
  • Using a separate credit card for online transactions avoids having to cancel your main card should fraud occur.
  • We test software against modified versions of actual malware because such threats are what security software will often be called upon to recognize on the job.
Finally, a note about your claim that Consumer Reports was invited to respond. Your e-mail to us requesting a comment was time-stamped on the same Saturday evening as your column is labeled as having posted. That left fewer than six hours to respond, on a weekend. It would have been helpful to have had more time.
It's true, as I said in the column, that I didn't give them much time to respond. I hope I can make up for that somewhat by publishing this response now and including it in the column itself.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983. For insights on security coverage around the Web, take a look at his eWEEK.com blog, Cheap Hack.


Larry Seltzer has been writing software for and English about computers ever since (much to his own amazement) he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at Desktop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs), developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.
