I don’t do a lot of reviews anymore, but I spent about 13 years in the reviews business, testing a wide variety of products. A badly done, badly thought-out review hits me like fingernails on a blackboard. So it is with the recent Consumer Reports story on computer security and the accompanying review of computer security software (the full story is available only to subscribers).
[Editor’s Note: Consumer Reports responded to this story after it was published. Please see the end of the story for the full text of their response.]
It’s normally bad form to dump on another publication’s work in the same field, but this particular story really bothers me. I knew about it beforehand, and I must admit that I was spurred into writing this story by a blog posting from Symantec’s David Cole complaining about the review. I’ve had my share of bad experiences with Symantec products, so I’m not inclined to give the company free PR, but Cole’s points are quite valid.
Conveying security information to lay people is a tricky business, much harder than with most other technology issues. Those lay people will be inclined to trust Consumer Reports, which has a sterling reputation, whether it deserves that trust or not. Bad information coming from such a trusted source thus becomes doubly bad, and it ends up making things hard for everyone, even those of you in IT, as those lay people bring their false impressions with them to work.
To the review: The first, most ridiculous problem with the review is timing. It appears in the September issue of CR, which necessarily comes out in early August and for which the testing was probably finished by early July, if not earlier. Because of this schedule, CR reports on the 2008 editions of the security suites. But the new versions of the suites come out in the fall, in or around September; I’m scheduled to talk to two vendors this week about their impending 2009 editions. And since the entire industry has moved to a subscription model, testing old versions makes even less sense. On this same subject, the ratings page in the review includes one last finger in the reader’s eye: the “free suite” CR builds for comparison with the pay suites includes Avira Personal Edition Classic 7, which, a footnote adds, is “Discontinued; replaced by Free Antivirus 8, which claims enhancements.” For these reasons alone, the review is essentially useless out of the gate.
Where’s the Review?
The first several pages of the story are not a review but a feature on online threats and how you can protect yourself (“Protect Yourself Online. The Biggest Threats & The Best Solutions”). It’s filled with data sourced to eMarketer, a market research firm. The data in the story, including all of the claims about the prevalence of threats and so on, therefore come from self-reported survey results from the general public.
As an IT professional (as I assume most eWEEK readers are), ask yourself whether you would trust the average computer-using consumer to accurately report what security software features they are running and what threats they have suffered from. Me, I wouldn’t trust them, but CR does. Thus we learn that “…the rate of serious spyware problems has declined 54% and serious virus problems 32% over the years that we’ve tracked them.” The only malware terms the story uses are spyware and viruses, so I will assume that “virus” here also covers Trojan horse malware, the dominant form, and I’m pleasantly surprised to learn that it’s been on the decline for a while, as has spam, according to the story.
On the other hand, the story reports that 1 in 94 households had monetary losses from a phishing attack in the past two years. That sounds high to me. Mind you, another research report from eMarketer, dated August 6, claims that the online identity theft threat is overstated: “The actual risk of having your identity stolen online is not as high as many people think,” said Susan Menke, senior analyst at Mintel, in a statement. “Financial services companies are trying to reassure consumers, but their marketing messages aren’t sticking. Companies need to find innovative new ways to convince Americans that their identities are secure online and when using e-mail.” I’m confused; is it a big threat or not?
The pitfalls of having end users report on security create havoc all over the story. When the survey shows that “36 percent didn’t have an antispyware program, and 75% didn’t use an antiphishing toolbar,” I’m extremely suspicious. Many people who are running anti-virus software probably think they don’t have an antispyware program, yet they certainly have a great deal of antispyware protection; in fact, the distinction between anti-spyware and anti-virus has always been a phony one. And while neither IE7 nor Firefox 3 has an antiphishing toolbar as such, both have live antiphishing protection. Is CR saying that 75% of users are not using those versions? I suspect not. So are they saying it’s better to have it in a toolbar? Right. The only reasonable conclusion from this data is that CR’s authors and editors don’t understand how security software or browsers work these days.
The story also has numerous tips for users to avoid security problems and a list of seven blunders users make. Many of these are well thought out, but a few are overstated or just plain bewildering. Consider this advice: “Use a separate credit card just for your Internet shopping, as did 7 percent of respondents to our survey.” Why? What does this accomplish? Someone, please let me know.
Where’s the Methodology?
The last four of the story’s 10 pages are a review of security suites, including both pay suites and a “free suite” which, as I have already explained, no longer exists. In any good review, at least one that makes quantitative claims such as “Performance,” you should expect a statement describing the methodology. None is included in the CR review. I was told by a vendor who received the methodology after the fact that it had to sign a non-disclosure agreement in order to see it. The rest of us have to make do with the “Guide to the Ratings” statement below the ratings chart.
The actual ratings of the security suites put BitDefender on top, which is plausible on its face. But BitDefender is the only product to get an Excellent score for Antivirus Performance, which doesn’t seem right to me (the scores are Excellent, Very Good, Good, Fair and Poor); most of the other big-name products get Very Good. One would think that “Antivirus Performance” would reflect the quality of antivirus protection and perhaps something about system performance while the product is running, but what does CR mean by it? They say: “Antivirus performance mainly reflects detection of current and new viruses, with features, ease of use, and scan speed also considered.”
That’s a lot of important factors to lump into one score, especially since anyone who doesn’t read the Guide to the Ratings will assume it measures malware detection. For the most part the review gives only the barest details of the actual malware testing, although at one point they describe how they take old malware, make small modifications to it and see whether the modifications are detected. That seems interesting to me, but only in an academic sense. A real test focuses on real malware. They say they do test some, but they don’t say how much, and they don’t say how much each product detected. Here’s where I’ll jump in to Symantec’s defense, because I have seen many large tests against large samples of real-world malware, and Symantec’s engines are always among the very best. Did its “performance” suffer in CR’s review because of ease of use? Only CR knows for sure. Other performance ratings, for anti-spam for example, go similarly unexplained.
I asked Consumer Reports for comment, although only just recently. If they decide to comment on this column, I will add their comments to it and note the updates in a blog.
Congratulations to BitDefender, generally recognized as a good product, although in my last personal experience I lost patience with it over a false positive. It’s hard to test security products in a limited time period with a limited budget. Some people are better at it than others; some are much worse.
Consumer Reports responded after this story was published. This statement is from Jeff Fox, Technology editor, Consumer Reports:
“At Consumer Reports, we have always believed that scientific testing is the best way to evaluate products. We also use a statistically-valid survey methodology to measure consumer experiences. In preparing our September security reports, we employed both methods as we have for many decades. Some additional notes on this column:

- The story was not, as you state, “filled with data sourced to eMarketer.” That service provided just two pieces of data, namely the current number of Internet- and broadband-using U.S. households.
- Using a separate credit card for online transactions avoids having to cancel your main card should fraud occur.
- We test software against modified versions of actual malware because such threats are what security software will often be called upon to recognize on the job.
- Finally, a note about your claim that Consumer Reports was invited to respond. Your e-mail to us requesting a comment was time-stamped on the same Saturday evening as your column is labeled as having posted. That left fewer than six hours to respond, on a weekend. It would have been helpful to have had more time.”
It’s true, as I said in the column, that I didn’t give them much time to respond. I hope I can make up for that somewhat by putting their response out now and including it in the column itself.
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.
For insights on security coverage around the Web, take a look at eWEEK.com Security Center Editor Larry Seltzer’s blog Cheap Hack.