The Anti-Malware Certification Problem

 
 
By Larry Seltzer  |  Posted 2008-06-02

A dirty little secret of the anti-malware industry is that the WildList doesn't prove a lot about a product.

Failure to find all malware in the famous WildList can cause an anti-malware product to fail VB100 certification. Sometimes this is scandalous, as when Microsoft's OneCare failed WildList testing last year to widespread derision. But what does the WildList really prove?

In fact, insiders in the anti-virus industry, especially vendors, are widely derisive of the WildList, looking on it as a burden on their development: the malware in it is outdated and not representative of the true threats facing users.

Look at the actual malware on the latest (April 2008) WildList and you'll find an extraordinary amount of malware that was making headlines in 2004, back in the heyday of the mail worm. There's W32/BugBear.A-mm from 2002. Go all the way down to the bottom of the list and you'll find W95/Spaces.1445 from 2000. Yes, that's one of two Windows 95 viruses on the list.


And there's a pattern in this list: it's all self-replicating malware, viruses and worms. Research has shown for years that self-replicating malware is not the way people get infected anymore. Virtually all malware of consequence in the wild consists of Trojan horses that get installed through social engineering. This includes the Storm worm, the only family of malware in the last few years that one could call an outbreak. From what I can see, no variant of it is in the WildList.

Of course, any anti-malware product worth its salt would be working hard to keep up with Storm, and not just by following the thousands of variants. By now all decent anti-malware products have some level of heuristic detection to look more generically for the major malware families.
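The difference between the two approaches can be sketched in a few lines. This is a deliberately simplified toy, not how any real engine works: the signature set, indicator strings, weights and threshold below are all invented for illustration. Signature matching flags exact known byte patterns; a heuristic pass instead accumulates a suspicion score from generic traits shared across a malware family.

```python
# Toy contrast of signature vs. heuristic detection. Everything here
# (signatures, indicators, weights, threshold) is invented for illustration;
# real engines use far richer static and behavioral analysis.

KNOWN_SIGNATURES = {
    b"EICAR-STANDARD-ANTIVIRUS-TEST-FILE",  # the classic harmless test string
}

# Generic traits a heuristic scanner might score, each with a weight.
HEURISTIC_INDICATORS = {
    b"CreateRemoteThread": 2,   # process-injection API often abused by malware
    b"URLDownloadToFile": 2,    # downloader behavior
    b"SetWindowsHookEx": 1,     # keylogging-style hook
}

def scan(data: bytes, threshold: int = 3) -> str:
    # Signature pass: exact match against known malware.
    for sig in KNOWN_SIGNATURES:
        if sig in data:
            return "known-malware"
    # Heuristic pass: sum the weights of every indicator present.
    score = sum(w for ind, w in HEURISTIC_INDICATORS.items() if ind in data)
    return "suspicious" if score >= threshold else "clean"

print(scan(b"hello world"))                                    # clean
print(scan(b"...CreateRemoteThread...URLDownloadToFile..."))   # suspicious
```

The point of the sketch is that the heuristic branch can flag a sample it has never seen before, which is exactly the kind of work the WildList gives a product no credit for.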

Does the WildList give these products any credit for such work? Does the best heuristic product have an easier time with the WildList than a product with no heuristic capabilities at all? Not a chance. But what if that most advanced product fails to detect W95/Dupator.1503, a Windows 95 virus? It gets a black mark on its marketing record, one that probably precludes it from certain bids. It's nuts.

Few if any major anti-virus products are being certified for anything other than Windows XP and Vista, and these Windows 95 viruses almost certainly don't run on those platforms. (According to Symantec, only the 9x kernel Windows versions are affected.) Yet AV vendors need to spend time and bloat up their products detecting old stuff like this.

There are good anti-malware testing labs out there putting products through more serious paces. I'm a fan of Andreas Marx and his AV-Test, which tests dozens of products against huge sample sets, sometimes more than a million different samples.

It's not clear what a useful certification for anti-malware would look like. No product can be perfect; none of them is. Some products do clearly outperform the others in terms of detection, and AV-Test's numbers have borne this out over the years. But performance, even anti-malware performance, isn't everything. Symantec has excellent detection, consistently near the top of the tests every time Andreas runs them, but that doesn't stop it from bloating up your system to the point that computing just isn't worth it anymore. WebWasher is a multi-engine gateway product that is usually at the top of Andreas's lists, but it's so aggressive that it has a problem with false positives. Like I said, high detection rates don't excuse everything.
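The trade-off those test numbers capture can be made concrete. As a toy sketch (the counts below are invented, and this is a simplification of how a lab like AV-Test actually scores products), a large-corpus test boils down to two ratios: the share of malware samples a product flags, and the share of clean files it wrongly flags.

```python
# Toy scoring of a large-corpus AV test: detection rate over malware
# samples vs. false-positive rate over clean files. The sample counts
# are invented for illustration.

def detection_stats(verdicts_malware, verdicts_clean):
    """Each list holds booleans: True means the product flagged the file."""
    detection_rate = sum(verdicts_malware) / len(verdicts_malware)
    false_positive_rate = sum(verdicts_clean) / len(verdicts_clean)
    return detection_rate, false_positive_rate

# An "aggressive" engine: catches 99 of 100 malware samples,
# but also flags 2 of 100 clean files.
dr, fpr = detection_stats([True] * 99 + [False],
                          [True] * 2 + [False] * 98)
print(f"detection {dr:.0%}, false positives {fpr:.0%}")
```

A 2 percent false-positive rate sounds small until it's applied across the millions of clean files on real systems, which is why a product can sit at the top of the detection charts and still be a liability.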

You could draw the conclusion that the anti-malware business is a big smelly mess, and you'd be right. I never know anymore what to recommend to people when they ask me what AV to run; I hate them all. But I still run AV on all my production systems (currently we have BitDefender, Kaspersky, Avira and Trend Micro here, though I switch frequently); it's irresponsible not to. But how to judge them? I don't know, and a VB100 certification surely doesn't tell me.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.

For insights on security coverage around the Web, take a look at eWEEK.com Security Center Editor Larry Seltzer's blog Cheap Hack.

Larry Seltzer has been writing software for, and English about, computers ever since (much to his own amazement) he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at DeskTop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs) developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.