Failure to find all malware in the famous WildList can cause an anti-malware product to fail VB100 certification. Sometimes this is scandalous, as when Microsoft's OneCare failed WildList testing last year to widespread derision. But what does the WildList really prove?
In fact, insiders in the anti-virus industry, especially vendors, are widely derisive of the WildList, viewing it as an outdated burden on their development. The malware it contains is old and unrepresentative of the real threats facing users.
Take a look at the actual malware on the latest (April 2008) WildList. An extraordinary amount of it was making headlines in 2004, back in the heyday of the mail worm. There's W32/BugBear.A-mm from 2002. Go all the way down to the bottom of the list and you'll find W95/Spaces.1445 from 2000. Yes, that's one of two Windows 95 viruses on the list.
And there's a pattern in this list: it's all self-replicating malware, viruses and worms. Research has shown for years that self-replicating malware is not the way people get infected anymore. Virtually all malware of consequence in the wild is Trojan horses that get installed through social engineering. This includes the Storm worm, the only family of malware in the last few years that one could call an outbreak. From what I can see, no variant of it is in the WildList.
Of course, any anti-malware product worth its salt would be working hard to keep up with Storm, and not just by following the thousands of variants. By now all decent anti-malware products have some level of heuristic detection to look more generically for the major malware families.
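To make the distinction concrete, here is a toy sketch of the difference between exact-signature matching and generic family heuristics. All the hashes, markers, and sample bytes are made up for illustration; real engines work on executables, emulation traces, and far richer features, not plain substrings.

```python
import hashlib

# Exact-match signature database: hashes of samples seen before (fabricated here).
KNOWN_HASHES = {
    hashlib.sha256(b"storm_variant_001 payload").hexdigest(),
    hashlib.sha256(b"storm_variant_002 payload").hexdigest(),
}

# A crude family heuristic: byte patterns hypothetically shared across a family.
FAMILY_MARKERS = [b"storm", b"payload"]

def exact_match(sample: bytes) -> bool:
    """Detects only samples whose exact hash is already in the database."""
    return hashlib.sha256(sample).hexdigest() in KNOWN_HASHES

def heuristic_match(sample: bytes, threshold: int = 2) -> bool:
    """Flags a sample if it contains enough family-wide markers,
    even if this exact variant has never been seen before."""
    hits = sum(1 for marker in FAMILY_MARKERS if marker in sample)
    return hits >= threshold

# A brand-new variant: no stored hash matches, but the generic markers still fire.
new_variant = b"storm_variant_999 payload"
print(exact_match(new_variant))      # False
print(heuristic_match(new_variant))  # True
```

The point the column is making follows directly: a per-sample test like the WildList only measures the `exact_match` side, so a product that invests heavily in the `heuristic_match` side gets no credit for it.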
Does the WildList give these products any credit for such work? Does the best heuristic product have an easier time with the WildList than a product with no heuristic capabilities at all? Not a chance. But what if that most advanced product fails to detect W95/Dupator.1503, a Windows 95 virus? It's a black mark on its marketing that probably precludes it from certain bids. It's nuts.
Few if any major anti-virus products are being certified for anything other than Windows XP and Vista, and these Windows 95 viruses almost certainly don't run on those platforms. (According to Symantec, only the 9x-kernel versions of Windows are affected.) Yet AV vendors need to spend time and bloat their products detecting old stuff like this.
There are good anti-malware testing labs out there putting products through more serious paces. I'm a fan of Andreas Marx and his AV-Test lab, which tests dozens of products against huge sample sets, sometimes more than a million distinct samples.
It's not clear what a useful certification for anti-malware would look like. Products can't be perfect; none of them are. Some products do clearly outperform the others in terms of detection, and AV-Test's numbers have borne this out over the years. But performance, even anti-malware performance, isn't everything. Symantec has excellent detection, consistently near the top of the tests every time Andreas runs them, but that doesn't stop them from bloating up your system to the point that computing just isn't worth it anymore. WebWasher is a multi-engine gateway product that is usually at the top of Andreas's lists, but it's so aggressive that it has a problem with false positives. Like I said, high detection rates don't excuse everything.
You could draw the conclusion that the anti-malware business is a big smelly mess, and you'd be right. I never know anymore what to recommend to people when they ask me what AV to run; I hate them all. But I still run AV on all my production systems (currently we have BitDefender, Kaspersky, Avira, and Trend Micro here, but I switch frequently); it's irresponsible not to. How to judge them, though? I don't know, and a VB100 certification surely doesn't tell me.
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.
For insights on security coverage around the Web, take a look at eWEEK.com Security Center Editor Larry Seltzer's blog Cheap Hack