The Anti-Malware Industry Tries to Save Itself

Testing anti-malware products is hard enough with the old model of detection. We need new tests for the new models that we all know must come.

The declining practicality of the anti-malware business is an old story; one day it's got to collapse under the weight of its technical model. Can it be saved in time?

During the week of Feb. 4, several vendors and testing organizations announced the formation of the AMTSO (Anti-Malware Testing Standards Organization), which is dedicated to developing common standards for testing such products.

It's true that testing standards are very important, and a much more difficult problem than people outside the business would think. I've been involved in software testing for many years, and I've done some anti-malware testing. As with most testing, there are a number of variables you can adjust for.

In a paper presented (with Maik Morgenstern) last year at the AVAR Conference in Seoul, Andreas Marx of AV-Test discussed some of these testing variables for a proposal to test "dynamic detection." Consider some of these:

  • Should hardware be new, old or "typical"?
  • What operating system and version should be tested? Always the latest service pack? Always the latest patches?
  • Should products always be tested with default settings? Tuning up to "aggressive" would also be interesting.
  • How much should virtual machines be used?
  • Is removal important, or just detection?
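Variables like these multiply quickly: each independent choice doubles or triples the number of configurations a tester would need to run for full coverage. A minimal sketch (the dimension names and values below are illustrative examples drawn from the list above, not an AMTSO specification):

```python
# Enumerate a hypothetical anti-malware test matrix from independent
# test variables. Every combination is one candidate configuration.
from itertools import product

dimensions = {
    "hardware": ["new", "old", "typical"],
    "os_state": ["latest service pack", "latest patches"],
    "settings": ["default", "aggressive"],
    "environment": ["physical", "virtual machine"],
    "measure": ["detection only", "detection and removal"],
}

configs = [dict(zip(dimensions, combo))
           for combo in product(*dimensions.values())]

print(len(configs))  # 3 * 2 * 2 * 2 * 2 = 48 configurations
```

Even this toy matrix yields 48 runs per product per sample set, which is one reason testers settle on a fixed subset of configurations rather than exhaustive coverage.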

Many of these variables affect the repeatability of the testing, a factor usually considered non-negotiable. But as a practical matter, both platforms and malware are changing so rapidly that a month from now you'll be busy testing new malware on new OS versions and won't have time to retest on old software.

I could put up a good argument (one that I believe in) on either side of any of these questions, and there are plenty more where those came from. Try this on for size: How about judgment standards? Should the testing just put out numbers, or state, as some AV testing organizations have done historically, that some results are satisfactory and others aren't? Is 90 percent detection acceptable? What about 99 percent? Or 99.99 percent?

With respect to dynamic detection, these really are fair questions, and even a very high detection rate might be unacceptable, depending on your point of view. As Marx points out, the number of unique malware samples AV-Test received went from about 330,000 in 2005 to 872,000 in 2006 to almost 5.5 million in 2007; a 99 percent detection rate against that number still leaves 54,900 samples undetected.
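The arithmetic is worth working through, because at these sample counts even tiny miss rates translate into large absolute numbers of undetected samples:

```python
# Undetected samples at a given detection rate. The sample count is
# AV-Test's 2007 figure of "almost 5.5 million" unique samples.
SAMPLES = 5_490_000

def undetected(rate_percent: float, n: int = SAMPLES) -> int:
    """Samples missed when rate_percent of n are detected."""
    return round(n * (100 - rate_percent) / 100)

print(undetected(90.0))   # 549,000 missed
print(undetected(99.0))   # 54,900 missed
print(undetected(99.99))  # 549 missed
```

Even at 99.99 percent, hundreds of samples slip through; at the rates typical products actually achieve, the miss count runs into the tens of thousands.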

Testing static malware has been hard enough, but it's mostly an automation exercise, one at which Marx is a world-class master. A recent test compared 24 products against more than a million malware samples. That was just to test scanning efficacy; there are many other things you'd be interested in, such as scanning speed and (a related factor) the background load the product places on the system.

As the number of samples continues to escalate, the current approach of developing signatures and releasing them rapidly will eventually have to fail. The only viable approach, as Marx discusses in his paper, is to detect malware more generally by looking for malicious behavior. It's hardly a new idea; it's just never been done well enough that you could rely on it absent signatures. And it won't be good enough until it's that good.
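To make the contrast with signatures concrete, here is a deliberately minimal sketch of behavior-based detection: score a program by the runtime behaviors it exhibits and flag it when the score crosses a threshold. The behavior names, weights, and threshold are all invented for illustration; real products use far richer models, and tuning them (the "aggressive" setting question above) is exactly what makes them hard to test.

```python
# Toy behavior-based detector: no signatures, only observed actions.
# All behaviors, weights, and the threshold are hypothetical.
SUSPICIOUS_BEHAVIORS = {
    "writes_to_system_dir": 3,
    "disables_security_service": 5,
    "adds_autorun_entry": 2,
    "injects_into_other_process": 4,
    "opens_outbound_connection": 1,
}

def behavior_score(observed: set) -> int:
    """Sum the weights of the suspicious behaviors actually observed."""
    return sum(w for name, w in SUSPICIOUS_BEHAVIORS.items()
               if name in observed)

def is_malicious(observed: set, threshold: int = 6) -> bool:
    return behavior_score(observed) >= threshold

print(is_malicious({"opens_outbound_connection"}))                       # False
print(is_malicious({"disables_security_service", "adds_autorun_entry"})) # True
```

The testing problem follows directly: a dynamic test has to actually run each sample and observe its behavior, which is far slower and far less repeatable than scanning a million static files.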

How will we know when it's that good? We'll need good tests, so the AMTSO is a necessary step in the evolution of the industry. The harder steps come next.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.

For insights on security coverage around the Web, take a look at Security Center Editor Larry Seltzer's blog Cheap Hack