McAfee Putting Malware Signatures in the Cloud

By Larry Seltzer  |  Posted 2008-09-08

Will McAfee's Artemis and similar technologies really improve security for end users?

McAfee has made no secret of its Artemis project. It's been in beta most of this year. Now the company is talking it up to the Wall Street Journal, making it sound like it's closer to product form. Click here for a pretentious press release on the technology.

The idea behind Artemis, as the WSJ article says, is being pursued by McAfee's major competitors as well. Trend Micro has already made vague announcements of a cloud-based service, and I suspect we will see similar announcements very, very soon. Essentially, the idea is to offload some malware checks to an online database. When the software flags a program or file as suspicious, probably through behavior checks, it takes some form of hash of the files involved and submits that hash as a query against the vendor's most up-to-date malware database. If there's a hit, the user can be notified and perhaps the malware removed.
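The mechanics are simple enough to sketch. McAfee hasn't published the hash scheme or protocol, so take this as a toy illustration: SHA-256 is an assumption, and a local dictionary stands in for what would really be a network query to the vendor's servers.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Hash the suspicious file's bytes; the client sends only this digest."""
    return hashlib.sha256(data).hexdigest()

# Toy stand-in for the vendor's online signature database. The sample and
# detection name are invented for illustration.
bad_sample = b"malicious payload (illustrative)"
cloud_db = {fingerprint(bad_sample): "Trojan.Example"}

def cloud_lookup(data: bytes):
    """Return a detection name on a hit, or None for an unknown file."""
    return cloud_db.get(fingerprint(data))
```

The point of the scheme is visible even in the toy: the client never downloads the signature set, it ships a small fixed-size digest upstream and gets a yes/no (plus a name) back.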

It doesn't sound very revolutionary, does it? Why not just push updates down faster? Kaspersky claims to do this every hour.

Feeding all those updates out to users is a huge, complicated and expensive task. It's also probably true that most of the updates are "wasted" in the sense that very few users actually get hits on them on their machines. How much more efficient if they could check online without having to distribute the updates?

Which is the more efficient model is a fair question, one better answered through experimentation than speculation, and McAfee appears to have experimented through the spring and summer. It's not surprising that the WSJ story is short on technical detail; it also glosses over the fact that McAfee's malware detection has done badly in the last year. The latest numbers I have from AV-Test look especially bad.

A report from AV-Comparatives many months ago on McAfee's performance with and without Artemis also makes the pre-Artemis performance look bad, but with Artemis the numbers shoot up to more than 99 percent detection. The downside in AV-Comparatives' tests is a high level of false positives. Because the signatures are so new, they have had even less exposure to real-world testing. McAfee tries to combat this with whitelists, but there are limits to how effective that can be.

How well can this work? In theory it can bring a real improvement: the update pipeline slows down the process, especially for users who are not constantly or reliably connected. Other factors, like proxy servers, can make a difference, and so can the performance of the user's own system.

There are many variables affecting efficacy. First on my list: how does McAfee determine that a program is suspicious enough to warrant a database query? This is a huge question, because I think the company has to be conservative about it. What McAfee CEO Dave DeWalt calls a 100-millisecond process (in the WSJ article) doesn't sound like much, but do a lot of that over a high-latency connection and the user is going to notice. Pretty soon we'll be seeing online tips to improve system performance by turning off Artemis with a registry hack.

The whole issue of Internet connections is huge in this. If everyone had fiber to the premises (like, ahem, I do), then it would be a cheap feature to add. But there are still a lot of dial-up users, and even many people with broadband have relatively slow, high-latency connections, especially on the upstream. Artemis and the like represent a bet on faster connections. Perhaps the vendors will test the connection and not even turn the feature on unless it meets the right performance criteria.
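That gating idea could be as simple as timing a few probe queries at startup and enabling the cloud path only if the connection fits the latency budget. No vendor has published such criteria, so this is a hypothetical sketch; the 100 ms budget is borrowed from DeWalt's quoted figure, and `probe` stands in for one round trip to the signature servers.

```python
import statistics
import time

LATENCY_BUDGET_S = 0.100  # the quoted 100 ms per lookup

def should_enable_cloud_lookups(probe, samples=5, budget=LATENCY_BUDGET_S):
    """Time a few probe round trips; enable only if the median fits the budget.

    `probe` is a callable representing one signature query; using the
    median rather than the mean keeps one slow outlier from deciding.
    """
    timings = []
    for _ in range(samples):
        start = time.monotonic()
        probe()
        timings.append(time.monotonic() - start)
    return statistics.median(timings) <= budget
```

A dial-up or congested link would fail the check and the client would fall back to ordinary signature updates, which is presumably the sensible degradation path.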

There are other benefits and detriments to various parties. Vendors get the added benefit of fast feedback on what malware users have, as well as suspicious-looking files that turn out not to be malware, so their own internal testing may improve. Testing for comparison by third parties, however, becomes much harder: Usually such tests are performed on isolated networks with product updates frozen to a specific time, so the comparison is even.

But with Artemis, comparison tests effectively have to be run in parallel, or one vendor will have had more time to update its database than the others. I asked Andreas Marx of AV-Test, a serious expert in AV testing methodology (he has a Ph.D. in it), for perspective on this. He feels that his labs have the capacity to run full tests either in parallel or close enough together that it doesn't matter. Less sophisticated test labs may have more of a problem.

In the end, Artemis is a marginal advance. It promises more up-to-date detection, but it's probably inevitable that this will bring with it higher rates of false positives. I don't like the sound of that trade-off. Vendors really have to keep the false positives down, even if it diminishes the effectiveness of detection.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.

For insights on security coverage around the Web, take a look at Security Center Editor Larry Seltzer's blog Cheap Hack.

Larry Seltzer has been writing software for, and English about, computers ever since (much to his own amazement) he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at Desktop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs), developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.
