It seems round two of the Untangle Anti-Virus Fight Club has begun.
McAfee officials and others are taking issue with the methodology of a live test that pitted proprietary anti-virus products from vendors such as McAfee and SonicWALL against the open source project ClamAV at LinuxWorld. The test was the brainchild of Untangle Chief Technology Officer Dirk Morris, who has said ClamAV was not getting treated fairly by testers.
His answer was the AV Fight Club, which involved products from McAfee, SonicWALL, Kaspersky Lab, Symantec, Sophos, Fortinet, FRISK Software International, WatchGuard Technologies and HAURI. Untangle, which provides an open source network gateway platform, uses ClamAV for its anti-virus protection.
The test consisted of three sets of viruses. The first batch was a basic test set from eicar.org that Morris described in a blog as a universal virus test. The second set was the "in-the-wild" test of viruses picked from Morris' mailbox, viruses he had received in mass quantities over the years, and the third group of viruses was submitted by users.
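The eicar.org set Morris describes refers to the EICAR anti-virus test string, a harmless, standardized 68-byte file that compliant scanners are expected to flag on sight. A minimal sketch of generating such a test file, with the output file name chosen purely for illustration:

```python
# The EICAR anti-virus test string: a harmless, standardized 68-byte
# sequence that compliant scanners treat as if it were malware.
EICAR = (
    "X5O!P%@AP[4\\PZX54(P^)7CC)7}$"
    "EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
)

def write_eicar(path="eicar.com"):
    # Write the string as a plain file; a scanner should detect it on sight.
    with open(path, "w") as f:
        f.write(EICAR)
    return path

if __name__ == "__main__":
    print(len(EICAR))  # the standard string is 68 bytes
```

Running a scanner such as ClamAV against the resulting file is a quick sanity check that detection is working at all before testing real samples.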
Here's what the study found: only ClamAV, Norton Antivirus 2007 and Kaspersky's offering caught 100 percent of the viruses in the first two categories, and they were the top three in overall percentage. Sophos, FRISK and McAfee posted catch rates in the 80 to 90 percent range for the first two groups, and 85.7, 85.7 and 74.3 percent overall, respectively.
HAURI and the gateway appliances from SonicWALL and Fortinet caught about 60 percent of the viruses in the first two groups, though Fortinet and HAURI only caught 45.7 percent of the viruses overall. SonicWALL fared slightly better with a 54.3 percent overall catch rate.
Coming in last was WatchGuard, catching 5.6 percent in the first two virus sets and 2.9 percent overall, an interesting finding since WatchGuard uses ClamAV.
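The overall percentages are consistent with the 35-sample test set the reviewers used (for example, 26 of 35 is 74.3 percent). A quick sketch of that arithmetic, where the per-vendor detection counts are inferred from the reported percentages rather than taken from published figures:

```python
# Catch rate as reported in the Fight Club results: detections / total samples.
def catch_rate(detected, total=35):
    return round(100 * detected / total, 1)

# Detection counts below are inferred from the reported percentages,
# assuming the 35-sample test set; they are not published figures.
print(catch_rate(30))  # 85.7 -- Sophos and FRISK overall
print(catch_rate(26))  # 74.3 -- McAfee overall
print(catch_rate(19))  # 54.3 -- SonicWALL overall
print(catch_rate(16))  # 45.7 -- Fortinet and HAURI overall
print(catch_rate(1))   # 2.9  -- WatchGuard overall
```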
“This to me, and many others, clearly suggests a problem with configuration,” wrote security researcher David Harley in a white paper entitled “Untangling the Wheat from the Chaff in Comparative Anti-Virus Reviews”.
“The wide variation in detection rates compared to the comparatively narrow ranges in most professional tests may also reflect configurational inconsistency, as well as an unreliable sample set.”
Harley, head of a UK-based IT publishing firm called Small Blue-Green World, wrote that the testers seemed to overlook the importance of establishing a level playing field in configuration, such as the heuristics level and whether archives are scanned.
"By the testers' own admission that the original setup had disadvantaged the Sophos product, it looks as though the products were tested pretty much 'out of the box' without considering whether the conditions of the test would disadvantage specific default configurations," he wrote.
Morris countered that every effort was made to ensure fairness in that regard.
“We tried our best to configure each product correctly, and in some cases spent great amounts of time doing so,” he said. “We were open to advice and help from the community, and we remain open to advice on configuration on the different solutions and will definitely take appropriate action if a misconfiguration is discovered.”
Hiep Dang, McAfee's director of anti-malware research, also attacked the test and accused Untangle of having a conflict of interest.
“Their goal was to prove that open-source anti-virus solutions (in this case ClamAV) were just as effective, if not better than commercial anti-virus products,” he wrote. “It seems that they were highly motivated to prove this because evidently they use ClamAV in their gateway product.”
Dang criticized the small sample size used in the test – 35 samples of the hundreds of thousands of pieces of malware currently in the wild – and said McAfee ran its own scan on the exact same files and found it detected everything that was not a password-protected zip or 0-byte file.
Morris countered that McAfee's findings are not dissimilar from the results of the test.
"In our live test they missed Sample 012, a Trojan downloader, and Sample 016, a password-encrypted zip, from the in-the-wild set," he said. "The latter was distributed in an e-mail instructing the user to uncompress the zip and use a provided password. It is crucial for any solution used on the mail server or at the gateway to be able to catch this as a virus. The difference in Sample 012 could come from several places, like more recent signatures or a different version."
As for the sample size, Morris said the intent of the demo was to show how the different engines performed against the viruses he had been exposed to in mass quantities through his inbox, which he said represents that of a typical user, or at a live customer site.
"None of the viruses came from me—I don't write viruses," Morris said. "They did, however, come from infected machines all over the world into my e-mail honeypot and to our customers."
Check out eWEEK.com's Security Center for the latest security news, reviews and analysis. And for insights on security coverage around the Web, take a look at eWEEK's Security Watch blog.