After I completed the testing and writing for PC Magazine's recent antispam product roundups, I decided it was time to eat my own dog food, to use a phrase I believe was coined by Microsoft's David Cutler. It refers to using the products you try to get your customers to use. Basically, it's the business version of "practice what you preach."
I might have implemented a server-based solution for my in-house needs, but I decided not to go quite that far—yet. Of the two personal antispam products I tested, Symantec's Norton AntiSpam 2004 clearly did better. One of my observations from the recent testing was that the quality of spam filtering, as a general matter, seemed to have improved a great deal since the last large roundup of products I performed, also for PC Magazine, in late 2002.
At the time of that earlier review, I was still not moved enough by any product to actually run it myself. But about 10 months later my patience with spam was being rapidly exhausted, so I had a somewhat personal stake in the outcome of the review process.
The results were mixed. For example, the review numbers for McAfee SpamKiller were disappointing.
And then there was the follow-up to the SpamAssassin saga. Those of you who follow PC Mag's reviews closely might remember that our original Editors' Choice last year was going to be DeerSoft's SpamAssassin Pro. About a day or two before the absolute drop-dead deadline for the review, Network Associates bought DeerSoft, and we were able to get that fact into the Editors' Choice box.
What we didn't find out until later, and what we were able to add only to the online copy, was that Network Associates had decided to pull all the DeerSoft commercial products off the market and integrate the technology into McAfee SpamKiller and other Network Associates products.
So, SpamKiller 5 was the first product of this integration. But based on the results, the marriage doesn't seem to be a happy one.
Norton AntiSpam, on the other hand, was Symantec's second attempt, following up its really lame first effort as part of Norton Internet Security 2003. I thought the 1.5 percent false-positive rate it generated was close enough to good that I should give it a try.
A feature of Norton AntiSpam is its log of statistics. Here are my spam statistics from when I began using the software on September 25, 2003, through Sunday, November 9, 2003. Let's call that 44 days.
- E-mail scanned: 14,737 messages
- Average (over the 44 days): 335 per day
- Sent e-mail: 781 messages
- Valid e-mail: 6,023 messages (40.87%)
- Mail correctly identified: 5,996 messages (99.55%)
- Spam: 8,714 messages (59.13%)
- Spam correctly identified: 8,103 messages (92.99%)
The most stunning number in this list is the sheer quantity of mail I receive. Something is clearly wrong with me—I must make a note to get myself an actual life (actually, a lot of it is security mailing lists that I don't read thoroughly). Maybe this weekend.
Still, it looks like I had 27 false positives (0.45 percent of valid mail), and that squares with what I remember from my use of the product. NAS counts a false positive when I manually scan the Spam folder in Outlook and mark a non-spam message with the "This is not Spam" button. Conversely, when I mark a message in the Inbox with the "This is Spam" button, it gets tracked as a false negative. The difference between the "Spam" and "Spam correctly identified" figures totals 611 messages, or a hair over 7 percent of spam.
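For the record, here is how those derived figures fall out of Norton's raw counts. This is just my own arithmetic in a quick Python sketch, not anything the product itself exposes:

```python
# Back-of-the-envelope check of the Norton AntiSpam log numbers above.
# All counts come straight from the statistics list; the derived
# percentages should match what the product reports.

scanned = 14_737          # total e-mail scanned over 44 days
valid = 6_023             # legitimate messages
valid_correct = 5_996     # legitimate messages left in the Inbox
spam = 8_714              # actual spam messages
spam_correct = 8_103      # spam correctly routed to the Spam folder

false_positives = valid - valid_correct   # good mail marked as spam
false_negatives = spam - spam_correct     # spam that reached the Inbox

print(f"Average per day: {scanned / 44:.0f}")                # ~335
print(f"False positives: {false_positives} "
      f"({false_positives / valid:.2%} of valid mail)")      # 27, 0.45%
print(f"Missed spam:     {false_negatives} "
      f"({false_negatives / spam:.2%} of spam)")             # 611, ~7%
print(f"Spam catch rate: {spam_correct / spam:.2%}")         # 92.99%
```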
Now, I'm pretty happy with the product's ability to find spam; catching 93 percent of it is pretty good. At the same time, my instinct says that a 0.45 percent false-positive rate is a small number.
But those 27 false positives over 44 days are non-trivial. The figure tells me I should still check the Spam folder periodically, and even relatively often, because if I don't, the sheer volume of mail that accumulates in it becomes intimidating to sift through.
I was also struck by the fact that the statistics page reported that the last antispam update was released on 8/29/2003. If Symantec can go a month and a half without an update (and yes, I do run LiveUpdate frequently), the company can't be following the spam business the way it follows the virus business.
Incidentally, the most successful technique I've seen for fooling Norton AntiSpam is to make the sending address the same as the recipient address. Like most products, Norton has a whitelist that overrides its filtering rules, and that list includes the user's own address. It's not uncommon for users to cc: themselves on messages, so there's a real need for something to deal with this issue. Frankly, I don't know why all spammers don't use this trick; it doesn't even stop them from changing the From display name on the message (as opposed to the address), which gives the superficial appearance that the message is coming from someone else.
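To make the hole concrete, here is a minimal sketch of a whitelist check keyed on the From: address alone. The addresses and function names are my own hypothetical examples, and this is illustrative logic, not Norton's actual implementation:

```python
# A minimal sketch of why whitelisting the user's own address is
# exploitable. Illustrative logic only, not Norton's implementation.
from email.utils import parseaddr

MY_ADDRESS = "user@example.com"   # hypothetical account
whitelist = {MY_ADDRESS}          # many filters seed the list with the
                                  # user's own address so that cc:-to-self
                                  # mail gets through

def passes_whitelist(raw_from_header: str) -> bool:
    """Return True if the From: address is whitelisted.

    Only the address part is compared, so a spammer can send
    'From: "Anyone At All" <user@example.com>' and sail past the
    whitelist while still displaying a different name in the client.
    """
    display_name, address = parseaddr(raw_from_header)
    return address.lower() in whitelist

# The spoof: sender address == recipient address, arbitrary display name.
print(passes_whitelist('"Hot Stock Tips" <user@example.com>'))  # True
```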
Meanwhile, the extra experience with the software lets me confirm my statement in the review that Norton AntiSpam is slow. Marking a message with "This is Spam," for example, takes far too long. Right now I'm in the Log Viewer, and it's torturously slow. Speaking of the log, it doesn't seem to give much information; I went in hoping to look up those 27 false positives, but I don't think it even has all of today's mail in it.
One more bit of perspective on the amount of spam I receive: it's actually a lot more than the 59 percent figure reported by Norton, because some of my e-mail accounts are already filtered at the server. Consider three addresses of mine that run through FrontBridge's server-based spam filtering: in the last month, that service found 523 spam messages, and only one of them was a false positive.
Perhaps the answer is to switch to Outlook 2003. In our testing it generated not a single false positive, although it found far less spam. Oh well, the products get better, but the decisions we have to make just keep getting harder.
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.