
Spam Filters

Of the 16 vendors we invited, Cloudmark, Commtouch Software, MailFrontier, Red Earth Software, Sunbelt Software and Symantec delivered products that met our testing criteria. Unfortunately, our tests of Sunbelt's iHateSpam yielded unusable results: the Sunbelt cache file became corrupt during testing, leaving its spam engines without the data needed to determine whether a message was spam. Sunbelt support worked with us on the problem, which the company said it had not seen before, but we ran out of time and reluctantly had to exclude iHateSpam from this review. Clearswift, GFI Software, Hexamail, Lyris Technologies, McAfee, NetIQ, SurfControl, Sybari Software, TrendMicro and TumbleWeed Communications declined our invitation.

Accuracy Remains King

We designed our accuracy test to assess antispam technologies out of the box, without reliance on customer tuning (see "How We Tested Spam Filters"). In fact, our methodology highlights the two most critical criteria for selecting a commercial antispam product: low administrative overhead and content-analysis accuracy. Unless you're a spammer, junk e-mail doesn't add a cent to your bottom line--it just eats into productivity--so limiting the cost of controlling it is important.

On the other hand, a misidentified customer e-mail could cost your company, so we weighted accuracy, at 30 percent, as the most important measurement of an antispam product's performance. It's important to understand our definition of accuracy--it differs from vendor marketecture, which pegs accuracy on how well engines and signatures identify spam messages without reporting how often they classify legitimate mail as junk. We've found that spam engines are all too likely to flag legitimate bulk e-mail--mailing-list traffic, newsletters and user-requested product offers--as spam.

For our purposes, spam that made it to our inbox was a false negative, while legitimate e-mail identified as spam and stopped was a false positive. Because false positives are much more detrimental to organizations, for scoring purposes we considered them five times more costly than false negatives (one false positive = five false negatives). We included our nonweighted numbers for comparison, but we stuck with the weighted figures for scoring (see "Spam Filter Accuracy Results," page 60).
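To make that weighting concrete, here's a minimal sketch in Python of how a 5:1 false-positive penalty can change a ranking. The review itself includes no code, and the function name and sample error counts below are hypothetical; only the 5:1 weighting comes from our methodology.

    def weighted_error_cost(false_positives, false_negatives, fp_weight=5):
        # One false positive counts as fp_weight false negatives.
        return false_positives * fp_weight + false_negatives

    # Hypothetical results: Filter A blocks no legitimate mail but misses
    # more spam; Filter B misses less spam but stops eight legitimate messages.
    cost_a = weighted_error_cost(false_positives=0, false_negatives=30)  # 30
    cost_b = weighted_error_cost(false_positives=8, false_negatives=5)   # 45

    print("Filter A weighted cost:", cost_a)
    print("Filter B weighted cost:", cost_b)

On a raw error count, Filter B looks better (13 mistakes versus 30), but once false positives carry their real cost, Filter A comes out ahead--which is exactly why we scored with the weighted figures.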

Vendors report accuracy numbers that can exceed 99 percent; some even tout a zero false-positive rate. Our testing shows these claims are mostly marketing hype based on results of "tests" with custom-tuned software--something we avoided so we could see how accurate these products are out of the box. Not every shop has the time or expertise to tweak spam rules before mail gets tagged incorrectly. As always, caveat emptor: we recommend test-driving any product before signing a software agreement.
