Testing Hocus Pocus

August 17, 2006

Consumer Reports is drawing accusations of misleading testing from the security industry, and I say rightfully so. Igor Muttik of McAfee AVERT makes excellent points in his blog post on the subject of creating viruses for the purpose of testing. Here's my own take on the Consumer Reports methods.
  • Consumer Reports claims they installed the tested antivirus products, granted them Internet access, and then "spent weeks closely monitoring each product and noted how early, if at all, the manufacturer equipped it to detect newly discovered viruses." Exactly how did Consumer Reports know when these allegedly new viruses first appeared? And without knowing when they appeared, how could they possibly know how early a vendor provided detection? There is a way to measure response times scientifically, as AV-Test.org does, but such tests must be run in a carefully controlled environment and compile data over a much longer timeframe than 'weeks'. (A rough sketch of the response-time calculation follows this list.)
  • Consumer Reports claims to have created 5,500 new virus variants (thanks guys, just what this world needs!) to test the products and further claims that their creations are "the kind you’d most likely encounter in real life." Let's hope not!
  • Consumer Reports claims F-Secure AntiSpyware is the top-rated spyware scanner and that Lavasoft's Ad-Aware is fifth. Yet F-Secure's antispyware scanner is Lavasoft's Ad-Aware, so how one can be first and the other fifth is beyond comprehension. (My own tests have shown that both tend to detect approximately 65% of the adware and spyware in the wild.) Interestingly, one of the few products able to remove 98% of the active processes associated with the 100+ live adware and spyware infections used in my tests has been McAfee AntiSpyware 2006 - a product Consumer Reports places second to last on their list. Another stellar - and free - antispyware product has been Windows Defender, otherwise consistently top-rated, which Consumer Reports places last. And Sunbelt's CounterSpy, another consistently top-rated scanner, also received low marks from the Consumer Reports lab.
  • Consumer Reports claims the tests were facilitated by Independent Security Evaluators (ISE) and that the president of that company, Avi Rubin, recused himself from involvement due to a conflict of interest. Perhaps that's so. Or perhaps Mr. Rubin was simply smart enough to know a bad idea when he saw one.
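
To make the response-time point in the first bullet concrete, here is a minimal sketch in Python; the timestamps are purely hypothetical and this is not AV-Test.org's actual methodology. It simply illustrates that the arithmetic is trivial, and that the result is only meaningful if the tester can establish when a sample first appeared in the wild.

from datetime import datetime

def response_time_hours(first_seen, detection_added):
    """Hours between a sample first appearing in the wild and the vendor
    shipping detection for it. Without a trustworthy first_seen value the
    number is meaningless, which is exactly the objection raised above."""
    return (detection_added - first_seen).total_seconds() / 3600.0

# Hypothetical timestamps, purely for illustration:
first_seen = datetime(2006, 7, 3, 14, 0)        # sample first observed in the wild
detection_added = datetime(2006, 7, 4, 2, 30)   # vendor signature update first detects it

print("Vendor response time: %.1f hours" % response_time_hours(first_seen, detection_added))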

Admittedly, I may know very little about vacuum cleaners, cars, coffee pots, and many of the other things Consumer Reports tests - but I do know security software. The methods used, and the conclusions drawn from those methods, cause me to seriously question the validity of any of their more mainstream reviews. I'm actually in the market for a new vacuum cleaner and a new coffee pot, and I'm sure of one thing - I won't be relying on Consumer Reports for buying advice.

Comments
August 19, 2006 at 8:26 am
(1) IBK says:

Why do you not refer to http://www.av-comparatives.org, which provides retrospective tests?

August 21, 2006 at 10:35 am
(2) tombot says:

Oh, poor, poor McAfee. They just can’t catch a break, can they? Get with the program, vendors. Manufacturers have been hating Consumer Reports for years. That’s why they’re a good magazine; if the companies liked CR, CR would be worthless.

August 22, 2006 at 1:03 am
(3) Rana N. Kabir says:

Consumer Reports is the worst fraud ever perpetrated on humanity. These guys couldn't test ice cubes even if their lives depended on it!

I have been an Acoustical Systems Engineer most of my life. I know a thing or two about sound reproduction. And for the last twenty years CR has consistently rated absolute trash as good audio gear, while good-sounding equipment gets low scores.

CR is a business, in the business of making money! They don't have the expertise in-house to do real comparisons. That would require in-depth knowledge of the technologies involved as well as raw brainpower. Since when did a media company have that?

August 23, 2006 at 2:59 pm
(4) jim says:

“Consumer Reports claims to have created 5,500 new virus variants (thanks guys, just what this world needs!) to test the products and further claims that their creations are “the kind you’d most likely encounter in real life.” Let’s hope not!”

You are wrong.
A testing group can appropriately handle a few thousand samples of computer viruses and malware in a lab for testing.
Tens of thousands of vx binaries and their source code are available every day to virus writers and the general public. Viruses and malware are freely available for any intent, whether malicious or academic research. They always will be, because we have something called the internet. CU most likely repacked and crypted a bunch of existing samples. Big deal.

Your bullet point makes no sense. Are you saying that CU harmed the world by making a few variants for testing and using them in a lab? Please let us know who was hurt, and you’ve made a worthwhile point.
It seems that the bullet point is more about casting FUD than making any concrete and valuable point about the testing.

I hope that more av testing groups perform better tests than the garbage that’s been performed in the past. And that these groups create and use samples of “the kind you’d most likely encounter in real life”.
The reason that 80% of all new malware gets past the major av vendors' products today is that av testing has been faulty in the past and the results misleading.

Do you think that any of the 400,000 zombies that Jeanson James Ancheta infected with his bots were running a major AV vendor's product? Of course they were. But the bots were made 'undetectable' to inadequate av solutions by him and his teenage buddies, most likely using common packing techniques similar to what CU performed when creating their testing variants.

It’s too bad that CU hasn’t released more details and data from the testing. For example, what malware families were used, what packers were used to create variants, how the variants were created, and actual results data. Then there may be a worthwhile discussion about these inadequate products’ performance during testing.

December 29, 2007 at 1:55 pm
(5) 3923ntt says:

Is the bottom line caveat emptor?
