AV-Comparatives is one of the most respected antivirus comparison tests in the world and is performed twice a year. The latest results were published over the weekend, and we did very, very well. Although they never name an overall winner, I think that if they did, it would have been us. At a minimum, we handily beat virtually all of the premium products. This time around, 16 products were tested, including Symantec/Norton, McAfee, AVG, Avira, and Kaspersky.
AV-Comparatives is an interesting test because it measures all three elements a user should look at when choosing an antivirus product: the ability to detect malware, the ability to not falsely detect malware, and speed. So the best product is not necessarily the one with the best detection; it is the one with the best blend of these three elements. These elements pull against each other, and it is very hard to get top scores in all of them. Products that detect a lot of malware can over-detect and generate many false positives, which severely hinder users. A low number of false positives can simply mean a poor ability to detect malware. And while it is very easy to do nothing very fast, increasing speed tends to lower detection ability.
AV-Comparatives assigns each product one of four scores: A+, A, Standard, and Tested (a nice euphemism for failed). To get an A+, a product must have at least 97% detection and no more than 15 false positives.
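To make that scoring rule concrete, here is a minimal Python sketch of the A+ check as just described. The function name and the "A or below" fallback are my own illustrative choices; the report itself defines the full criteria for the lower ratings.

```python
# Minimal sketch of the A+ rule as described above. The criteria for the
# lower ratings are not spelled out here, so anything missing the A+ bar
# is reported simply as "A or below" (an assumption on my part).

def rate_product(new_sample_detection: float, false_positives: int) -> str:
    """Return an illustrative AV-Comparatives-style rating.

    new_sample_detection: fraction of the newer sample set detected (0.0-1.0)
    false_positives: number of clean files wrongly flagged as malware
    """
    if new_sample_detection >= 0.97 and false_positives <= 15:
        return "A+"
    return "A or below"

# Illustrative values, not any product's real results:
print(rate_product(0.98, 5))   # -> A+ (high detection, few false positives)
print(rate_product(0.99, 21))  # -> A or below (too many false positives)
```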
This time around, we were the only AV product to score in the top-5 in each category (and no, that does not mean we were 5th in each category). We were #5 in Detection, #2 in False Positives, and #1 in Speed.
- Detect malware. AV-Comparatives runs about 4 million malware samples through the products. One set of 2.5 million is older, and every product is expected to detect all of them. Then they run a set of 1.5 million newer samples; to get a top score, a product must detect at least 97% of this second set (see the arithmetic sketch after this list). We scored 98%, good for the #5 ranking. But two of the higher detectors (Avira and McAfee) were downgraded for excessive false positives. The top performer in this category was GData, a product that uses two AV engines (one of which is Avast), so it can maximize detections, but at the expense of speed, false positives, and footprint. The only single-engine A+ rated product with higher detection than us was Norton, and it was only slightly higher (98.3%). Many well-known products, such as Microsoft, AVG, and Kaspersky, failed to reach the 97% threshold.
- Avoid false positives. If a product flags more than 15 false positives in the clean set, its score is lowered. This happened to us 6 months ago: we had too many false positives and were downgraded from an A+ to an A. We have spent a lot of time in recent months improving our ability to avoid false positives; we added over 1TB and 150,000 files to our known-clean set. The result was fantastic: only 5 false positives, good for 2nd place and just 1 false positive behind the category leader. Some of the big names had tremendous problems this time around. McAfee, with their "in the cloud" detection, had 41 false positives. Avira was downgraded to an A rating because of 21 false positives. Symantec barely made the A+ cutoff with 13 false positives.
- Process quickly (i.e., speed). We are usually in the top third, but we have been spending time optimizing the product, and this time around we took the #1 position. Slightly behind us was Norton, which has been heavily advertising the "Need for Speed". We agree. Customers need speed: a fast product with high detection and few false positives. And that is Avast! Most competitors ran at about half our speed, and some (such as Microsoft) were roughly three times slower.
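To make the detection numbers from the first bullet concrete, here is a quick back-of-the-envelope sketch in Python. The set size is the approximate figure from the post; the detection count is hypothetical.

```python
# Back-of-the-envelope arithmetic for the detection test described above.
# The 2.5 million older samples are expected to be caught by everyone;
# the 97% bar applies only to the newer set. The "detected" count below
# is a hypothetical example, not any product's real result.

NEW_SET = 1_500_000       # newer samples, where the 97% bar applies

detected_new = 1_470_000  # hypothetical: a product catching 98% of the new set
rate = detected_new / NEW_SET

print(f"New-sample detection: {rate:.1%}")                  # 98.0%
print(f"Meets the 97% bar: {rate >= 0.97}")                 # True
print(f"Samples missable at exactly 97%: {round(NEW_SET * 0.03):,}")  # 45,000
```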
So, were we the best overall? I think so. Check out the results for yourselves and make your own decision: www.av-comparatives.org. To find the report, click on the "On Demand Comparative August 2009" link near the top right of the page.