At the beginning of the month, a colleague I get along with particularly well brought me his home machine and asked me to take a look at it, because it was infected by a virus. As it happens, the same colleague had asked me for advice some months earlier when he wanted to buy an anti-virus, and I recommended Eset NOD32, which is the product I chose to run in our company. I am obviously well aware that anti-virus products aren’t bulletproof, but it still makes you look bad when someone reports an infection while using the very product you recommended for their protection. I plugged his machine into my isolated lab/test network and started to investigate...

The anti-virus was active and its definitions were up-to-date, but sure enough, Mark Russinovich-sama’s Autoruns revealed some strange entries pointing to DLLs named dezifamu.dll or huyajuni.dll: randomly generated names, typically used by malware to make automatic removal by security products or clean-up scripts more difficult. Before cleaning up this system and the dozen of files and registry entries the malware had created on it, I was curious to know why my anti-virus of choice fared so badly against what turned out to be a Vundo variant, and failed to detect any of its files.
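Just to illustrate how cheap that naming trick is (this is a toy sketch, not Vundo’s actual code), names like these can be produced by simply alternating random consonants and vowels, which is why any removal script relying on a fixed list of file names is doomed from the start:

```python
import random

CONSONANTS = "bcdfghjklmnpqrstvwxyz"
VOWELS = "aeiou"

def random_dll_name(syllables=4):
    """Build a pronounceable random name, e.g. 'dezifamu.dll' or 'huyajuni.dll'."""
    name = "".join(random.choice(CONSONANTS) + random.choice(VOWELS)
                   for _ in range(syllables))
    return name + ".dll"

if __name__ == "__main__":
    # Every run yields fresh names, so a static blocklist can never keep up.
    for _ in range(5):
        print(random_dll_name())
```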

I uploaded each of these files to the VirusTotal website (a web service that scans a submitted file with almost 40 malware-scanning engines from all the major products on the market) and, surprisingly, none of them flagged the files as malicious code. In some way, I felt relieved, because my anti-virus of choice was no worse than any other on the market, but a couple of milliseconds later I realized that it rather meant that they were all equally bad, and that NO anti-virus on the market could have stopped this threat, which is, I believe, their mission.
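For the curious, submitting samples this way can be automated. Here is a minimal sketch against VirusTotal’s public API; the v2 endpoint, parameter names and response fields are assumptions based on their documentation and may change:

```python
import requests

API_KEY = "your-virustotal-api-key"  # placeholder, not a real key
SCAN_URL = "https://www.virustotal.com/vtapi/v2/file/scan"  # assumed v2 endpoint

def submit_sample(path):
    """Upload a suspicious file to VirusTotal and return its resource id."""
    with open(path, "rb") as sample:
        response = requests.post(
            SCAN_URL,
            data={"apikey": API_KEY},
            files={"file": (path, sample)},
        )
    response.raise_for_status()
    # The 'resource' field (assumed) identifies the submission for later report lookups.
    return response.json()["resource"]

if __name__ == "__main__":
    print(submit_sample("dezifamu.dll"))
```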

We now have to remind ourselves of the technologies behind most anti-virus products, the most common one being signatures/definitions: database files, updated daily (and sometimes several times a day), which contain the detection signatures used by the anti-virus’s on-demand and real-time file scanning. The technology behind these scanners is impressive: managing to scan a whole hard disk for hundreds of thousands of viruses in a matter of half an hour certainly is a feat and a clever use of advanced algorithms. But the weak point of this method has always been that it is not reactive enough to catch the newest threats (it misses any malware for which it has no signature yet), and that some people are bound to be infected before the anti-virus companies’ researchers can get their hands on samples to dissect.

Because of that, anti-virus products were complemented with heuristics engines, able to analyze a program’s behavior and check for dangerous system calls or other sensitive actions in order to identify unknown hostile code. The problem with this method, obviously, is that at its most aggressive settings it tends to trigger a lot of false positives, so anti-virus developers have to find a compromise between security and a false-positive rate so high it would bother users a bit too much (and UAC has already taught us that they do not like that very much).
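To make the signature approach concrete, here is a deliberately naive sketch of pattern-based scanning. Real engines rely on much smarter multi-pattern algorithms (Aho-Corasick automata, hashing, emulation), and the byte patterns below are made up for the example:

```python
# Fictional signature database: byte pattern -> detection name.
SIGNATURES = {
    b"\x60\xe8\x00\x00\x00\x00\x5d": "Example.Packer.A",
    b"\xde\xad\xbe\xef\x13\x37": "Example.Trojan.B",
}

def scan_file(path):
    """Return the names of all signatures whose bytes appear in the file."""
    with open(path, "rb") as f:
        data = f.read()
    return [name for pattern, name in SIGNATURES.items() if pattern in data]

if __name__ == "__main__":
    hits = scan_file("suspect.bin")
    print("infected: %s" % ", ".join(hits) if hits else "clean (as far as we know)")
```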

It seems clear to me that both of these approaches are outdated. The signature one cannot react fast enough to guarantee that “in the wild” malware will not get into a system: as I said earlier, before the anti-virus researchers receive the samples, some people are bound to be infected first, and sadly, even buying an expensive anti-virus product and subscription isn’t enough to guarantee the user’s safety in that case. There are also plenty of code compression and encryption mechanisms that can be used to turn a detected virus into an undetected one, as an original packing or encryption algorithm applied to the malware can render signature matching useless. It is also possible to slightly modify the virus’s code to defeat the detection signature used by the anti-virus: this explains the number of variants of popular viruses and worms, which anti-virus companies often differentiate by appending letters to their names, for example Vundo.ABC. It is not rare to see three-letter-long “version” numbers which, at 26 possibilities per letter, makes up to 26³ = 17,576 variants… too many for anti-virus products to detect them all in due time. Ironically, but quite commonly in the IT-security world where most security tools turn out to be double-edged swords, the same VirusTotal web service that many people use to confirm doubts about strange files is often used by malware developers to make sure their variations are different enough from the detected ones to pass the detection tests of the most-used anti-virus products worldwide.
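Here is a small, harmless illustration of why even a trivial encoding layer breaks byte-signature matching. The “signature” is the made-up pattern from the sketch above; a real sample would of course carry a decoder stub so the payload could still run:

```python
SIGNATURE = b"\xde\xad\xbe\xef\x13\x37"  # fictional pattern from the scanner sketch

def xor_encode(data: bytes, key: int) -> bytes:
    """XOR every byte with a one-byte key: the crudest possible 'packer'."""
    return bytes(b ^ key for b in data)

payload = b"...header..." + SIGNATURE + b"...payload..."
packed = xor_encode(payload, key=0x5A)

print(SIGNATURE in payload)  # True: the original variant is detected
print(SIGNATURE in packed)   # False: same code, but now an "undetected variant"
```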

The issue here isn’t only the detection rate, but also how much time it takes for anti-virus vendors to catch up. As a good citizen of the IT-world, I immediately submitted the samples I had found to ESET, the company behind NOD32, along with some explanations and the quick analysis I had made of these files, expecting them to be added to the signatures as soon as possible. As I was curious to see how fast Eset and its competitors would react, I ran daily checks on the samples I had sent using VirusTotal. The results were disastrous, with NOD32 and many other big names of the anti-virus industry still not detecting some core components of this virus more than 15 days later, even though I had submitted the samples to both Eset and VirusTotal. I didn’t take any screenshots or save URLs of the VirusTotal results when I first scanned the files, since I didn’t know yet that I would write an article about this, but I found a URL of the check I ran roughly 3 days after I got the samples, which shows that even 3 days later, the malware was only detected by six engines. Today, it is detected by 28 of the 37 scanning engines used by VirusTotal, which means that 9 anti-virus products still do not detect it. While it isn’t surprising to see some barely known products miss it, it certainly is to see Kaspersky, a big name of the AV industry, still missing this file after so much time.
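Those daily checks are easy to automate. Below is a rough sketch of the loop I could have scripted, again against VirusTotal’s v2 report endpoint (URL, parameters and JSON fields are assumptions from their public documentation):

```python
import time
import requests

API_KEY = "your-virustotal-api-key"  # placeholder, not a real key
REPORT_URL = "https://www.virustotal.com/vtapi/v2/file/report"  # assumed v2 endpoint

def check_detection(resource):
    """Fetch the latest scan report; returns (positives, total) or None if not ready."""
    response = requests.get(
        REPORT_URL,
        params={"apikey": API_KEY, "resource": resource},
    )
    response.raise_for_status()
    report = response.json()
    if report.get("response_code") != 1:
        return None  # report not available yet
    return report["positives"], report["total"]

if __name__ == "__main__":
    sample_hash = "0" * 64  # placeholder SHA-256 of the submitted sample
    while True:
        result = check_detection(sample_hash)
        if result:
            print("detected by %d of %d engines" % result)
        time.sleep(24 * 3600)  # re-check once a day
```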

3rd December: only 6 anti-virus engines detected this file after roughly 3 days

22 December: 28 anti-virus engines detected this file. Better, but still missed by some

It also turns out that this virus, like many others, is able to regenerate itself completely if at least one of its components manages to run at least once, which makes detecting all of them a hard requirement for complete protection and clean-up. Unfortunately for consumers and IT professionals, given the complexity of the task, most anti-virus companies give up on detecting every component and tend to only detect the dropper (the program deploying the payloads onto the computer) and the most common viral modules, which may not be enough for a full disinfection, for the reasons explained above. It is one of the reasons why my custom BartPE-based boot CD includes the on-demand scanning components of no fewer than 5 different anti-virus products, so that they complement each other’s detection weaknesses.
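The idea behind that boot CD is trivial to express in code: run every engine over the same file and treat a detection by ANY of them as a detection. The scanner command lines below are hypothetical placeholders, not the actual products’ invocations:

```python
import subprocess

# Hypothetical command-line scanners; substitute each product's real
# on-demand scanner binary and flags.
SCANNERS = [
    ["scanner_a", "--scan"],
    ["scanner_b", "/check"],
]

def is_flagged(path):
    """Return True if at least one engine reports the file as malicious."""
    for command in SCANNERS:
        # Convention assumed here: a non-zero exit code means "infected".
        if subprocess.run(command + [path]).returncode != 0:
            return True
    return False
```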

Based on this evaluation of the current situation, I’m still amused by the terminology used by some independently conducted anti-virus tests such as the VB100. It turns out that NOD32 managed to earn the VB100 award more than 50 consecutive times and is one of the few products to have achieved such a feat. In addition to its small memory footprint and reasonable use of computer resources compared to the competitors’ usually-bloated software (Norton/McAfee anyone?), the high detection rate was definitely the most important feature that made me buy this product for my company, and the VB100 is one of the only tests that regularly evaluates a large panel of anti-virus products. To earn this prestigious VB100 award, an anti-virus should, according to the official VB100 website, “detect all In the Wild viruses during both on-demand and on-access scanning”. If we are talking about “in the wild” viruses, NOD32 clearly failed here, as it let through an infection that was, to me, “in the wild”, my definition of the term being that a virus can actually be caught by users. The term turns out to be rather misleading, though, as it actually refers to a list compiled by a different organization: the WildList (whose website design doesn’t quite inspire the seriousness it should for this kind of topic). This list is actually surprisingly small for a so-called “in the wild virus” list, and a term like “most encountered/widespread viruses” would certainly be more suitable. This test, along with its surprisingly confusing terminology, causes users to feel safer than they actually are. Users tend to believe that an anti-virus is like a force-field around their computer and that nothing can get through it: after all, they paid for it, so they expect it to do its job, don’t they?