Posted by David Harley on October 31, 2016.
SANS has been responsible, directly or indirectly, for some very substantial work in IT security, training, and so on: the Internet Storm Center, for example, and its GIAC certifications. As an authority on malware and, in particular, on anti-malware technology, however, its record is rather patchier. For instance, back in 2006, Alan Paller, Director of Research and President of the SANS Technology Institute, claimed, in defence of a poorly-implemented test of anti-virus products involving the creation of ‘new viruses’, that:
‘the leading AV companies … have traditionally not done well in finding and blocking new viruses quickly.’
To which I responded, somewhat peevishly, in the hallowed pages of Virus Bulletin* – not, to be honest, a publication Alan Paller pays much attention to, I suspect – that:
The model of adding definitions to detect each virus as you find out about it has a fatal flaw: it means that the anti-virus vendor is always ‘playing second’ to the malware author … But that is not what modern scanners do. At least, it’s not the only thing they do. They use a variety of proactive techniques, which means that they’re capable of detecting some unknown viruses as they appear, and before they’ve been analysed in the vendor’s lab.
(And that, by the way, was before I started to derive a substantial portion of my income from one of those AV companies.)
In fact, I was so irritated by Paller’s distorted view of history that I later pointed out in a chapter in The AVIEN Malware Defense Guide that:
‘Almost from the beginning, simple “signature” scanning has been bolstered with less specific technologies such as behaviour monitoring, integrity checking, and generic filtering, and more recently by rule-based heuristic analysis and scoring…’
Well, malware has moved on in the last ten years – for a start, viruses are just a small part of the range of malware attacks that we’re faced with nowadays. (Things were already moving that way ten years ago, of course.) And of course, anti-malware technology has evolved accordingly. However, critics of the mainstream industry continue to cling to the myth that what used to be called the anti-virus industry still relies on static signatures. And SANS, it appears, is still clinging to that same tired old fallacy. Perhaps someone there should be checking the Virus Bulletin site more often.
On November 3rd 2016, SANS is presenting a webcast entitled ‘Ready to Replace AV? Criteria to Evaluate NGAV Solutions’. The announcement tells us:
‘Traditional AV, while it will always be part of the infrastructure, no longer works. It cannot stop “next-generation” attacks, such as ransomware and advanced phishing.’
Oh, really? Well, no, not really. At any rate, not unless ‘traditional AV’ means products that only detect viruses, and I don’t think there are many such scanners still around, even in the freeware sector.
I’ve long advocated that in general, computer users should be using multi-layered security suites rather than single-layer scanners. However, even stand-alone anti-malware scanners nowadays incorporate technologies that go far beyond simple (well, not always so simple!) detection of known malware to the recognition of malicious behaviour and characteristics in unfamiliar code.
Over the past few decades, the mainstream anti-malware industry has continuously adapted to meet the challenges of evolving malware technology. Mainstream products are as effective against threats such as ransomware – not quite the new kid on the block that SANS seems to think – and phishing malware as they are against other kinds of malicious code. That doesn’t, of course, mean they’re anything like 100% effective, but as I observed in another of my customary rants* against misleading advertising and misinformation: ‘I’m not holding my breath waiting for a one-size-fits-all, never-needs-updating, 100% effective, never-gets-in-the-way-of-a-legitimate-process solution.’
SANS, however, claims that such threats are ‘Next-generation’ and have to be met by ‘next-generation antivirus’. (An interestingly quaint mixture of the next-gen buzzword and a somewhat archaic product classification. But extraordinarily ill-informed.) Next-gen AV should, we are told:
‘…improve detection, prevent and block attacks, and reduce overhead and false positives through intelligence, behavior and pattern matching, rather than just relying on signatures.’
Since the fossils and dinosaurs of what next-gen vendors dismiss as ‘legacy AV’ have been doing all that for decades, I guess we’re all next-gen. And, in fact, I’ve argued before that the differences between ‘us’ and ‘them’ are mostly terminological. If there’s one thing ‘new generation’ marketing has learned from the old school vendor marketing departments, it’s the value of buzzwords like ‘AI’ and ‘machine learning’ (ML). But these concepts are not the exclusive property of the next-gen crowd: if there is a generational difference, it’s that the fossils don’t rely on a single algorithmic approach, any more than they rely on static signatures, whereas the new boys tend – in their marketing, at any rate – to promote an either/or view, claiming to be signatureless while promoting ML (for instance) as a technology so obviously perfect that malware just fades away when it turns up in the marketing brochure.
Time and again, next-gen companies have promoted the fallacy that mainstream anti-malware is purely reactive, but heuristics, behaviour analysis and blocking, sandboxing, integrity checking, emulation, and even machine learning have been used within the industry for a very long time: to ignore all these moves away from simple static signatures makes no more sense than talking about Microsoft Word as if it’s limited to the same functionality as a 1980s line editor.
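For readers unfamiliar with how rule-based heuristic scoring differs from a simple static signature match, here is a purely illustrative toy sketch. The trait names, weights and threshold are invented for the example and don’t reflect any real product’s implementation; the point is only that a scorer can flag code it has never seen before, based on what the code *does* rather than what it *is*:

```python
# Toy illustration of rule-based heuristic scoring (invented traits/weights,
# not any vendor's real implementation). Unlike a static signature, the
# scorer can flag a sample it has never seen, based on observed behaviour.

SUSPICIOUS_TRAITS = {
    "packed_executable": 30,        # code is compressed/obfuscated
    "writes_autorun_key": 25,       # persistence mechanism
    "disables_security_tools": 40,
    "encrypts_user_files": 50,      # ransomware-like behaviour
    "no_digital_signature": 10,
}

THRESHOLD = 60  # arbitrary cut-off for this toy example

def heuristic_score(observed_traits):
    """Sum the weights of the suspicious traits observed in a sample."""
    return sum(SUSPICIOUS_TRAITS.get(t, 0) for t in observed_traits)

def classify(observed_traits):
    """Flag a sample as suspicious if its combined score crosses the threshold."""
    return "suspicious" if heuristic_score(observed_traits) >= THRESHOLD else "probably clean"

print(classify({"packed_executable", "encrypts_user_files"}))  # suspicious (score 80)
print(classify({"no_digital_signature"}))                      # probably clean (score 10)
```

Real products weigh many more indicators, combine several such engines, and tune the trade-off between missed detections and false positives; but even this caricature shows why ‘relies on signatures’ has been a mischaracterization for decades.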
SANS has also announced that it will share the criteria it has developed for evaluating next-gen tools. There’s certainly a gap there: next-gen vendors are notoriously shy of being tested by third-party testing organizations (though several have joined AMTSO since VirusTotal put its foot down regarding test-shy participants). And one vociferous reseller of a next-gen product actually launched a site encouraging people to set up their own labs to test with samples obtained from that site. (What could go wrong? Well, I’ll go into that in another article.)
Perhaps SANS will come up with some useful ideas, though I don’t think either SANS or EICAR will put AMTSO out of business just yet when it comes to guidance on product testing. Still, I’m not so dazzled by the mainstream testers as to believe that there is no room for fresh thinking on the topic.
However, if you choose to make use of its evaluation criteria, it’s worth wondering whether they are based on a fundamental misunderstanding of first-gen/next-gen technology, and on buying into aggressive and misleading marketing.
*Quoted by kind permission of Virus Bulletin, which holds the copyright on the articles.
David Harley