Bland Information Overload or Business-Critical Intelligence?

Nov 03, 2015
5 minutes

Today there is much debate on the role of intelligence in cyber strategies. Like so much in the world of cyber, as the requirements have changed, so have the capabilities being offered. With this in mind, what are the characteristics of modern cyber intelligence?

When the Heartbleed vulnerability was announced last year, like most people, I started searching for more information. Within the first 24 hours there were already hundreds of thousands of articles online, many providing technical insight into the exploit. There was no shortage of information about the vulnerability and how it worked; I even found a video tutorial on how to leverage the exploit.

At a recent leadership event, I observed a very heated debate on intelligence. In many cases, the points made reinforced the view that intelligence is mostly noise with limited value.

There is indeed an abundance of technical information on how threats and vulnerabilities work, as well as known bad domains/IPs and so on. The challenge is that, with so much raw data, how do we ingest it in a way that actually delivers value?

E.O. Wilson, the sociobiologist, suggested that “we are drowning in information, while starving for wisdom”[1]. So what is the difference? The FBI defines intelligence as “information that has been analysed and refined so that it is useful to policy makers in making decisions”[2].

Depending on which security vendor you follow, it is suggested that there were hundreds of millions of new malware iterations created in 2014. Looking at the CVE list[3], 9,751 vulnerabilities and exposures were documented over the course of 2014. Taking that last number alone, that is nearly 27 per day: far too much raw data to base daily security decisions on.

When thinking about these numbers, three points stand out as opportunities for intelligence to add value:

  1. Ensuring protection against as many of these threats as rapidly as possible.
  2. Being able to identify the high-risk attacks that are likely to impact your business. From the total volume of attacks, we need to identify the few we may choose to be incrementally proactive against. These are typically the attacks targeting your industry or geography: by their nature they are more targeted, with specific, focused goals that can have greater business impact.
  3. Reverse analysis: recent public breaches covered in the media have highlighted the problem of indicators of compromise (IOCs) being found but not acted upon. Being able to look up suspicious IOCs to see whether they correspond with existing campaigns or techniques is a growing requirement for security operations teams (a minimal lookup sketch follows this list). ISACs and industry collaboration groups are springing up around the world to do this at a peer-to-peer level, but their reach is limited to each group’s membership.

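To make the third point concrete, the sketch below shows what that kind of reverse lookup could look like in practice. It assumes a hypothetical local JSON copy of an external intelligence feed (intel_feed.json) and made-up indicator values; a real security operations team would query a threat intelligence platform or an ISAC sharing feed instead.

```python
import json

# Hypothetical local copy of an external threat intelligence feed,
# keyed by indicator value (IP, domain, file hash), e.g.:
# {"203.0.113.7": {"campaign": "ExampleCampaign", "confidence": 90}}
def load_feed(path="intel_feed.json"):
    with open(path) as f:
        return json.load(f)

def reverse_lookup(iocs, feed):
    """Return the subset of observed IOCs that match known campaigns."""
    return {ioc: feed[ioc] for ioc in iocs if ioc in feed}

if __name__ == "__main__":
    observed = ["203.0.113.7", "malicious.example.net", "198.51.100.22"]
    feed = load_feed()
    for ioc, context in reverse_lookup(observed, feed).items():
        # The context is what turns a bare indicator into a decision:
        # which campaign it belongs to and how confident the source is.
        print(f"{ioc} -> campaign={context.get('campaign')}, "
              f"confidence={context.get('confidence')}")
```

The value is less in the lookup itself than in the context that comes back with a match, which is what lets a team decide whether an indicator deserves an incident response.
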
From discussions that I have had, there are specific elements required to turn information into business-valuable intelligence that enables decisions to be made and actions to be taken. All of these are interdependent; if any single one is missing, the value quickly collapses. They are the following:

  • Timely – Seemingly obvious, yet the reality is that, as attacks have become more bespoke and their lifespan has shortened, the time to receive actionable, contextual intelligence is critical.
  • Actionable – Intelligence is only useful if it includes information on what the recipient should do next (i.e. mitigate the attack). Too much threat information today simply describes the problem, requiring human intervention that makes the intelligence no longer timely.
  • Machine readable – Where attacks are constrained only by CPU power and network speed, providing intelligence that requires human inspection inserts an analogue process into a digital problem. If we cannot apply actions directly at the technology level, without a human proxying the information, we add unsustainable lag into the process. This is critical both for the time it takes to apply preventative controls and for coping with the sheer volume of today’s attacks (a minimal ingestion sketch follows this list).
  • Low false positives – If we are to apply intelligence without human input, we must have high levels of confidence in the information received.
  • Contextual – From a risk management perspective, this means being able to identify the relevant, current, high-risk threats to your business. Likewise, doing the reverse lookup on indicators requires context to qualify what the attack is and does.
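
To illustrate how the machine-readable, actionable and low-false-positive elements fit together, here is a minimal sketch of ingesting a feed and acting on it without human triage. The feed layout, file name, confidence threshold and the iptables enforcement step are all assumptions for illustration; a real deployment would use whatever indicator formats and enforcement points it already has.

```python
import json
import subprocess

CONFIDENCE_THRESHOLD = 90  # only auto-apply high-confidence indicators

def apply_block(ip):
    # Illustrative enforcement step: add a drop rule via iptables.
    # A real deployment would call whatever enforcement point it uses
    # (firewall API, proxy, EDR); iptables also requires root privileges.
    subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"],
                   check=True)

def ingest(feed_path="machine_readable_feed.json"):
    """Read a machine-readable feed and act on it without human triage.

    Assumed (hypothetical) entry layout:
    {"indicator": "203.0.113.7", "type": "ip",
     "action": "block", "confidence": 95}
    """
    with open(feed_path) as f:
        entries = json.load(f)

    for entry in entries:
        if entry.get("type") != "ip" or entry.get("action") != "block":
            continue  # this sketch only handles one indicator type
        if entry.get("confidence", 0) < CONFIDENCE_THRESHOLD:
            continue  # low-confidence items still go to a human queue
        apply_block(entry["indicator"])

if __name__ == "__main__":
    ingest()
```

The confidence threshold is the link back to low false positives: only indicators the source is highly confident about are applied automatically, while everything else can still be routed to an analyst.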

Summary

Today’s cyber challenge is a numbers game. With the volume of what is happening globally and the volume of security events discovered internally, we are creating a big data challenge that will only expand as we add more IPs and more security capabilities, and as the volume of attacks continues to grow. SIEM tools typically help consolidate internal events, but that is only part of the challenge: we also need to consolidate external information and add context.
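
As a simple illustration of that last point, the sketch below enriches internally collected events with external threat context. The event fields and the in-memory intel table are assumptions; in practice the events would come from the SIEM and the context from a threat intelligence source.

```python
# The external intel table and event fields below are assumptions for
# illustration; in practice events come from the SIEM and the context
# from a threat intelligence source.
EXTERNAL_INTEL = {
    "198.51.100.22": {"campaign": "ExampleCampaign", "risk": "high"},
}

def enrich(event):
    """Attach external threat context to a single internal event."""
    context = EXTERNAL_INTEL.get(event.get("remote_ip"))
    if context:
        event["threat_context"] = context
    return event

events = [
    {"source": "firewall", "remote_ip": "198.51.100.22", "action": "allowed"},
    {"source": "proxy", "remote_ip": "192.0.2.10", "action": "allowed"},
]

enriched = [enrich(e) for e in events]
# Events that pick up high-risk external context can be prioritised;
# the rest remain ordinary volume.
high_risk = [e for e in enriched
             if e.get("threat_context", {}).get("risk") == "high"]
print(high_risk)
```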

What is key is that we typically have finite cybersecurity staff and are living with an analogue limitation in a digital world. The more we can filter and automate activity (machine to machine), the closer we get to working at the same digital speed as the attack. There will always be a requirement for some level of human intervention, but today those humans are typically tied up with tasks that should be automated so they can be completed in a timely manner. Having the right intelligence is an enabler to increase automation and free up cybersecurity staff to focus on the activities that genuinely need them.

[1] https://en.wikiquote.org/wiki/E._O._Wilson
[2] https://www.fbi.gov/about-us/intelligence/defined
[3] http://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=2014

