Op-ed: Agencies are applying a 1 gigabit solution to a 10 gigabit security problem
Managing big data is key to identifying advanced threats in exploding network traffic.
In August, the National Institute of Standards and Technology published the final version of its step-by-step guide to help government agencies deal with computer security incidents.
The NIST “Computer Security Incident Handling Guide,” based on best practices from government, academic and business organizations, gives incident response teams instructions for creating effective policies and plans to dynamically address cybersecurity threats originating from both inside and outside the organization.
The Guide rightly recognizes the importance of rapid response in alerting other agencies to a security incident, and it emphasizes that any plan needs a clearly defined mission statement, strategies and goals. These recommendations go hand in hand with agencies’ efforts to identify and prevent attacks before they occur.
But also significant in the NIST Guide is its recommendation to review each incident afterward, both to prepare for future attacks and to provide stronger protection of systems and data. This element is especially critical: the cause and source of up to 50 percent of known intrusions in the commercial sector remain unknown, according to 2011 data from the Identity Theft Resource Center. That figure reinforces the need for a thorough post-intrusion review, an aspect of cyber response that is becoming harder to achieve as the volume of data packets flowing in and out of agencies’ networks continues to rise.
Emerging tools that rapidly analyze massive volumes of data from disparate sources (i.e., big data) are playing a tangible role in helping intelligence and law enforcement agencies connect the dots to prevent both physical and cyber attacks. But there is a hitch in this era of more sophisticated cyberattacks: an unprecedented flow of network traffic that agencies must monitor.
Within this environment, a handful of technology and market forces are shaping how agencies can respond to cyberattacks.
Exploding Network Traffic
Many agencies today are saddled with 1 gigabit monitoring solutions in a 10 gigabit world. Legacy solutions have been unable to keep up with exploding data packet traffic on today’s networks, to the point where only a fraction of network traffic can effectively be captured and analyzed after the fact to reconstruct an attack.
As large government agencies and commercial organizations deploy ever-faster networks, one of the greatest challenges they face is enabling their network and security monitoring infrastructure to keep up with the network itself. Legacy packet capture solutions with high rates of packet loss undermine the effectiveness of today’s powerful cybersecurity analysis tools, producing instead the cyber equivalent of a corrupted database. Today, cyber warriors require all the packets, all the flows, all the time.
The ability to manage big data is critical for effective incident response. Government agencies, particularly those involved in intelligence gathering, now expect to be able to go back 30 to 60 days for full packet capture, an enormous data storage requirement that means keeping 6 to 12 petabytes of traffic history per 10 gigabit-per-second (Gbps) link. Data volumes of that size can be handled only by storage solutions with massive scalability.
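As a rough back-of-the-envelope check on that figure, the short Python sketch below works through the arithmetic. The assumptions are ours, not the guide’s: a full-duplex 10 Gbps link running at or near line rate in both directions, with no compression or deduplication of the stored packets.

    # Rough estimate of storage needed for 30-60 days of full packet capture.
    # Assumptions (illustrative only): a full-duplex 10 Gbps link running near
    # line rate in both directions, with no compression or deduplication.
    LINK_RATE_BPS = 10e9        # 10 gigabits per second, one direction
    DIRECTIONS = 2              # capture both inbound and outbound traffic
    SECONDS_PER_DAY = 86_400

    bytes_per_day = LINK_RATE_BPS / 8 * DIRECTIONS * SECONDS_PER_DAY

    for days in (30, 60):
        petabytes = bytes_per_day * days / 1e15
        print(f"{days} days of capture: about {petabytes:.1f} PB")

    # Prints roughly 6.5 PB for 30 days and 13 PB for 60 days; with real-world
    # utilization somewhat below 100 percent, that lands in the 6-12 petabyte
    # range cited above.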
Incident Reconstruction
The importance of directing resources to the technologies and manpower required to prevent cyberattacks cannot be overstated. But there are limits to how far these efforts can go: a Bloomberg Government survey estimates that companies would need to boost cybersecurity spending nine-fold to stop 95 percent of attacks (considered the highest attainable level).
The hard truth, and everyone understands it, is that intruders will get in. At some point, even the strongest defenses will be penetrated. When an incident occurs, organizations must be able to answer several questions: How long has the attack been going on? Where else has this traffic pattern appeared on the network in the past month or year? What is the extent of the damage? How did the intruder get in?
Much as an investigator reconstructs a physical accident or crime, organizations must reconstruct cybersecurity incidents. The capability to cost-effectively capture, store and query massive volumes of traffic history is critical to reconstructing incidents after the fact, all the way down to the level of a single data packet (the DNA evidence of the attack).
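To make that packet-level reconstruction concrete, here is a minimal, purely illustrative sketch; the capture file name and suspect address are hypothetical, and it stands in for whatever query interface an agency’s capture platform actually provides. It walks stored traffic history and reports every packet that touched a suspect host, the kind of question an analyst asks when establishing how long an intruder has been active.

    # Illustrative only: scan a stored capture for traffic involving a suspect
    # host and report when each packet appeared and who it talked to. A real
    # capture store would be queried through its own indexing layer rather
    # than by reading a flat pcap file.
    from datetime import datetime, timezone
    from scapy.all import rdpcap, IP

    SUSPECT = "203.0.113.45"                  # hypothetical address (TEST-NET-3)
    packets = rdpcap("traffic_history.pcap")  # hypothetical capture file

    for pkt in packets:
        if IP in pkt and SUSPECT in (pkt[IP].src, pkt[IP].dst):
            ts = datetime.fromtimestamp(float(pkt.time), tz=timezone.utc)
            print(ts.isoformat(), pkt[IP].src, "->", pkt[IP].dst, len(pkt), "bytes")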
Uncovering Cyberattacks
It is increasingly common for a cyberattack to lurk inside a network for months before being discovered. An example of this is an Advanced Persistent Threat (APT) attack, which places malware inside the perimeter defense of the network, where it can masquerade as an authorized user and hide itself while searching for desired data to steal.
In a cybersecurity survey conducted earlier this year by CounterTack of 100 IT executives responsible for security at companies with revenues exceeding $100 million, 84 percent of respondents conceded some degree of vulnerability to APTs.
APT can be a loaded term, manipulated for a given firm’s product marketing purposes. NIST, in its FISMA guidelines, defines APT in terms of the attacker rather than the nature of the attack: “An adversary that possesses sophisticated levels of expertise and significant resources which allow it to create opportunities to achieve its objectives by using multiple attack vectors (e.g., cyber, physical, and deception).”
The volume of data traffic that can be recorded is determined by the storage capacity available to the monitoring devices on the network. Limited storage severely restricts agencies trying to decipher attacks that may have begun well in the past but become apparent only once they start to do damage. For agencies to recognize APTs and similar threats, they will therefore require high-volume storage solutions that support high-performance file systems, bandwidth-intensive streaming applications, and transaction-intensive workloads.
The ability to reconstruct cyberattacks requires a tremendous amount of data and a framework to mine it. By going back, agencies can in effect move forward, proactively defending against future attacks of a similar or different nature.
Mark Weber is U.S. public sector president at NetApp and Tim Sullivan is president and CEO at nPulse Technologies.