Cybersecurity development has been a piecemeal process. It's time for a stronger, more cohesive approach.
If you look at how security has evolved over the past two decades, you'll see a co-evolution between point problems and point solutions. For example, twenty years ago network security was synonymous with firewalls.
But attackers soon found their way past the front door, which led to technologies like intrusion detection and prevention systems. Eventually, companies rethought the firewall itself and realized that the goal was not to exercise control at the network level, but to identify and control the applications that leveraged the network.
This realization gave rise to the next-generation firewall. Organizations started to realize that network security is not just about keeping bad stuff out, but also ensuring that good data stays in. The result was the development of mechanisms to monitor outbound network traffic for the presence of potentially sensitive data -- for example, data loss prevention technologies that can check for the possibility that credit card or Social Security numbers have leaked outside the enterprise perimeter.
But as threats persisted, organizations needed to continue monitoring their networks, gathering data for post-incident response. Network monitoring, forensics, and some advanced malware protection technologies were developed to address this need.
With all of these technologies in place, it made sense to step back and think about whether they could be fit into a cleaner framework. To this end, Neil MacDonald and Peter Firstbrook of Gartner proposed the Adaptive Security Architecture framework, which identifies four key functional areas for handling threats to enterprise information assets: prediction, prevention, detection, and response. These four areas attempt to cover the entirety of the threat lifecycle and can be applied to any computing environment, whether the legacy data center, private cloud, or public cloud infrastructure.
Prediction is about handling what happens before threats infiltrate your environment. It means being able to identify risks up front and either shore them up or accept them as areas of concern.
Prevention straddles the line between what happens before threats enter and what happens when threats enter. Traditionally, prevention has been about policy enforcement and access control.
Detection is about what happens during an active attack. It is about being able to identify threats -- and ideally, stop them.
However, despite the industry's best efforts, motivated attackers can get past the before and during stages of an attack. Therefore, the after phase becomes more critical. This phase is about what you do when, not if, a threat gets through. It's handled through incident response capabilities, which involve continuous monitoring. If you can gather relevant data up front, you can sift through it after the fact to understand the scope, ramifications, and ultimately the root cause of any threats that compromised your organization. These insights can then be applied back to the prediction phase -- thereby coming full circle.
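The gather-first, sift-later approach can be sketched as follows. The event records and the `scope_incident` helper are hypothetical, but they show how data collected continuously becomes scoping evidence once an indicator of compromise is identified:

```python
from datetime import datetime

# Hypothetical event records gathered by continuous monitoring.
events = [
    {"time": "2014-11-03T09:12:00", "host": "web-01", "dest_ip": "203.0.113.7"},
    {"time": "2014-11-03T09:15:00", "host": "db-02",  "dest_ip": "198.51.100.4"},
    {"time": "2014-11-04T14:02:00", "host": "web-01", "dest_ip": "203.0.113.7"},
]

def scope_incident(events, bad_ip):
    """Once an indicator of compromise (an IP here) is identified,
    sift stored events to learn which hosts talked to it and when."""
    hits = [e for e in events if e["dest_ip"] == bad_ip]
    hosts = sorted({e["host"] for e in hits})
    first = min(datetime.fromisoformat(e["time"]) for e in hits)
    return {"hosts": hosts, "first_seen": first.isoformat(), "events": len(hits)}
```

Here, `scope_incident(events, "203.0.113.7")` reports which hosts were affected and when contact began -- exactly the scope and timeline questions a response team asks after the fact.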
The ability to leverage the lessons learned in one phase to improve the effectiveness of another explains why this framework is adaptive.
The framework is also useful for determining how to allocate security investments because it helps ensure that the appropriate bases are covered. But let's take another step back and use the framework as a way to fundamentally rethink security. Rather than mapping existing technologies to each of the four functional areas, why not take each functional area as the starting point and develop a corresponding technology that relates to it?
This approach would not only help ensure that you are adequately covered, but it would also position your solutions for different functional areas to interoperate more effectively. You can leverage work done to cover one functional area to develop solutions for another. For example, each area is fundamentally about data science. In each case you gather data, process it, analyze it, and derive actionable insights from it. With that mindset, security fundamentally becomes a data science problem, and you can imagine a cadre of security solutions built on top of a common data science platform.
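That common pattern -- gather, process, analyze -- can be sketched as a single pipeline that any of the four functional areas could plug into. The `SecurityPipeline` class and the toy detection example below are illustrative assumptions, not a description of any particular product:

```python
class SecurityPipeline:
    """A minimal gather -> process -> analyze pipeline; each functional
    area (predict, prevent, detect, respond) supplies its own stages."""
    def __init__(self, gather, process, analyze):
        self.gather, self.process, self.analyze = gather, process, analyze

    def run(self):
        raw = self.gather()
        records = [self.process(r) for r in raw]
        return self.analyze(records)

# Example: a toy detection pipeline flagging unusually large transfers.
detect = SecurityPipeline(
    gather=lambda: ["web-01 512", "db-02 904300", "web-01 2048"],
    process=lambda line: dict(zip(("host", "bytes"), line.split())),
    analyze=lambda recs: [r["host"] for r in recs if int(r["bytes"]) > 100_000],
)
```

A prediction pipeline would swap in vulnerability-scan data and risk scoring; a response pipeline would swap in stored forensic events. The platform stays the same -- which is the point.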
Organizations should start thinking along these lines to build a comprehensive security strategy. My hope is that the entire industry will adopt this approach when developing new technologies.
Cybersecurity is a highly dynamic field. As new technologies and buzzwords are introduced into the IT lexicon, we need to think about the implications they will have for information security. Using Gartner's Adaptive Security Architecture framework, we can develop better, faster, more efficient, and ultimately more intelligent tools to safeguard our most sensitive information assets.
Zulfikar Ramzan is CTO of Elastica, a firm taking a data science approach to cloud application security. Prior to joining Elastica, Zulfikar was chief scientist at Sourcefire (acquired by Cisco).