Cloud // Infrastructure as a Service
Commentary
8/19/2014
11:45 AM
Zulfikar Ramzan

Cybersecurity Demands New Framework

Cybersecurity development has been a piecemeal process. It's time for a stronger, more cohesive approach.

If you look at how security has evolved over the past two decades, you'll see a co-evolution between point problems and point solutions. For example, twenty years ago network security was synonymous with firewalls.

But attackers soon found their way past the front door, which led to technologies like intrusion detection and prevention systems. Eventually, companies rethought the firewall itself and realized that network security was not about exercising control at the network level, but about identifying and controlling the applications that leveraged the network.

This realization gave rise to the next-generation firewall. Organizations also came to see that network security is not just about keeping bad stuff out, but about ensuring that good data stays in. The result was the development of mechanisms to monitor outbound network traffic for potentially sensitive data -- for example, data loss prevention technologies that check whether credit card or Social Security numbers have leaked outside the enterprise perimeter.
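As a rough illustration of how such an outbound check works (a minimal sketch, not any vendor's actual implementation; the patterns and the `scan_outbound` helper are hypothetical), pattern matching plus a checksum can flag card and Social Security numbers in a payload:

```python
import re

# Hypothetical patterns for two common classes of sensitive data.
# Real DLP engines add context rules, broader formats, and tuning.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def luhn_ok(number: str) -> bool:
    """Luhn checksum to cut false positives on card-like digit runs."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(d * 2, 10)) for d in digits[1::2])
    return total % 10 == 0

def scan_outbound(payload: str) -> list[str]:
    """Return the kinds of sensitive data found in an outbound payload."""
    hits = []
    for kind, pattern in PATTERNS.items():
        for match in pattern.finditer(payload):
            # Card-like digit runs must also pass the Luhn check.
            if kind == "credit_card" and not luhn_ok(match.group()):
                continue
            hits.append(kind)
    return hits
```

A real deployment would run this kind of check inline on egress traffic or mail gateways, and the hard part is the false-positive rate, not the matching itself.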

But as threats persisted, organizations needed to continue monitoring their networks, gathering data for post-incident response. Network monitoring, forensics, and some advanced malware protection technologies were developed to address this need.

[What is the "Goldilocks Zone" of security? Read Why 'Goldilocks Zone' Of Data Center Security Makes Sense.]

With all of these technologies in place, it made sense to step back and think about whether they could be fit into a cleaner framework. To this end, Neil MacDonald and Peter Firstbrook of Gartner proposed the Adaptive Security Architecture framework, which identifies four key functional areas to handle threats to enterprise information assets: prediction, prevention, detection, and response. These four areas attempt to cover the entirety of the threat lifecycle and can be applied to any phase of computing, whether in the legacy data center, private cloud, or public cloud infrastructure.

Prediction is about handling what happens before threats infiltrate your environment. It means being able to identify risks up front and either shore them up or accept them as areas of concern.

Prevention straddles the line between what happens before threats enter and what happens when threats enter. Traditionally, prevention has been about policy enforcement and access control.

Detection is about what happens during an active attack: being able to identify threats and, ideally, stop them.

However, despite the industry's best efforts, motivated attackers can get past the before and during stages of an attack. Therefore, the after phase becomes more critical. This phase is about what you do when, not if, a threat gets through. It's handled through incident response capabilities, which involve continuous monitoring. If you can gather relevant data up front, you can sift through it after the fact to understand the scope, ramifications, and ultimately the root cause of any threats that compromised your organization. These insights can then be applied back to the prediction phase -- thereby coming full circle.

The ability to leverage the lessons learned in one phase to improve the effectiveness of another explains why this framework is adaptive.

The framework is also useful for determining how to allocate security investments because it helps ensure that the appropriate bases are covered. But let's take another step back and use the framework as a way to fundamentally rethink security. Rather than mapping existing technologies to each of the four functional areas, why not take each functional area as the starting point and develop a corresponding technology that relates to it?
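To make that mapping exercise concrete, here is a minimal sketch of a coverage check against the four functional areas. The inventory and the `coverage_gaps` helper are invented for illustration, not a recommendation of specific products:

```python
# The four functional areas of the adaptive security framework.
PHASES = ("prediction", "prevention", "detection", "response")

# Hypothetical inventory: each deployed technology tagged with the
# phase(s) of the threat lifecycle it covers.
inventory = {
    "vulnerability scanner": {"prediction"},
    "next-generation firewall": {"prevention", "detection"},
    "IDS/IPS": {"detection"},
}

def coverage_gaps(inventory: dict[str, set[str]]) -> set[str]:
    """Return the phases that no deployed technology covers."""
    covered = set().union(*inventory.values())
    return set(PHASES) - covered

gaps = coverage_gaps(inventory)  # → {"response"}: no post-breach capability
```

The same mapping also surfaces the opposite problem -- several overlapping products piled onto one phase while another phase goes uncovered.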

This approach would not only help ensure that you are adequately covered, but would also position your solutions for different functional areas to interoperate more effectively. You can leverage work done to cover one functional area to develop solutions for another. For example, each area is fundamentally about data science: in each case you gather data, process it, analyze it, and derive actionable insights from it. With that mindset, security fundamentally becomes a data science problem, and you can imagine a family of security solutions built on top of a common data science platform.
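Under this data-science framing, each functional area follows the same gather/process/analyze/act loop. A toy sketch of that loop for detection (the measurements and the three-sigma threshold are invented for illustration):

```python
from statistics import mean, stdev

# Gather: hypothetical outbound-bytes-per-minute samples for one host.
observations = [120, 130, 125, 118, 122, 127, 121, 950]

def analyze(samples: list[float], z_threshold: float = 3.0) -> list[float]:
    """Flag samples more than z_threshold standard deviations from the
    mean of the historical baseline (all samples before the newest one)."""
    baseline = samples[:-1]          # process: establish normal behavior
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in samples if abs(x - mu) > z_threshold * sigma]

# Analyze / derive an insight: the 950-byte spike stands out.
anomalies = analyze(observations)
```

The point is not the statistics, which are deliberately simplistic, but that the same pipeline shape serves prediction (risk scoring), detection (anomaly flagging), and response (post-incident queries) once the data is collected on a common platform.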

Organizations should start thinking along these lines to build a comprehensive security strategy. My hope is that the entire industry will adopt this approach when developing new technologies.

Cybersecurity is a highly dynamic field. As new technologies and buzzwords are introduced into the IT lexicon, we need to think about the implications they will have for information security. Using Gartner's Adaptive Security Architecture framework, we can develop better, faster, more efficient, and ultimately more intelligent tools to safeguard our most sensitive information assets.


Zulfikar Ramzan is CTO of Elastica, a firm taking a data science approach to cloud application security. Prior to joining Elastica, Zulfikar was chief scientist at Sourcefire (acquired by Cisco). Prior to joining Sourcefire, Zulfikar was technical director of Symantec's ... View Full Bio
Comments
Broadway0474, User Rank: Ninja
8/25/2014 | 9:45:57 PM
Re: Security, a process on a timeline
Zulfikar, I really like the security camera analogy. That really let me wrap my head around it. Thanks!
Zulfikar_Ramzan, User Rank: Apprentice
8/22/2014 | 5:35:47 PM
Re: Security, a process on a timeline
Broadway0474, it can definitely be unsettling to acquiesce to the idea that, despite our best efforts, threats will get through. The goal is not necessarily to passively accept it, but to do two things. First, put measures in place so we can understand what happened when something gets through. Second, armed with that knowledge, rethink our basic defenses. I liken it to a security camera. A building may have existing defenses in place: locks, burglar alarms, motion sensors, etc. A security camera cannot inherently prevent a break-in (except to the extent that it acts as a deterrent in the physical world). But with a security camera in place, one can review footage and quickly figure out what happened during a break-in. Many organizations ignore this after phase, so they are never able to respond to existing breaches (allowing those breaches to do far more damage than they otherwise would). Moreover, they don't gain insight into how to put up better defenses (allowing the bad guys to exploit the same holes over and over again). I hope this helps clarify my position.
Charlie Babcock, User Rank: Author
8/22/2014 | 5:05:43 PM
No, not defeatist
Broadway0474: Disagree on "defeatist." It would be defeatist if that were the only thing you were resolved to do. If, on the other hand, you have numerous defenses in place and an attacker still gets through, it's wise to analyze afterwards and see what could be done better. Much better to conduct forensics and try to prevent the next breakthrough than to do nothing because it might be labeled the wrong approach, or even "defeatist," after the fact.
Broadway0474, User Rank: Ninja
8/21/2014 | 11:07:08 PM
Re: Security, a process on a timeline
I think I got it, Zulfikar. The "after" phase is more about having the systems in place to gather information and understand what went wrong. It still is a bit demoralizing to think about, though -- defeatist, even.
Zulfikar_Ramzan, User Rank: Apprentice
8/21/2014 | 6:16:58 PM
Re: What about the hypervisor?
Charlie, I think the hypervisor question is definitely relevant in the context of cloud. You will still want visibility at that level, but there may be different ways to achieve it. For example, VMware's vShield API allows you to achieve VM-level visibility from the host system itself (and many virtualization-friendly anti-malware technologies leverage this capability). One nice aspect of virtualization is that remediating threats is much simpler, since you can revert to a clean image or at least a clean snapshot. Of course, many caveats apply here since you might lose data, etc. There are also risks, certainly, of malware piercing a VM and compromising the host system (or spreading among virtual instances on a given host). These situations don't occur often, but they are always a theoretical concern (especially in targeted corporate-espionage-type situations).
Charlie Babcock, User Rank: Author
8/20/2014 | 6:45:27 PM
What about the hypervisor?
Zulfikar, do you think it's logical or practical to include more analysis of what's going on at the virtual machine hypervisor level? Is that a good inspection point? Or is it too late if you spot trouble there?
Zulfikar_Ramzan, User Rank: Apprentice
8/20/2014 | 6:21:06 PM
Re: Security, a process on a timeline
One approach to take, Charlie, if you are developing an organization's IT security strategy, would be to first inventory what technologies you have in place today and then map that to the framework -- being careful to identify all the assets that you are trying to protect as well. Then you can begin to see whether you have any capability gaps (or areas where you've overinvested in a particular capability). You can also start to see whether technologies that cover one part of the framework can potentially "play nice" with other technologies.
Zulfikar_Ramzan, User Rank: Apprentice
8/20/2014 | 6:17:44 PM
Re: Security, a process on a timeline
Great question, Broadway0474 -- definitely one that keeps me up at night! I think applying lessons is the only hope we have of keeping the bad guys at bay. But aside from that, the reason for emphasizing the after phase is that in many cases you won't be able to prevent a breach from occurring -- but what you can do in that situation is have the plumbing in place to respond to the breach effectively. Far too often, organizations spend weeks to months investigating a single incident. The questions they have to deal with, however, are fairly simple to ask (but can be challenging to answer in the aftermath of a breach). Continuous monitoring offers a way to short-circuit that time. If we can reduce weeks to hours, then we achieve a lot.
Zulfikar_Ramzan, User Rank: Apprentice
8/20/2014 | 6:12:46 PM
Re: Security, a process on a timeline
Good point, Stratustician. I think many people lose sight of the fact that we care first and foremost about protecting information from threats to that information. The choice of approach should come after. One area that I didn't get into in the article, but which I believe is important in this regard, is having a corporate culture that places the appropriate amount of importance on information security. This can make developing and implementing a security strategy much easier.
Zulfikar_Ramzan, User Rank: Apprentice
8/20/2014 | 6:08:25 PM
Re: Security, a process on a timeline
Thanks for your comment, MDMConsult14! I definitely agree. I actually talked about this framework at a recent academic venue on security analytics to help promote the idea that we should be thinking about security solutions from the perspective of the entire threat spectrum.