Google's Urs Hoelzle: Cloud Will Soon Be More Secure
News | 4/30/2015 12:57 PM

Google's chief data center architect, Urs Hoelzle, says cloud security will improve faster than enterprise security in the next few years.


Google has pioneered key features of cloud computing, including chiller-less data centers, broader use of Linux containers, and Bigtable, the big data system that was the forerunner of NoSQL systems. But the company is not resting on its laurels: Urs Hoelzle, Google's senior vice president of technical infrastructure, said, "All the innovations that have happened so far [are] just a start."

Hoelzle made that pronouncement during the morning keynote address to Interop attendees at Mandalay Bay in Las Vegas on Wednesday, April 29.

He predicted that the two areas showing the greatest innovation over the next five years will be cloud security and container use.

Cloud security will soon be recognized as better than enterprise data security because the cloud, by design, "is a more homogeneous environment," he said. That means IT security experts are trying to protect one type of system, replicated hundreds or thousands of times, as opposed to a variety of systems in a variety of states of update and configuration.
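
To make the homogeneity point concrete, here is a minimal sketch in Python (hypothetical hosts and configuration keys, not any real fleet) of why one replicated system type is easier to audit: every node is compared against a single golden configuration, so one check catches drift anywhere.

# Hypothetical sketch: auditing a homogeneous fleet against one golden config.
GOLDEN_CONFIG = {"os_image": "v2.7.1", "tls": "1.2", "agent": "3.4.0"}

fleet = [
    {"host": "node-001", "os_image": "v2.7.1", "tls": "1.2", "agent": "3.4.0"},
    {"host": "node-002", "os_image": "v2.6.9", "tls": "1.2", "agent": "3.4.0"},
]

def audit(node):
    """Return the config keys where a node drifts from the golden config."""
    return {k for k, v in GOLDEN_CONFIG.items() if node.get(k) != v}

for node in fleet:
    drift = audit(node)
    if drift:
        print(f"{node['host']} drifted on: {sorted(drift)}")

In a heterogeneous enterprise fleet there is no single golden configuration to compare against, which is exactly the gap Hoelzle describes.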

Google's Urs Hoelzle

In contrast, where one complex system has many different types of interactions with another complex system, "little holes appear" that are hard for security experts to anticipate in every case.

Hoelzle said that on-the-fly encryption and scanning systems trained to look for threats and intruders are already in place, and will be extended over the next few years in Google's cloud operations.
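
As an illustration of what encrypting data on the fly means at the code level, here is a small sketch using the open-source Python cryptography package; it demonstrates the concept only and is not a description of Google's internal systems.

# Concept sketch: data is encrypted in-process before it ever touches
# a disk or a network socket. Key management is elided here; in practice
# keys would live in a dedicated key-management service.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer-record: account=12345"
token = cipher.encrypt(record)      # only ciphertext leaves the process
assert cipher.decrypt(token) == record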

In an interview afterward, he said the mapping of systems -- so that a cloud data center security system knows which application talks to which application, which policies govern them, who can access what data, and so on -- will give security experts an auditable tool with which to maintain defense in depth. "You only have to get it right once and it's right every time," Hoelzle observed.
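
What such an auditable mapping might look like in miniature: the sketch below (hypothetical service names and policy format, not Google's actual tooling) declares once which service may talk to which, then checks every observed flow against that single source of truth -- the "get it right once" property Hoelzle describes.

# Hypothetical sketch: a declared service-to-service policy map,
# audited against observed traffic.
ALLOWED_FLOWS = {
    ("web-frontend", "orders-api"),
    ("orders-api", "orders-db"),
    ("orders-api", "payments-api"),
}

observed_flows = [
    ("web-frontend", "orders-api"),
    ("web-frontend", "orders-db"),   # bypasses the API layer: a violation
]

for src, dst in observed_flows:
    verdict = "ok" if (src, dst) in ALLOWED_FLOWS else "VIOLATION"
    print(f"{src} -> {dst}: {verdict}")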

[Want to learn more about the Google Cloud Platform? See Google Turns Up The Heat On Amazon.]

In addition, for cloud users, software changes in cloud systems occur behind APIs, so there's no fresh software at the surface in which an attacker might detect and exploit a vulnerability. "There's no mistake on installation" that a hacker can see when the software sits behind an API, Hoelzle said.
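
A toy sketch of the point about APIs (illustrative names only): callers depend on a stable interface, so the implementation behind it can be patched or replaced without exposing any new, fingerprintable surface to outsiders.

# Hypothetical sketch: the client-visible surface never changes,
# even when the implementation behind it is swapped or patched.
class StorageAPI:
    def get(self, key: str) -> bytes:
        raise NotImplementedError

class EngineV1(StorageAPI):
    def get(self, key: str) -> bytes:
        return b"served by v1"

class EngineV2(StorageAPI):          # patched internals, same surface
    def get(self, key: str) -> bytes:
        return b"served by v2"

backend: StorageAPI = EngineV2()     # clients still just call .get()
print(backend.get("object-1"))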

"We run a large cloud that gets attacked every day," he said. After 15 years in which the company has

Continued on next page.

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive ...

Comments
MetricStream, User Rank: Apprentice
5/26/2015 | 10:27:41 AM
Vibhav Agarwal, Senior Manager of Product Marketing at MetricStream
One of the most interesting points that stands out to me is the fact that Mr. Hoelzle says that soon enterprise systems may more easily be in compliance with data regulations if they're running on the cloud. We know that transitioning to the cloud is becoming the new normal. Instead of enterprises continuously trying to re-build their Governance, Risk and Compliance (GRC) programs to keep pace with updated regulations and emerging cloud service models and technologies, they should adopt a layering approach, which allows organizations to create a single source of GRC truth across the enterprise. 
Charlie Babcock, User Rank: Author
5/5/2015 | 9:22:28 PM
A uniform environment easier to protect
GAProgrammer, you have a point that hackers have only one environment to decipher, but I think the notion that a confusing environment is also a protected environment has been discredited. Better to know what you've got and take the best measures to shield it than to have intruders slipping in through the side door.
GAProgrammer, User Rank: Ninja
5/4/2015 | 10:25:32 AM
The homogeneous nature is also the Achilles' heel
as in, hackers only have one target to focus on. Once they break that, they have the keys to the entire kingdom. Except now they don't have control of just one system (instance) - they have access to the computing power of hundreds or thousands of instances. Imagine the havoc THAT could create.
Li Tan, User Rank: Ninja
5/2/2015 | 12:55:47 AM
Re: A leading Google thinker, writer
You should never trust any assurance that something is secure - as long as it's a service offered to the public or a group of people, it cannot really be secure all the time.
laredoflash, User Rank: Strategist
5/1/2015 | 9:06:38 PM
Re: A leading Google thinker, writer
The cloud will never be secure. Got that. The cloud is equal to another Wall Street meltdown. There is no way to secure something that everyone can see. Every couple of years we go through the same old BS. Wall Street with their meltdown, and IT with their black hole. No more lies please. Master Rod
Christian Bryant, User Rank: Strategist
5/1/2015 | 10:29:44 AM
Re: What if Target had been on Amazon?
I know - it's hard to argue for alternatives when, of your two choices, one has inherent architecture that allows for such detection (the cloud) and the other must be purposefully configured to monitor for events like the Target hack.

But we must have alternatives, I believe, as businesses. Perhaps it is too early to go there, and businesses would need to be completely responsible in implementing whatever that alternative is before industry would accept alternatives to the cloud as a sign of security compliance.

Yes, you're right - here and now, had Target been on a cloud service, this most likely would have ended much better than it did.  
nasimson, User Rank: Ninja
4/30/2015 | 10:57:15 PM
An aspect that is perceived as the weakest
Urs has been a thought leader and a practitioner at the cutting edge right from 2004, when he did his PhD at Stanford. So it's more than a decade of expertise.

What confidence he's inspiring in cloud security - an aspect that is perceived as the weakest.
Charlie Babcock, User Rank: Author
4/30/2015 | 5:51:54 PM
What if Target had been on Amazon?
Christian, your points are well taken, and things will play out as you describe in many enterprise data centers. But take Target, for example. Target doesn't believe in using the Amazon cloud, I just heard in the session that I'm attending at Cloud Connect/Interop. But what if Target, by some stretch of the imagination, had been operating on Amazon infrastructure? If it were, I suspect someone there would have noticed the unusual pattern of millions of credit card numbers being streamed out the door to an address in Russia.
Christian Bryant, User Rank: Strategist
4/30/2015 | 3:55:57 PM
The Cloud - All It's Cracked Up To Be?
I definitely agree that "cloud" security is moving at a fast pace, perhaps because it supports the brunt of user activity these days, whether that is collaborative development teams interacting with their source version control repositories or mobile apps. However, we have to be careful not to think there is only one technology platform on which to do cloud computing in order to maintain that pace - security should cross architecture boundaries as easily as the cyber criminals that attack them do. GNU/Linux, NoSQL, and even the hardware running cloud computing systems could all exit stage left when the next big thing comes along. True, containers are hot - smoking hot - and fun to build and deploy. But should container use be the only game in town toward entering the cloud?

The point of securing one type of system and then replicating that secured type hundreds or thousands of times sounds appealing at first. You could argue that it represents a "better" type of security than current enterprise data security. But taking that to the next logical step, someone out there _is_ going to find the Achilles' heel in container architecture - perhaps coming from an angle not previously considered, since the cloud as an overall ecosystem is more complex than just plugging in a secure container - and then you will effectively have done a great job lining up the dominoes. I don't know that it makes sense to do a one-to-one comparison between "the cloud" as a single complex system and any one of thousands of single complex systems in the form of various enterprise servers in that ecosystem. If only from having been alive a while, I believe these "little holes" can appear just as easily in the cloud as on any given complex system.

All that said (and apologies for sounding critical of Hoelzle - I'm a mere college dropout whose focus since the '90s has almost entirely been on GNU/Linux builds and Emacs Lisp deployments), once you start talking about "encryption on the fly" and "scanning systems" you are getting my head to nod. Data, and the network on which it is transported, is everything. Mapping your ecosystem (I assume within a simulation - complex ecosystems should never simply be documented without a live simulation that can be updated with real-time changes pulled from scanning results) should be "day 1" discussion material for a project and should be developed all the way through to production go-live.

In terms of exploits, however, to suggest that software changes that occur behind APIs must be invisible to cyber criminals is a tad optimistic. Using Google's stats to support this idea and the overall security of the Google architecture 1) doesn't help other companies who are developing their own cloud ecosystems and may want to diverge entirely from Google-style architecture, and 2) begs the question of what incidents have already occurred, and what exploits may exist that have not yet made their way to the boards and that we aren't currently aware of. How many technologies have presented with similar assurances and have later been opened up and shown to have some vulnerability?

As I said, I'm nobody - a worker bee in the bowels of a digital domain. But I know better than to ever take anything I put together for granted, based upon its architecture or that of the system and network it sits on. There is always a hole, and there is always someone who can find it, exploit it, and bring decades of work to the ground. To suggest a sensitive enterprise system can gain "compliance" simply by being migrated to the cloud is, well, reckless. No - I understand that this is all in the context of Google, and that whoever the auditor referenced in the keynote is was probably doing the simple math that formal auditing standards call for today, but that just doesn't sound like good security to me.

I will say that I agree that the end user should have the same transparent experience regardless of how the application they access is served up, or from where. And the "where" is why I feel slightly nervous in reading this - the network is everything. The age of the data center is slowly coming to a close, and distributed computing - the natural successor to the data center - is rapidly growing across the globe. Security must never be seen as a one-time step, especially at the application level, and the network should always be seen as a living thing, changing every second and at risk every second.

On another note, Urs Hölzle has talked at length about Google's adoption of the open-source networking technology OpenFlow in an article from Wired [0] - VERY, VERY COOL material and ideas.

[0] Steven Levy, "Going With the Flow: Google's Secret Switch to the Next Wave of Networking," Wired, April 2012.