Government Cloud Use Hits Inflection Point

New standards, security, and architectures mean the Cloud First stars are finally coming into alignment.

Michael Biddick, CEO, Fusion PPT

May 30, 2014


Download InformationWeek Government's June Tech Digest, distributed in an all-digital format (registration required).

Sometimes we're our own worst enemies. When White House officials announced the Cloud First mandate in 2010, it created big expectations. Cloud would help rein in the $80 billion (and growing) federal IT budget while delivering efficiency and reuse and off-loading repetitive tasks from federal staff. Cloud vendors leaped into action, spending millions developing new offerings.

Only a few agencies grabbed the ball and ran. The General Services Administration is a prime example, having moved email and other apps to the cloud, and it has reaped rewards. The National Oceanic and Atmospheric Administration, the Department of Agriculture, and most recently the Interior Department also have forged ahead to the cloud, especially for public-facing websites and data. The Federal Energy Regulatory Commission is modernizing a decade-old eLibrary application by moving it to the cloud.

But too many federal IT teams resisted and, in the process, shot themselves in the foot as budgets got tighter and working conditions more strained. We're not surprised that about half of the 532 federal government IT professionals responding to InformationWeek's 2014 US IT Salary Survey are looking for new jobs in the wake of a three-year salary freeze. Maybe if cloud adoption hadn't been confined to the low-hanging fruit of email and similarly easy-to-convert systems, CIOs could have freed up money to invest in human capital.

It's not all the agency leaders' fault. They've been handicapped by slow progress in acquisition reform, thorny legal issues surrounding data ownership, and privacy concerns. As happens so often in Washington, a daunting list of policy challenges, armies of lawyers, and stifling bureaucracy beat sparks of progress into submission. 

However, while cloud standards are by no means complete, there's enough progress that this is a great time to start or revive a cloud project. As they do so, agency leaders must address the biggest obstacles to cloud implementation: a jumbled vendor landscape, the risk that "hybrid" clouds might just create new information silos, and the fear of getting locked in or surrendering control.

Crowded vendor landscape
The federal cloud computing market is fiercely competitive, with significant IT infrastructure capacity chasing relatively few federal buyers. We see two primary types of cloud vendors: call them contemporary cloud providers and traditional government contractors.

In the contemporary cloud provider category, think Amazon, Microsoft, and Google. They emerged out of the commercial market and still mainly target the private sector, but they're retrofitting their environments to meet federal security frameworks. Their strengths tend to be granular pay-for-use, immediate availability of resources, and rock-bottom commodity pricing. It's easy to grab a credit card, sign up for a service, and go. Unfortunately, this pay-and-play experience doesn't transfer to the federal market because of governance, acquisition, and security requirements. 

The second group, which emerged from the traditional pool of federal managed service providers, includes companies that typically custom build systems against detailed specs with strict physical and logical security zones. Lockheed Martin, CGI, American Systems, Hewlett-Packard, and IBM fall into this group; they might offer as a cloud service something that incorporates only some aspects of cloud computing, such as virtualization. Don't expect pricing wars here. These providers do, however, understand their customer base, and they're working to make their offerings more like commercial contemporary cloud environments in order to compete.

Amazon Web Services is often held up as the gold standard for infrastructure-as-a-service features and capabilities. Microsoft and Google are fiercely battling for the office-automation software-as-a-service market. Meanwhile, traditional hardware vendors including NetApp, Dell, EMC, IBM, and HP are advocating that agencies simply modernize their existing data centers, making them more "cloud-like," and avoid the security and control concerns surrounding public cloud altogether.

And then there's a third category: the government provider. Agencies such as the Defense Information Systems Agency, the Navy's SPAWAR command, and the Treasury and Health and Human Services departments are making their cloud services available to other departments and might strike interagency agreements to bypass complex procurement challenges. Most haven't figured out how to bill based on usage, but that will come in time. Sharing is the future -- the Office of Management and Budget's Digital Government Strategy demands that agencies function more like data service providers.

But most of the 155 federal government technology professionals responding to our InformationWeek 2013 Federal Government IT Priorities Survey didn't get the memo. Cyber security and disaster recovery lead the list of 32 priorities, interagency collaboration lands in 13th place, and cloud is way down at No. 21.

Federal IT managers have to forge smart partnerships, and there are no easy answers. Each agency and department needs to develop a strategy based on its applications, data, and infrastructure requirements. Generally, the public cloud makes sense for commodity IT resources and applications. For sensitive and data-driven applications, agencies might need to rely on on-premises data centers, making sure those centers are highly efficient and optimized to take advantage of larger pools of IT resources.

To read the rest of this story, download InformationWeek Government's June Tech Digest, distributed in an all-digital format (registration required).

About the Author(s)

Michael Biddick

CEO, Fusion PPT

As CEO of Fusion PPT, Michael Biddick is responsible for overall quality and innovation. Over the past 15 years, he has worked with hundreds of government and international commercial organizations, combining deep technology experience with business and information management acumen to help clients reduce costs, increase transparency, and speed decision making while maintaining quality. Prior to joining Fusion PPT, Michael spent 10 years developing enterprise management solutions with a boutique consulting firm and with Booz Allen Hamilton. He previously served on the academic staff of the University of Wisconsin Law School as director of information technology. Michael earned a Master of Science from Johns Hopkins University and a dual bachelor's degree in political science and history from the University of Wisconsin-Madison. He is also a contributing editor at InformationWeek Magazine and Network Computing Magazine and has published more than 50 articles on cloud computing, federal CIO strategy, PMOs, and application performance optimization. He holds multiple vendor technical certifications and is a certified ITIL v3 Expert.
