Government Technologist: Vivek Kundra's Data Center Problem
The number of data centers belonging to the U.S. government has more than doubled in the past 10 years. Can Federal CIO Vivek Kundra put a stop to the trend?
The number of data centers belonging to the U.S. government has more than doubled in the past 10 years, an expensive and wasteful trend that Federal CIO Vivek Kundra says he wants to stop. At the same time, however, yet another massive, government-owned data center is going up in Utah.
Kundra's tenure as the United States' first CIO will be judged in part by his success in reversing course. The feds need fewer data centers, not more. Instead of spending billions of taxpayer dollars on huge capital projects, the feds need to turn to virtualization, server consolidation, cloud computing, and shared services, and consider outsourcing to technology vendors that specialize in data center operations.
Kundra has identified the government's love affair with mega IT projects as an outdated approach to today's data management challenges. The government now has 1,200 data centers, compared with 498 a decade ago. "We need to get away from this model of investing heavily in infrastructure," Kundra said at the InformationWeek 500 conference in September, a quote I've used before, but which bears repeating.
The gap between what needs to happen and what's actually happening came into sharp focus recently when Kundra called the government's data center proliferation "troubling," according to Federal News Radio. "It shows greater investment in infrastructure, yet it's more fragmented," he said.
As Kundra spoke, planning was underway at the National Security Agency for one of the biggest data centers on the planet. The construction project, 26 miles south of Salt Lake City, calls for 1.5 million square feet of building space on 120 acres. The NSA says the new data center, due for completion in 2013, will be used to provide intelligence and warnings related to cybersecurity threats.
Without knowing more of the details of NSA's requirements, it's hard for an outside observer to assess whether this particular data center project is absolutely necessary. However, one thing that jumps out is the cost of the project, with estimates ranging from $1.5 billion to $1.6 billion. As a point of comparison, Google budgets about $600 million for one of its state-of-the-art data centers.
Part of the problem with government-owned data centers is that they represent long-term fixed costs. Once the concrete dries and the buildings go up, good luck getting rid of them.
It doesn't have to be this way. U.S. Postal Service CIO Ross Philo told me in a recent interview of tentative plans to build a next-generation data center, in addition to the two that the Postal Service already operates. But Philo admitted that the Postal Service's financial circumstances (losses amid flat revenue) don't make the project feasible right now. Chalk one up for fiscal reality.
And it's possible to stop a data center project even after the bulldozers have cleared ground. In August, I drove out to a data center under construction by Amazon.com in Boardman, Ore., only to find that the project had been put on hold indefinitely (see for yourself). Reason: The economy.
Some federal agencies seem to be leaning in the right direction. Homeland Security, for instance, is looking to consolidate 24 data centers into two.
But putting the brakes on the 10-year trend of data center proliferation will be a test of Kundra's influence and effectiveness. Let's see how many data centers Uncle Sam has a year from now.
John Foley is editor of InformationWeek Government. Follow me on Twitter at @jfoley09 and let me know what you think about the federal government's data center strategy.
InformationWeek Analytics has published a guide to the Open Government Directive and what it means for federal CIOs. Download the report here (registration required).