As detailed in a recent column by InformationWeek's John Foley, the federal government has a data center addiction. Despite ongoing chatter about virtualization, cloud computing, consolidation, and shared-services models, the number of federal government data centers has grown over the past decade from almost 500 to approximately 1,200, with more on the way.
Having led the U.S. Postal Service's efforts to consolidate its data centers -- going from 10 facilities to two -- I know first-hand the significant challenges federal CIO Vivek Kundra faces in breaking this cycle. However, I can also assure him that the potential benefits -- cost savings in facilities, networking, security, electricity, and staffing, plus a capital refresh every four years -- make this a battle worth fighting. Consolidation was a key element in our strategy to reduce annual IT spending at USPS by more than $500 million.
The challenge facing Kundra is the highly decentralized way in which data center decisions are made. For example, I never needed to go beyond my own agency to open or expand a facility. Instead, each federal department or agency has historically justified its internal decision to develop an additional facility on the basis of its "specialized" or "unique" business requirements. Typically, these might relate to information security, the need to maintain continuity of operations, or a desire to have direct access to a specific resource or user community. In other cases, a data center strategy simply evolved with little long-term planning as additional facilities were quickly procured as existing centers reached capacity. Empire building and pork barrel politics may have played a role as well.
However, with the technology services that are available on-demand today, it’s hard to see how these arguments make sense anymore. Simply put, relying on a network of 1,200 disparate data centers, each designed and provisioned differently, adds unacceptable cost and risk to our federal IT operations. Furthermore, it's at odds with our efforts to be more energy efficient and green in general. It would certainly be a lost opportunity to have created the role of the federal CIO and not empower him with the ability to make the meaningful changes that we know are required.
With the increasing standardization of IT, the time has come to pursue the more rational and sustainable data center strategy that Kundra is advocating. A century ago, users recognized the diminishing returns of trying to provide their own electricity. The same is increasingly true today regarding raw computing power. Instead of managing hardware that quickly becomes a commodity and a depreciating asset, CIOs should focus their efforts on leveraging information, enhancing business processes, and utilizing differentiated technologies to improve their organization's performance.
Outsourcing the nation's IT infrastructure would be the mother of all procurements, but it can be done, and with competition come significant cost savings for the government, as well as the ability to spur new innovation. For those who cite security concerns, consider the sensitive information already being carried across leased telco lines. Furthermore, contractors are already responsible for maintaining many of our key government systems.
Let me also be clear that we need to go beyond application outsourcing. We need to take advantage of emerging cloud computing services to provide government with the additional flexibility and extensibility that it needs. Consider the challenges that agencies regularly face, such as the IRS's need to ramp up for tax season or the Postal Service's requirements for additional resources to handle the Christmas rush. In this new environment, what I'm calling the enterprise computing-on-demand era, these requirements won't be met with bulldozers and cranes, but with the click of a mouse.
Some may point to the occasional outage of consumer or other services to argue that government IT is too critical to outsource. I argue the opposite. These "lite" services often lack meaningful service level agreements and related investments in redundancy and failover, and they're often free. Yet Gmail still seems to outperform many agencies' Lotus Notes or Microsoft Exchange implementations in terms of reliability. Imagine what would happen if we made a real commitment to this paradigm and backed it with enforceable SLAs.
In terms of making this vision a reality, what’s needed is leadership from the Office of Management and Budget. Currently, investments in major applications and IT systems are vetted for their appropriateness (e.g., do they fulfill or advance a specific mission?) and viability (e.g., are the program management elements in place to ensure success?). We need to provide similar scrutiny in terms of how departments and agencies build, expand, procure, and manage data center resources. This is an issue that is too often buried within other requisitions.
As an initial goal, I propose that each department and agency be required to show in 2010 how they would leverage a shared services or enterprise computing-on-demand model to meet 50% of their data center requirements within five years. While some exceptions for specialized requirements may be justified, it's fair to say that half of the core operations and processes of most agencies are fairly generic. Coupled with advances in productivity and efficiency, I believe it's feasible to reduce the 1,200 government-owned data centers to 250 over the next 10 years.
How would we ensure success? We need to establish clear metrics and expectations for progress and combine them with transparency so that these efforts can be continually assessed. Furthermore, we need to ensure that performance is tied directly to CIOs' annual assessments and that they're rewarded for success. Net/net: CIOs should be focused on strategic business initiatives impacting their organizations, not on the day-to-day performance of server farms.
What will make this approach successful is a focus on outcomes and results and not merely on processes. Rather than trying to prescribe the specific approach that departments and agencies should take, we need to simply ensure that there are real consequences for not achieving these objectives.
Given the opportunity, I'm confident that government IT pros will capitalize on this moment to transform how IT meets their organizations' computing requirements. In doing so, we will all benefit from greater alignment with agency missions, the standardization of key requirements, and cost savings realized through greater operational efficiency and reduced capital spending.
Robert L. Otto, executive VP of advisory services with Agilex Technologies, was CIO and CTO of the U.S. Postal Service from 2001 through 2007.