Local, state, and federal agencies are implementing new technologies and tactics in their pursuit of data center consolidation.
Government IT pros looking to achieve streamlined data center operations should consider what's going on in the private sector. Technologies such as server virtualization and data deduplication have moved into mainstream use, along with high-density hardware and innovative cooling schemes. Other technologies, like solid-state drives and fast, converged network protocols such as Fibre Channel over Ethernet and 10 Gigabit Ethernet, remain on the distant radar for some, but changes are coming.
For many, the first step is standardization. "Implementing standard applications across the agency reduces application and server footprints," says Larry Grossman, program director for IT infrastructure optimization at the Federal Aviation Administration's office of the CIO. In response to the Federal Data Center Consolidation Initiative, the FAA is working on physical data center consolidation as well as centralizing applications and servers in a few locations.
Updating an aging infrastructure begins with establishing a technical baseline by identifying common technology platforms, says Imran Chaudhry, CIO for the District of Columbia Criminal Justice Coordinating Council. "We're upgrading to make sure we don't have a mishmash of hardware systems and databases," says Chaudhry, whose agency facilitates the coordination of information exchange between all D.C. criminal justice and law enforcement entities.
Jim Landers, operations branch chief of the IT services office at the Centers for Disease Control, says his agency has combined 34 server rooms in multiple buildings into three data centers over the past six or so years, with a focus on consolidating commonly used services. "People generally need Web services of some sort, and a staging environment," says Landers.
The state of Delaware had myriad systems, requirements, and skill sets, says Douglas Lilly, lead telecom technologist in the state's Department of Technology and Information. An initial push toward a centrally managed network in the late '90s was the state's first foray into consolidation. The IT team decided to move separate e-mail domains and DNS and other services into one location. "It proved to be a good model," says Lilly. "Now everything's managed by a handful of people."
Runaway storage volumes must be factored into any consolidation effort. "Replacing enclaves with shared storage across application environments is one of the biggest areas of low-hanging fruit," says Michael Biddick, CTO of IT consultancy Fusion PPT. Moving to a networked storage environment could mean a storage area network, clustered network-attached storage, or a virtualized storage setup--anything that allows a common pool of storage that can be allocated across applications.
The D.C. Criminal Justice Coordinating Council is in the process of implementing a Fibre Channel SAN. "We've got data residing in different locations, on different hardware," Chaudhry says. "We're working to employ a uniform standard. SAN technology affords us that."
Before the move to consolidate and centralize storage across agencies in the state of Delaware, divisions would go out and buy their own gear. The IT department decided to centrally manage storage using primarily high-powered Fibre Channel SANs, then offer storage as a service to agencies.
Virtualization is key to trimming server bloat. Without it, "you're either going to have a warehouse full of disk arrays or a warehouse full of servers, and neither is acceptable," says Mark Stewart, enterprise storage and SAN manager at HQ Air Force Personnel Center at Randolph AFB in Universal City, Texas. Virtualization was a huge part of a consolidation at the center, which moved 350 data centers into one mega-facility at Randolph. Stewart and his team manage about 760 TB of data there.
Delaware's consolidation project has been driven in large part by blade servers, which, when combined with virtualization, create denser data centers. Lilly estimates that the agency cleared out one-third of its data center when it consolidated servers onto Dell blades and virtualized.
Develop A Road Map
Agencies looking to start fulfilling the Federal Data Center Consolidation Initiative mandate should follow the directive's process--conduct an inventory of physical assets and applications, develop a consolidation road map that identifies areas where server virtualization or cloud computing may aid optimization, and implement ongoing monitoring. In addition to that, we'd add four recommendations:
>> Evaluate how deduplication technology can help. The Air Force's Stewart chose a deduplication system from Data Domain (owned by EMC) because he wanted in-line dedupe that could concurrently write the data to a disaster recovery site. In-line deduplication uses Intel processor cores to do the compressing and fingerprinting before the data is written to disk.
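The core idea behind in-line dedupe is simple: fingerprint each block of incoming data before it reaches disk, and store only blocks whose fingerprint hasn't been seen before. The sketch below illustrates that technique with fixed-size blocks and SHA-256 hashes; it is a simplified illustration of the general approach, not a representation of Data Domain's actual implementation (which uses variable-size chunking and its own on-disk format).

```python
import hashlib

def dedupe_write(data, store, block_size=4096):
    """In-line dedup sketch: fingerprint each block before it would hit
    disk; add only blocks whose hash has not been seen to the store."""
    recipe = []  # ordered fingerprints needed to reconstruct the stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        fp = hashlib.sha256(block).hexdigest()
        if fp not in store:        # new block: store it once
            store[fp] = block
        recipe.append(fp)          # duplicate: record only the pointer
    return recipe

def restore(recipe, store):
    """Rebuild the original stream from its fingerprint recipe."""
    return b"".join(store[fp] for fp in recipe)
```

Because duplicate blocks cost only a fingerprint entry, a stream with heavy repetition (typical of backup workloads) shrinks dramatically before any bytes reach disk, and the same recipe can be shipped to a disaster recovery site that shares the block store.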
>> Use the right disks for the job. For server consolidation and virtualization, solid-state disks have their place, says Stewart, despite the price premium. The Air Force Personnel Center has SSDs in all of its virtualized servers.
>> Connectivity is key in a consolidated environment. The CDC's Landers points out that the physical infrastructure for a virtual platform needs to have higher bandwidth to support higher numbers of virtual servers. A network upgrade may make sense to connect outlying offices. The CDC is also considering WAN acceleration to improve the experience for users in remote offices.
>> Consider the cloud. Consolidation projects are hard work and leave bigger, denser, and more complex data centers in their wake. Government tech leaders are eyeing cloud computing as a way of reducing IT footprints while diluting some of the concentrated risk that a single center carries.