In Depth: In Slaying Legacy IT Costs, The Data Center Is A Good Place To Start
Companies typically spend about 80% of their IT budgets on maintenance--way too much on keeping the lights on, not enough on what really matters. Attacking data center costs is one way to shift more of that spending to new projects and get on the innovation stick.
Is your IT department a center of innovation? Or is it a cost center or, worse, a money pit that requires a growing budget just to keep the lights on and the hard drives spinning?
The answer may hinge on whether you've cracked the 80-20 conundrum of IT spending--that around 80% of the typical IT budget is spent on management, operations, and maintenance, and only 20% goes to new technology and innovation. For companies serious about improving that ratio, one of the first places to look is the data center.
The business need is obvious. When $8 out of every $10 goes to routine, day-to-day operations, it's easy to understand why many businesses view IT as a support service and not an innovation engine. IT managers who aren't actively working to change their spending patterns shouldn't be surprised when they're not invited to C-level strategy sessions and instead are relegated to the kiddie table.
The tactics to change that 80-20 ratio in the data center are well known: inventory, standardize, consolidate, virtualize, automate, and enforce measurement-driven best practices. But they're hard to accomplish, and there haven't been many good technology tools to help--until now. Most of the major systems management vendors, including BMC Software, CA, Hewlett-Packard, IBM, and Opsware, offer products that reduce the drudge work that consumes a lot of IT staff time and much of the IT budget.
Many of these management systems have tools that automatically inventory IT assets and map them to critical applications and business processes. Change or configuration management databases, for example, track each server and PC and their applications, configurations, access rights, and other key characteristics. If a problem occurs, say with a software update or security patch, the database can be used to roll back a server or PC to an earlier and stable configuration. Once a company settles on a standard configuration for servers and PCs, these databases also can be used to quickly set up and configure dozens of computers, saving the IT staff from doing it manually.
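The mechanics behind that rollback-and-provision workflow can be sketched in a few lines. The class and method names below are illustrative assumptions, not any vendor's actual API: the database snapshots each machine's configuration before a change, restores the last known-good snapshot on rollback, and stamps a standard "golden" configuration onto many machines at once.

```python
from copy import deepcopy

class ConfigDB:
    """Toy change-management database (illustrative only -- real CMDB
    products expose far richer data models and integrations)."""

    def __init__(self):
        self.current = {}   # hostname -> current config dict
        self.history = {}   # hostname -> stack of earlier configs

    def apply_change(self, host, new_config):
        # Snapshot the existing configuration before overwriting it.
        if host in self.current:
            self.history.setdefault(host, []).append(deepcopy(self.current[host]))
        self.current[host] = deepcopy(new_config)

    def rollback(self, host):
        # Restore the most recent known-good configuration.
        self.current[host] = self.history[host].pop()
        return self.current[host]

    def provision(self, hosts, golden_config):
        # Stamp a standard ("golden") configuration onto many machines at once,
        # sparing the staff from configuring each one by hand.
        for host in hosts:
            self.apply_change(host, golden_config)

db = ConfigDB()
db.provision(["web01", "web02"], {"os": "RHEL 4", "patch": "U4"})
db.apply_change("web01", {"os": "RHEL 4", "patch": "U5"})  # a faulty patch
print(db.rollback("web01"))  # web01 is back on the U4 baseline
```

The same snapshot-before-change discipline is what lets a real configuration database back a server out of a bad software update or security patch.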
Companies have been able to create virtual servers for years, but new management tools make it far more practical even for some very small businesses to virtualize computing and storage resources and dynamically allocate them as needed, so financial systems get more horsepower at the end of a quarter and retail systems get more juice during the back-to-school selling season. They can automatically enforce security policies and monitor network and application access. And they can let you know when and where you need to add more resources by doing trend analysis of usage rates. The goal is not only to cut IT infrastructure costs but also to provide guaranteed service levels to the entire business.
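The trend analysis these tools perform can be as simple as fitting a line to historical utilization samples and projecting when the resource crosses a planning threshold. This is a minimal sketch under that assumption; production tools use far more sophisticated forecasting:

```python
# Fit a least-squares line to weekly utilization samples and project
# how many weeks remain until the trend reaches a planning threshold.

def weeks_until_threshold(samples, threshold=80.0):
    """samples: utilization % per week, oldest first.
    Returns projected weeks past the last sample until the fitted
    trend hits `threshold`, or None if usage is flat or falling."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    if slope <= 0:
        return None  # no growth trend -- no capacity alarm
    crossing = (threshold - intercept) / slope   # week index at threshold
    return max(0.0, crossing - (n - 1))          # weeks beyond last sample

# A storage pool growing about 2 points a week from 50%:
print(weeks_until_threshold([50, 52, 54, 56, 58]))  # 11.0 weeks to hit 80%
```

Feeding a projection like this back into planning is what turns raw monitoring data into a capacity decision, and ultimately into a guaranteed service level.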
No space, no power--no problem for Shimmin.
Many companies are consolidating and virtualizing their servers simply because they've run out of space as they've followed the old one-server, one-application approach. Michael Shimmin, data center coordinator for University Health Care at the University of Utah, says 80-20 is a pretty good estimate of how his IT budget is divided between maintenance and new projects. But it would be much worse if the IT department hadn't consolidated around HP blade servers and virtualization software from VMware.
The university's health care IT department a year ago faced multiple problems in its data center, a 4,000-square-foot facility that serves the university hospital and clinics, five regional hospitals, and two dozen outpatient clinics. It had no room to grow, and its electrical system couldn't handle much more capacity, yet it needed 80 new rack-mounted servers. Shimmin knew the building couldn't handle that, and it would have taken five years just to secure the funding to build a new consolidated data center.
Shimmin developed a plan with HP, VMware, and IT reseller Avnet to move the data center to blade servers powered by dual-core processors. The university funded an update of the facility's electrical system to allow more power. And Shimmin began consolidating around HP blades, which let him pack more servers in a smaller space. He also used VMware software to create about 20 virtual servers on each physical server.
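The arithmetic behind that consolidation is striking. Using the figures in the article--80 planned servers, roughly 20 virtual servers per physical blade--the same workloads fit on a handful of blades:

```python
import math

# Back-of-the-envelope consolidation math from the article's figures.
workloads = 80       # planned one-app-per-server machines
vms_per_blade = 20   # approximate virtual servers per physical blade

blades_needed = math.ceil(workloads / vms_per_blade)
print(blades_needed)  # 4 physical blades in place of 80 rack servers
```

A 20:1 ratio is on the high end; the right number depends on how heavily loaded each workload is, which is why usage monitoring matters.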
The department also bought HP's OpenView Performance Insight, a central console that lets the IT department monitor usage rates on each physical server to identify the best candidates for virtualization.
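The selection step that a console like this supports can be sketched simply: given average utilization per server, flag the lightly loaded machines as consolidation candidates. The hostnames and the 30% cutoff below are assumptions for illustration, not figures from the university:

```python
# Flag lightly loaded servers as virtualization candidates, least-used first.

def virtualization_candidates(avg_cpu_pct, cutoff=30.0):
    """avg_cpu_pct: hostname -> average CPU %. Returns hosts under
    `cutoff`, sorted so the least-used servers come first."""
    idle = [(pct, host) for host, pct in avg_cpu_pct.items() if pct < cutoff]
    return [host for pct, host in sorted(idle)]

usage = {"db01": 72.5, "app01": 12.0, "app02": 8.5, "web01": 25.0}
print(virtualization_candidates(usage))  # ['app02', 'app01', 'web01']
```

Busy machines like the hypothetical db01 stay physical; the near-idle ones are the payoff cases for packing 20 to a blade.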
There's more work to do. Shimmin's investigating new cooling options from American Power Conversion, Liebert, and other vendors to handle the heat generated by the densely packed blade servers, as the department tries to keep this data center in operation until the university funds a major overhaul.