With just 8% of respondents to our 2012 survey expecting to build new facilities, and constrained budgets ranked the No. 1 most impactful trend, it's clear enterprise IT's transformation into a service provider is in full swing.
Assessing the state of your data center by looking at the building is like judging a car by its paint--you can throw a classic flame job on a 2001 Honda Civic, but you'll still have just four cylinders under the hood. Instead, focus on the performance the business needs now and what it will require in the near future--remembering that gigabit-speed WANs and cloud services are now the norm and new colocation data centers let you rent the equivalent of a Shelby Mustang.
Every IT team needs to be skilled in weighing in-house data centers vs. various outsourcing options, including the cloud, and balancing the economies of standardized x86 servers against the performance of optimized systems. That's why this year we rebuilt our annual InformationWeek State of the Data Center Survey from the tires up to focus on application infrastructure rather than facility design.
Not that our 256 respondents, all of them involved with management or decision-making at companies with data centers of 1,000 square feet or larger, aren't still worried about their physical plants. Hardware packing more performance punch into every cubic inch has ratcheted up power and cooling demands to the point where simply renovating old facilities is often not an option. And big construction projects are harder than ever to justify, especially as the options for cloud infrastructure and colocating your gear in a third party's data center grow.
There's also a major operational shift afoot as the "customers" of enterprise data centers flex their muscles. Standardization is great, but who hasn't had a business unit specify that it absolutely, positively needs a software package that runs only on Sun SPARC or IBM Power? Or maybe the development team bought some specialized hardware and wants to move it into your data center.
This puts CIOs in a tough spot: they're trying to improve efficiency, lower costs, and optimize performance through a tightly controlled set of standardized hardware and software, while still accommodating requests for customized infrastructure. The rise of virtualization as the standard server platform for new applications, and the consequent use of commodity x86 hardware for an increasing share of workloads, means IT provides more value by efficiently buying and deploying "good enough" x86 servers than by painstakingly selecting "just right" hardware.
Respondents put constrained budgets at the top of our list of 16 trends that will have the greatest impact on data centers over the coming 12 months. Strapped IT teams are torn between standardization and efficiency on one side and customization and performance on the other. A primary goal of this year's survey was to learn at what level of the data center hardware and application infrastructure IT is imposing standardization. The OS? Application platform? Are purpose-built appliances worth supporting for high-value, frequently used applications like data warehouses and business analytics?
IT teams need a guiding strategy here. Thirty percent of respondents spend less than 20% of their total IT budgets on data center facilities, hardware (servers, storage, networks), and operations; 60% spend less than 30%. Yet just 7% say they'll consider a colocation facility, even as 15% say demand for data center resources will rise by more than 25% in the coming year; 58% expect less-dramatic increases. Only 5% expect any decrease in demand.
Although most respondents have some hardware and business application standards that could help contain costs, they're not rigid. A quarter bend over backward to meet new requests, evaluating hardware and software on a case-by-case basis and picking the best fit available.
In terms of hardware, respondents are imposing limits: 63% of companies hosting their own applications use one or two vendors and a limited set of hardware. Only 29% pick the best hardware based on the application. Despite feverish marketing, x86 servers are a commodity, with any differences at the margin. If Dell introduces a new system with Intel's latest CPU and chipset, you can bet that Cisco, Hewlett-Packard, and IBM won't be far behind, meaning there's little room for white-box or second-tier vendors.
When it comes to software, our survey finds a similar reliance on standardization. Forty-two percent have a standard platform with no or limited exceptions. Twenty-three percent of respondents' organizations--and we're guessing midsize businesses make up a big chunk of this group--adopt one or two vendors' application stacks, say Windows Server and a core set of Microsoft enterprise software like Exchange, SharePoint, and SQL Server. Another approach is to pick a couple of vendors with broad product portfolios, say Microsoft and Oracle or IBM and SAP, and let application owners find the best fit within this menu of software choices.
Interestingly, just 31% take a best-of-breed approach, individually evaluating every request to find the best software, and few (2%) develop most of their own applications. One option that got zero takers: "We seldom buy new software; we encourage application owners to find the right SaaS product for their needs." Clearly, those with sunk data center investments aren't ready to turn users loose in the public cloud.