For years, enterprise data centers have been stagnant pools of fragile technology with a tendency to fail spectacularly and scale poorly. We wasted vast amounts of money on slow-moving, monolithic, capital-intensive services. It took the explosive success of the likes of Amazon, Google, Facebook, and Salesforce.com to puncture the belief that the data center status quo was the best we could do. These providers threw out risk-averse ITIL principles, knocked down technology silos, and introduced fresh design and operational principles. Instead of striving for stability, they embraced failure and constant change.
These new approaches to purchasing and operating infrastructure have been wildly and publicly successful. We recently profiled six companies that are taking innovative infrastructure ideas and running with them. Fidelity Investments' new Nebraska facility uses open source code and standardized Open Compute hardware. Bank of America is sizing up software-defined infrastructure and modular data centers, with highly standardized commodity hardware. It's only a matter of time before smaller shops take note of initiatives such as Facebook's Open Compute Project. The social media giant demonstrated how to lower data center costs by as much as 80% using low-cost hardware and radically different design principles. Google, too, is driving efficiency and reducing power consumption, achieving power usage effectiveness of 1.1 compared with a typical enterprise PUE of around 1.7. It's shown that even simple changes can have a dramatic effect on the bottom line. Consider that both Google and Facebook run data centers at temperatures of 90 degrees F, saving 60% in operating expenses on power for cooling and spending much less capital on chiller and airflow units.
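To see why those PUE figures matter to the bottom line, it helps to work the arithmetic. The sketch below is illustrative only: PUE is defined as total facility power divided by IT equipment power, and the 1,000 kW IT load is an assumed figure chosen to match the 1.1 and 1.7 PUEs cited above, not data from Google or any surveyed enterprise.

```python
# Illustrative PUE (power usage effectiveness) arithmetic.
# PUE = total facility power / IT equipment power.
# All load figures here are hypothetical examples.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return power usage effectiveness for a facility."""
    return total_facility_kw / it_equipment_kw

it_load_kw = 1000.0  # assumed IT equipment load

typical_enterprise = pue(1700.0, it_load_kw)  # PUE ~1.7
efficient_facility = pue(1100.0, it_load_kw)  # PUE ~1.1

# Everything above 1.0 is overhead: cooling, power
# distribution, lighting. Per kW of IT load:
overhead_typical = typical_enterprise - 1.0    # 0.7 kW overhead
overhead_efficient = efficient_facility - 1.0  # 0.1 kW overhead

print(f"Typical PUE: {typical_enterprise:.1f}")
print(f"Efficient PUE: {efficient_facility:.1f}")
print(f"Overhead reduction: {1 - overhead_efficient / overhead_typical:.0%}")
```

The point of the calculation is that a drop from 1.7 to 1.1 doesn't shave a third off the power bill; it eliminates most of the non-IT overhead, which is where warmer operating temperatures and simpler cooling plants pay off.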
Trickle-down effect?

We see hints of data center strategy changes in this year's InformationWeek State of the Data Center Survey, drawing on 217 respondents involved with data center management or decision-making at organizations with facilities that are 1,000 square feet or larger. Data center teams are virtualizing aggressively, are focused on standardization, have software-defined networking on their radar screens, and are holding the line on use of public cloud.
Demand from the business for capacity is, as usual, rising faster than spending, but that doesn't mean respondents are slavishly focused on bottom-line cost. We asked which of 11 application infrastructure priorities are among their top concerns; while "reliability and availability" and "security and data protection" are cited most often, the percentage citing "lowest upfront cost" among their top concerns plummeted by half in just one year. Those citing "lowest operational cost for facilities and management" dropped six points. Getting a good fit for the business unit's or application owner's requirements gained six points. CIOs are finally escaping a dreary focus on reducing costs and are looking to improve services.
Money vs. demand

Fully 73% of respondents say that, compared with last year, demand for data center resources will rise; 19% expect demand increases of more than 25%. That's no surprise -- the future of business is digital. But don't assume this equates to rising budgets. On the contrary: The percentage able to spend 30% or more of the total IT budget on data center facilities, hardware, and operations is down to 38% from 43% in 2013. That fits with the expected anemic growth of overall IT budgets. For example, IDC lowered its prediction for global IT spending growth in 2014 to around 4%.
This lack of investment in structural upgrades is a consistent theme in InformationWeek surveys. Our 2014 Data Center Convergence Survey showed that demand for convergence is high but lack of budget limits opportunities. Same deal in our State of Storage and Private Cloud surveys this year, where reducing capital costs is a key driver in purchasing. Yet "no money" is no excuse when it comes to the demands of business units.
IT continues to pursue virtualization to meet these demands, as we'll discuss. But IT is not blindly moving to a "virtualize everything" strategy; 35% of respondents choose whatever infrastructure approach is best for the application. Many application vendors, amazingly, still do not certify or support their products on hypervisors. It's an open question how long IT will put up with that state of affairs.
Greg has nearly 30 years of experience as an IT infrastructure engineer and has been focused on data networking for about 20, including 12 years as a Cisco CCIE. He has worked in Asia and Europe as a network engineer and architect for a wide range of large and small firms.