Big data growth forces more companies to adopt containerized data centers and other solutions to meet storage demands.
Except for balloons and perhaps some reputations, few things expand exponentially when they're filled with something insubstantial. Data centers appear to be another exception.
Corporate data centers have been "growing exponentially for years," according to Data Center Journal, to keep up with the demands of virtualization, cloud computing, and the growth of e-business. This year data center growth took another big leap, driven largely by the need to store ever-more-vast amounts of digital data.
In fact, growth has been so fast that half of data center managers polled for a June Data Center Journal report said their companies were suffering moderate to severe "data center sprawl"--a term describing unregulated growth that forces data centers to expand capacity by adding new servers or storage units piece by piece. (The preferred method, of course, is to build a new data center that's capable of handling greater capacity demands to avoid overloading the electrical or cooling systems of the current center.)
Most data center growth is driven by cloud computing, virtualization, mobile computing, and other trends within corporate IT--trends that may cost-justify their own existence but that still add greater demands on the capacity of data centers.
Underlying all this growth within the data center is, of course, one common element: data. Virtual servers, software-as-a-service, cloud computing projects, and even mobile computing drive more data into the data center in the form of stored virtual-machine images, replicated copies of cloud-based data, and copies of documents and applications for use by workers on smartphones.
The drive toward big data analytics and collection of data that was either ignored or discarded in past years has driven data growth rates even higher--as much as 25 percent higher, according to a quarter of those surveyed.
More than half of senior-level IT executives (54 percent) predicted that within the next two years, they'd have to expand corporate network bandwidth by 50 percent or more to keep up with the growing demands of big data, according to a survey of 1,549 senior IT managers in the U.S. and Europe conducted by network infrastructure provider Emulex. These managers also told Emulex they'd have to expand storage capacity by at least 50 percent during the same period.
Demand is growing so quickly that data center construction projects are getting backed up or delayed at unprecedented rates, according to Data Center Journal. As a result, record numbers of large companies are now opting for "containerized data centers," prefabricated structures containing a pre-selected mix of data center storage and computing equipment and power and cooling hookups to keep them running.
The first containerized data centers were introduced in 2005 and were unpopular for several years, according to IMS Research. That has changed in the past two years--almost exactly the time big data became a clear trend within IT.
Containerized data center design is a cost-efficient, controllable way to expand data center capacity. A disadvantage, however, is that it limits product selection to generic equipment designed to be "generator-ready," according to Jun Yang and Patrick Kenny, senior consultants at Infrastructure Factor Consulting, Inc.
IT managers responding to Data Center Journal's survey said they were near the breaking point in both data center capacity and their ability to expand it.
They're far from alone. The Ricoh Document Governance Index 2012--an annual study of the factors affecting the storage and management of paper and electronic documents in the enterprise--anticipates increasing demand for big data storage and compute power to run complex analytics.
Of the 1,000 C-level executives Ricoh interviewed in Europe, 91 percent cited the lack of good data, and of clear answers drawn from it, as the single biggest factor preventing them from running their businesses more efficiently and profitably. During the past three years, big data has changed the priorities of the majority of European businesses, which now focus on taking and managing business risks over cutting costs, increasing efficiency, or decreasing their environmental impact. (In 2009, cutting costs was the top goal of 67 percent of European business managers, according to the study, compared to 43 percent today.)