IT spending will hit $34 billion by 2013 as companies upgrade and adapt existing infrastructures to meet the demands of big data, Gartner research predicts.
Big data is a big deal. Marketers and corporate strategists hope it can provide insights on customers, while IT managers struggle with how to manage all that data within the parameters of their budget and staff.
What IT professionals will do, according to a Gartner study published today, is devote nearly half of their IT resources over the next few years to adapting large, complex IT infrastructures to the demands of big data projects.
The result is a misleadingly small market for big data projects, which will account for about $4.3 billion worth of corporate IT spending worldwide during 2012, according to the report, titled "Big Data Drives Rapid Changes in Infrastructure."
Direct spending is only a fraction of the total, however.
Few companies plan to add big data capability by ripping out and replacing existing products, Gartner found. Instead, most companies will add a few new products while beefing up their storage, databases, servers, and other IT resources to handle the rigors of huge databases--databases that are updated constantly and that include information so complex that most data specialists have traditionally viewed it as impossible to parse or analyze effectively.
The result: a chain reaction of upgrades and adaptations that will account for a total of $28 billion in IT spending worldwide during 2012, and $34 billion in 2013, Gartner's report predicts.
Much IT spending now is focused on gathering and analyzing business transactions, data from server logs, and email among employees and with business partners, according to a study from IBM and the University of Oxford, which was also published today. That spending reflects the need to integrate and analyze both unstructured textual data and machine-to-machine data, key challenges of early big data projects.
So far, however, only 28 percent of global organizations have pilot or production-quality big data projects underway. Meanwhile, 47 percent are in the planning stages, and 24 percent have no big data projects underway, according to IBM/Oxford's survey of 1,144 business and IT executives in 95 countries.
What do you want to know? How can you find out?
New insight into customer behavior is the Holy Grail of big data, but few organizations have data management and business intelligence systems that can scale high enough or change quickly enough to handle its demands, according to IBM/Oxford's report.
Bringing existing infrastructures up to speed with expanding expectations of what data can be will drive not only huge spending increases--a cumulative $232 billion by 2016--but also a new way of thinking about data and analytics, according to Mark Beyer, a VP of research at Gartner and lead author of the report.
IT spending driven by the demands of big data will continue through 2018, Beyer predicts; by then, expectations about the size, composition, and potential of corporate data will have risen to the point that "big data" becomes simply "data"--a shift he expects to be complete around the year 2020.
The idea of big data was new enough in 2011 that it created an entirely separate set of reasons to spend money on data management products. But it didn't take long for most IT managers to realize big data required more space, more power, and more flexibility in analytics, storage, and data management--not a completely new set of capabilities, according to Beyer.
The primary goal of big data is to gain insight from data that had been previously inaccessible. Increases in computing power, storage capacity, and data mining capabilities now make it possible to analyze information about customers culled from petabytes of social network chatter, web-server logs, and other peripheral data sources.
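The web-server-log mining described above can be sketched in a few lines. The sample log entries, field pattern, and success-only filter below are illustrative assumptions for this sketch, not drawn from Gartner's or IBM's reports:

```python
import re
from collections import Counter

# Hypothetical entries in Apache Common Log Format.
LOG_LINES = [
    '10.0.0.1 - - [01/Oct/2012:10:00:00 +0000] "GET /product/42 HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Oct/2012:10:00:01 +0000] "GET /cart HTTP/1.1" 200 128',
    '10.0.0.1 - - [01/Oct/2012:10:00:05 +0000] "POST /checkout HTTP/1.1" 302 64',
]

# Pattern for the standard fields: client IP, timestamp, request line, status, size.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\d+)'
)

def requests_per_client(lines):
    """Count successful (2xx) requests per client IP."""
    counts = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("status").startswith("2"):
            counts[m.group("ip")] += 1
    return counts

print(requests_per_client(LOG_LINES))
```

The same per-client aggregation, run across petabytes of logs rather than three lines, is what distributed big data platforms parallelize.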
"Big data requirements will gradually evolve from differentiation to 'table stakes' in information management practices and technology," Beyer wrote. "By 2020, big data features and functionality will be non-differentiating and routinely expected from traditional enterprise vendors and part of their product offerings."