Does IT Really Care About Big Data?
Surveys that show CIOs dislike big data are misleading. The real reasons IT resists taking on another disruptive technology run deeper.
Recent surveys show IT and business unit managers are more worried than eager about big data analytics--but those surveys are probably misleading, according to at least one expert.
There are rafts of surveys showing the huge market potential of big data analytics and the desire of many companies to get their hands on answers they can't get any other way, according to Daniel Castro, a senior analyst at the Information Technology and Innovation Foundation (ITIF) who focuses on cybersecurity, e-government, and IT in healthcare.
A seminal study on big data by McKinsey and Co., for example, found that analysis of big data sets could enhance the productivity and competitiveness of many companies, save more than $300 billion in healthcare alone by increasing the industry's efficiency, and help retailers increase their operating margins by as much as 60%.
Another study, "The Future of Big Data" by the Pew Internet and American Life Project, echoed those findings. It quoted Microsoft chief strategy officer Craig Mundie and Wal-Mart CIO Rollin Ford predicting that a "data-centered economy" built on big data analysis will help both government and corporate organizations avoid big mistakes and waste by exposing persistent errors in practice or belief.
Many experienced IT managers ignore those predictions or dismiss them as too good to be true, believing the potential benefits of big data all depend on large, expensive changes in the way corporations collect, buy, store, manage, and analyze data, Castro said.
"[Big data] is a very new space. People are moving quickly, some faster than others, and mistakes are being made," Castro said. "These [big data] projects are very complex systems; we have to create increasingly complex systems to manage all of it, and that process is still far from finished."
One study that casts doubt on big data's prospects shows that tight-budgeted IT departments at midsized companies are more willing to support cloud computing than big data.
Cloud computing, while still new and unfamiliar to many in IT, has been around considerably longer than big data, giving IT executives more time to get used to both the technology and the cost, according to Frank Gillett, VP and principal analyst at Forrester.
The survey, released Tuesday by market-research firm TheInfoPro, showed that 36% of midsized companies will have more to spend on storage this year than in 2011, down from 47% last year. That drop--really a slowing of growth rather than an actual reduction--contrasts with large companies, whose networked storage capacity will grow an average of 26% this year, the survey showed.
The primary reason for increasing storage spending was the expansion of server virtualization projects, two-thirds of which rely on Fibre Channel storage area networks to store the server images, data, applications, and other bits that used to live on hard drives attached to physical servers.
A subset of respondents is increasing storage capacity in an effort to deliver cloud-like internal data storage and server provisioning, according to Marco Coulter, TheInfoPro research director in charge of storage.
The percentage of companies expanding storage to support internal clouds is uncertain because most respondents said they were focused on optimizing storage for virtual servers--a prerequisite for internal cloud platforms--and few were explicit about any plans for cloud.
Even so, 56% of respondents told TheInfoPro they had no plans for big data projects beyond next year.
That doesn't mean they won't build big data-based decision systems; it just means both big data and their employers' plans for it are uncertain enough that budget approvers would not commit to big data projects past 2013. Part of the reason may be that large companies are fundamentally unprepared to deal with the collection, storage, and use of giant databases, according to a study released Monday by revenue-management application vendor PROS.
The company surveyed more than 100 aftermarket parts-manufacturing companies. Of those, 65% not only said they were unable to deal with the complexities of big data, but named that inability as their leading challenge.
Some of the problem is technological--the pricing and marketing managers most likely to benefit from big data prefer to work in spreadsheets or other apps that can't handle either the volume or the complexity of big data analysis, according to Patrick Schneidau, VP of product marketing at PROS. Sixty-two percent of respondents said they rely too heavily on spreadsheets to give them up; half said spreadsheets are the only app they use to calculate or set prices.
Justifying the cost of a big data project requires that end users be actively interested in pursuing one, which just wasn't the case in most of the 255 companies surveyed, Coulter said. Only 21% of companies had deployed any big data before 2012, the survey showed, while 7% plan to continue using it past 2013.
Rather than chase big data as a way to identify new customers, new opportunities, or better data to support pricing decisions, respondents said the main targets for their storage spending were:
-- To support server consolidation and virtualization;
-- To meet increased needs of existing business applications;
-- To compensate for poor archiving practices, past or present;
-- To support new business apps; and
-- To improve disaster recovery and the retention of backup data.
Those big data Luddites may be shooting themselves in the foot, however. The Information Security Forum issued a report Wednesday showing that analyzing big data sets of threats, risks, and security incidents can cut overall security risk by identifying the most serious threats rather than those that simply get the most attention.
Most companies analyze security data to identify threats, but only 20% analyze incident data to improve security or performance--for example, by identifying which data draws the most (legitimate) requests for information. Without that analysis, the early alerts most likely to flag impending hardware failures, or to reveal new threats and penetration techniques, may be undercounted as serious risks to overall security.
Weakening the appeal of big data is the increasingly intense fear among senior-level IT managers that their companies' most valuable data is vulnerable to loss or corruption during disasters, according to a survey from big data management vendor Quantum.
According to Quantum's 2012 IT Manager Survey, 90% of IT decision makers worry they will lose data during disaster-recovery operations--an increase of three percentage points over last year.
The most common disasters are virus attacks, operating system failures, and problems with data archives, rather than the traditional fire/flood/natural disaster scenarios for which most disaster-recovery plans are designed.
The overall picture is of a market divided: IT executives optimistic about the benefits of more analysis versus those who think big data will lead to big problems, and business unit executives who see the potential in ever-more-granular knowledge of customer motivations versus IT executives, exhausted by virtualization and cloud projects, who resist leaping into a third new technology with the potential to fundamentally change the way IT does its job, Castro said.
"There are a lot of problems and risks with big data that are just extensions of risks IT has always dealt with," Castro said. "You have to know where the information is coming from, the policies of the provider, whether the data is being backed up, where the information is and who has access to it--all the questions CIOs ask about new technology, but haven't had time to get answers yet."