How To Bridge Big Data's Information Gap

Companies must fight the growing gap between people who gather big data and those who can put it to practical use.

Kevin Fogarty, Technology Writer

October 12, 2012


There is a growing information gap in U.S. businesses, but surprisingly it's not between companies that use sophisticated analytics and those that don't.

The gap is between those who gather and analyze customer behavior or business process data and those who can put the analysts' findings to practical use, according to a survey conducted in September by the MIT Sloan Management Review, house publication for MIT's Sloan School of Management.

The online survey of respondents with senior-level management titles shows that two-thirds or more are big fans of big data and analytics, but don't quite know what to do with them.

-- 67% of respondents say analytics provide a competitive advantage.

-- More than half say analytics can improve a company's organizational capabilities.

-- 65% say their organizations are good at capturing data relevant to the questions they'd like to answer.

-- Only 46% say their companies are good at distributing potentially valuable insights to the front-line employees able to put them to use.


The unnamed CEO of a multinational outsourcing and technology provider told Sloan Management Review his company collects 80% of the data relevant to its own workflow and operational efficiency, but is able to use only a fifth of the data it collects.

The Difference Between Answers And Irrelevancies

Data often go unused because the answers provided by data specialists don't match the questions of those responsible for acting on that data, Ganesh Natarajan, vice chairman and CEO of consultancy Zensar Technologies, told Sloan Management Review (SMR).

Even when the answers are relevant, the format in which they're provided or the tools needed to access them are too complex or confusing for front-line employees, according to Arnab Gupta, founder and CEO of predictive-analytics vendor Opera Solutions.

It is always possible to convert data to new formats or reports that make the information more accessible. The question is whether being able to understand only numerical data leaves decision makers only half advised, according to Scott Keeter, director of survey research at the Pew Research Center for the People and the Press, which has run several surveys and research projects on the impact of big data.

Qualitative analyses are often easier to turn into practical decisions than quantitative ones, at least for those who are not data specialists, Keeter says.

Of course, much of the justification for big data analysis is the attempt to extract qualitative evaluations from quantitative data. That often requires end users to examine the results with supplementary tools that may contain analytics of their own, rather than tools that simply digest or create visual representations of complex data, Keeter says.

The most common of those supplementary tools in corporate America? Microsoft's Excel spreadsheet.

Relying on an additional set of tools whose only role is to convert already-analyzed data into formats users can understand is inherently less efficient than making the results easier to read in the first place, according to Michele Goetz, enterprise architecture and applications analyst at Forrester.
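That conversion step, reshaping already-analyzed results into something a spreadsheet user can open, can be sketched in a few lines of Python. The field names and figures below are hypothetical, used only to illustrate flattening a nested analytics result into CSV rows Excel can read:

```python
import csv
import io

# Hypothetical analytics output: model scores keyed by region,
# in a nested structure front-line staff can't read directly.
analysis = {
    "northeast": {"churn_risk": 0.31, "upsell_score": 0.72},
    "southwest": {"churn_risk": 0.48, "upsell_score": 0.55},
}

def to_spreadsheet_rows(results):
    """Flatten nested analytics output into header + data rows for CSV."""
    rows = [["region", "churn_risk", "upsell_score"]]
    for region in sorted(results):
        metrics = results[region]
        rows.append([region, metrics["churn_risk"], metrics["upsell_score"]])
    return rows

# Write the rows as CSV text, the format Excel opens natively.
buffer = io.StringIO()
csv.writer(buffer).writerows(to_spreadsheet_rows(analysis))
print(buffer.getvalue())
```

In practice the writer would target a file handle rather than an in-memory buffer, but the shape of the work is the same: the analytics are done; only the presentation changes.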

It may be appropriate to have data scientists do in-person briefings or other handholding for individuals or small groups, but that process wouldn't scale in a large organization.

It would be much simpler to apply data governance policies, which define standards, protocols, quality-assurance rules, and other mechanisms that ensure data can be delivered to end users in a consistent, accessible way, according to Goetz.
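The quality-assurance rules such a policy defines amount to checks applied before data is delivered to end users. A minimal sketch, with hypothetical field names and rules chosen purely for illustration:

```python
# Hypothetical governance rules: each governed field maps to a
# validity check a record must pass before delivery to end users.
GOVERNANCE_RULES = {
    "customer_id": lambda v: isinstance(v, str) and v != "",
    "revenue": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def passes_governance(record):
    """Return True only if every governed field is present and valid."""
    return all(field in record and check(record[field])
               for field, check in GOVERNANCE_RULES.items())

records = [
    {"customer_id": "C-1001", "revenue": 5400.0},
    {"customer_id": "", "revenue": 1200.0},      # fails: empty ID
    {"customer_id": "C-1003", "revenue": -50},   # fails: negative revenue
]

# Only records that satisfy every rule reach end users.
deliverable = [r for r in records if passes_governance(r)]
```

Real governance tooling covers far more (lineage, access control, retention), but the core idea is the same: validation is defined once, centrally, instead of being rediscovered by every analyst downstream.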

So far, however, there is little evidence of a general rush toward data governance policies for big data, Goetz wrote.

The underlying problem isn't tools or protocols; it's the immaturity of the whole big data market, which is working on the problem of how to make big data accessible but won't have it solved for a couple of years, according to Shalini Das, research director for the Washington, D.C.-based CIO Executive Board.

About 82% of corporate employees are knowledge workers who could potentially use answers from big data systems, Das says, but 85% of the available big data analyses can't be performed with tools like Excel, on which end users rely.

"Saying you need to restrict information to a specialized set of individuals or tools because the data are too complex is possibly not the way to go in the future," Das says. "That just creates another bottleneck; it would be more effective to create a basic degree of analytic maturity with all employees so they're more able to deal with that data.

"Not everyone has to do data modeling or use really advanced tools," Das says. "There would have to be several levels of maturity and complexity, depending on the employee's skill set. In terms of development, we're not there yet."


About the Author

Kevin Fogarty

Technology Writer

Kevin Fogarty is a freelance writer covering networking, security, virtualization, cloud computing, big data and IT innovation. His byline has appeared in The New York Times, The Boston Globe, CNN.com, CIO, Computerworld, Network World and other leading IT publications.

