Hadoop Adoption Remains Steady, But Slow, Gartner Finds
Despite the big data hype, Hadoop adoption remains a slow, steady process, according to new Gartner research. A lack of Hadoop skills and the technology's complexity account for the sluggish pace.
Is your company planning to invest in Hadoop in the next two years? Well, 54% of Gartner Research Circle members said no. That also means 46% said yes, but that split is turning into a terabyte of commentary, full of sound and fury, signifying business as usual.
"There is slow, steady adoption," Merv Adrian, a research vice president and analyst at Gartner, told InformationWeek.
Hadoop is transitioning from its early-adopter stage to the threshold of mainstream use, but its place on the "hype cycle" is pretty much normal: Hadoop is in the "trough of disillusionment," prior to reaching its "plateau of productivity," Adrian said.
Basically, early adoption is giving way to "early mainstream," he said.
Just don't expect the market for Hadoop to skyrocket. With more than half of respondents saying they will make no investment in Hadoop over the next two years, demand will remain flat and muted.
The latest information on Hadoop adoption comes from Gartner's 2015 Hadoop Adoption Study, which the firm released May 13. Gartner received 125 responses from the 284 Research Circle members it asked in February and March about their plans for Hadoop investment. The Gartner Research Circle is composed of IT and business leaders well above the small- and medium-sized business space.
Even so, only 26% of survey respondents are deploying, piloting, or experimenting with Hadoop. Another 11% say they will invest in Hadoop within the next 12 months, and a further 7% say they will do so within 24 months.
In the survey, the "skills gap" was cited by 57% of respondents as a factor inhibiting Hadoop adoption, while another 49% said that figuring out how to get value out of Hadoop was the problem.
It's not just crafting the algorithm for the query, but knowing what question to ask, Adrian explained. In short, Hadoop needs a simpler front end. Current tool sets are geared toward high-end users who really know what they are doing. Guided analytics and cognitive computing -- IBM's Watson, for example -- point to what is needed.
Right now, not many users are connected to Hadoop. According to the Gartner survey, about 20% of respondents with Hadoop investments have six to ten users on the system, while another 28% run single-user systems.
Hadoop allows for distributed processing of large data sets across many computers through the use of a simple programming model, which makes it well suited to analyzing big data. The downside is that many of the tools needed to manage and back up such a system are still in their infancy, and the simple front end that would let non-experts use it is not there yet.
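That "simple programming model" is MapReduce. As a rough illustration -- a minimal sketch using the standard Apache Hadoop MapReduce Java API, not anything drawn from the Gartner study -- the classic word-count job below shows the division of labor: a mapper emits (word, 1) pairs from each chunk of input, a reducer sums the counts per word, and the framework handles splitting the input and scheduling the work across the cluster. The input and output paths here are assumed to be supplied on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in its slice of the input.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the counts for each word gathered from all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Paths are assumed to come from the command line; the framework splits
    // the input across the cluster and runs map tasks close to the data.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Even this simplest of jobs hints at why the skills gap matters: the developer works in low-level key/value terms, in Java, rather than in the declarative queries most business analysts already know.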
In addition, there are many components in a Hadoop solution. "It is a 'takes a village' kind of problem," Adrian said. It is a matter of acquiring systems expertise, since integration with legacy systems is the real challenge.
Corporate users understand Hadoop's potential, however.
"You don't want to be the last guy on this," Adrian said. "Lots of folks 'get this.'"