3 Big Data Advances: Some Might Be Giants
In a full week for big data news, EMC, Intel and Revolution Analytics announcements stand out.
It has been a big week for big data news, with O'Reilly's Strata Conference in Santa Clara, Calif., sparking a bevy of announcements. Three of these developments look like real milestones, solving problems that are truly vexing for the big data community.
Here's a quick rundown of the week's top news items with a bit of context on each development.
EMC brings SQL analysis to Hadoop.
Batch-oriented MapReduce processing on Hadoop is complicated and slow, so plenty of companies (including Cloudera, Hadapt, MapR and Platfora) are working on ways to bring familiar, pervasive SQL analysis to Hadoop. A new EMC Pivotal HD Hadoop distribution due out by late March promises to bring the breadth of standard-SQL querying capabilities to the Hadoop Distributed File System (HDFS) by way of EMC's Greenplum database.
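To see the appeal, consider a simple revenue-by-product rollup that would otherwise require writing and configuring a MapReduce job. Below is a minimal Python sketch of the kind of standard-SQL access such a distribution aims to provide. Because Greenplum is PostgreSQL-compatible, a stock driver like psycopg2 should work, though the connection details and "sales" table here are hypothetical, and Pivotal HD's actual interface wasn't public at press time.

```python
# Hypothetical sketch: standard SQL over HDFS-resident data via a
# Greenplum-style, PostgreSQL-compatible interface. Host, credentials
# and the "sales" table are invented for illustration.
import psycopg2

conn = psycopg2.connect(host="gp-master.example.com", dbname="analytics",
                        user="analyst", password="secret")
cur = conn.cursor()

# The aggregation that would take a custom mapper, reducer and job
# configuration in MapReduce is one familiar statement in SQL:
cur.execute("""
    SELECT product, SUM(quantity * price) AS revenue
    FROM sales
    GROUP BY product
    ORDER BY revenue DESC
    LIMIT 10
""")
for product, revenue in cur.fetchall():
    print(product, revenue)

cur.close()
conn.close()
```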
[ Want more on the EMC Greenplum announcement? Read EMC Brings Data Analysis Breakthrough To Hadoop. ]
As the leading Hadoop software and support provider, Cloudera has heretofore been seen as the top candidate to bring SQL to Hadoop with its Project Impala. But Impala is still in beta. By melding its database with HDFS, EMC is promising to deliver the best of structured querying and familiar BI tools along with the scalability and flexibility of Hadoop.
Details on Pivotal HD are still sketchy, and competitors are raising dark warnings that it's not open source technology and that it will create a redundant storage layer. Those strike me as weak attempts to sow fear, uncertainty and doubt. The real analysis will begin when EMC releases the software and supporting documentation, and customer adoption can prove the value (or expose any yet-to-be-known compromises).
Intel throws its weight behind Hadoop.
Who knew Intel had its own Hadoop distribution for two years? The company revealed this week that it has been collaborating with Yahoo on Hadoop since 2009 and working with a few big companies in China since 2011. With a third-generation Intel Hadoop distribution announced on Tuesday, Intel said it's taking advantage of its Xeon processors like no other software supplier. For instance, Intel is the first to make use of the Advanced Encryption Standard New Instructions (AES-NI) available on its chips, an advance it says will improve data security without curbing performance. And through chip-assisted optimizations of networking and I/O, Intel said analyses that used to take four hours can be completed in seven minutes.
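Intel hasn't detailed how its distribution invokes those instructions, but the general idea is easy to sketch: OpenSSL-backed libraries pick up AES-NI automatically on CPUs that support it, so encryption code gets the hardware speedup with no API changes. A minimal, illustrative Python example:

```python
# Illustrative only, not Intel's Hadoop code. AES through an
# OpenSSL-backed library uses the AES-NI instructions transparently
# when the processor supports them, so the same code runs faster.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)   # AES-256 key
iv = os.urandom(16)    # counter-mode initialization vector

cipher = Cipher(algorithms.AES(key), modes.CTR(iv))
encryptor = cipher.encryptor()
ciphertext = encryptor.update(b"a block of HDFS data") + encryptor.finalize()

decryptor = cipher.decryptor()
assert decryptor.update(ciphertext) + decryptor.finalize() == b"a block of HDFS data"
```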
Like Cloudera, MapR and other Hadoop software suppliers, Intel is providing cluster deployment and management software that's proprietary, but otherwise it said it's contributing its chip optimizations to the open-source community. Intel also announced Hadoop partners including Cisco, Cray, Dell, Red Hat, SAP and Teradata. Pentaho scored a coup in that Intel will bundle Pentaho's data-integration software, along with Pentaho's BI and analytics suite, with the Intel Hadoop distribution.
Despite Intel's reassurances that it intends to cooperate, rather than compete, with the likes of Cloudera and Hortonworks, it's not clear to me how yet another Hadoop distribution won't muddy the market waters. Nonetheless, having a big gun like Intel behind Hadoop is yet another sign that Hadoop is a data platform that's here to stay.
Revolution brings predictive analytics to big data.
The advantages of in-database analytics are well known: analysts save huge amounts of time and effort by doing modeling and data-analysis work within the data warehouse or mart rather than moving giant data sets off to separate, typically underpowered analytic servers. By exploiting the power of massively parallel processing within the database, analysts can work faster and churn out many more models for fine-grained analysis.
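The contrast is easy to picture in code. In this hedged Python sketch (the connection details and "transactions" table are hypothetical), the commented-out extract step would drag every detail row to an analytic server, while the in-database step ships the computation to the MPP engine and pulls back only per-customer model features:

```python
# Extract-then-analyze vs. in-database analytics (hypothetical schema).
import psycopg2

conn = psycopg2.connect(host="dw.example.com", dbname="warehouse",
                        user="analyst", password="secret")
cur = conn.cursor()

# Extract approach: move every detail row over the network, then
# aggregate on a separate, typically underpowered analytic server.
# cur.execute("SELECT customer_id, amount FROM transactions")
# rows = cur.fetchall()   # potentially billions of rows in motion

# In-database approach: the MPP engine does the heavy lifting and
# returns only the per-customer features a model actually needs.
cur.execute("""
    SELECT customer_id,
           COUNT(*)    AS n_purchases,
           AVG(amount) AS avg_spend,
           MAX(amount) AS max_spend
    FROM transactions
    GROUP BY customer_id
""")
features = cur.fetchall()   # thousands of summary rows, not billions

cur.close()
conn.close()
```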
Revolution Analytics announced Tuesday that it's working on in-Hadoop analytics, and it says the same advantages seen in in-database approaches will apply. Plenty of companies are already doing bits and pieces of their analytic work on Hadoop, but Revolution said it will be the first to bring the entire predictive analytics workflow onto Hadoop.
"This will enable companies to build predictive models in Hadoop without having to extract the data, and when it's big data, that's where you really pay a penalty in data-movement delays," said Michele Chambers, Revolution's Chief Strategy Officer, in an interview with InformationWeek. Revolution is accessing data from either HDFS or HBase and is using data-streaming and in-memory processing across Hadoop nodes in a "very different approach than any other vendor has taken." The payoff, said Chambers, will be in convenience and time savings for analysts who would otherwise waste time and effort shuttling data sets from platform to platform.
SAS previously announced its LASR Server and a related Visual Analytics application that can run on top of HDFS, but I have yet to talk to customers who are using that product. It's also unclear to me whether it can run on any Hadoop cluster or whether it simply uses HDFS to create a proprietary data layer that can only be used by SAS tools.
Another twist here is that Revolution Analytics is a supporting partner on EMC's Pivotal HD distribution, which EMC touts as supporting predictive analytics on top of Hadoop. It remains to be seen whether Revolution's in-Hadoop approach will bring advantages that others can't match.