
3 Big Data Challenges: Expert Advice

The "big" part of big data doesn't tell the whole story. Let's talk volume, variety, and velocity of data--and how you can help your business make sense of all three.

For now, many practitioners and vendors are content to let SQL and NoSQL systems coexist. Most data warehousing platforms and many business intelligence suites now offer integration with Hadoop. So practitioners can do their large-scale MapReduce or data-transformation work in Hadoop, then move result sets into more familiar and accessible data warehousing and BI tools. An Internet marketing firm might use MapReduce to spot Web sessions relevant to an ad campaign from huge volumes of clickstream data, then bring that result set into an SQL environment for segmentation or predictive analysis.
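To make that division of labor concrete, here is a minimal sketch of the filtering step as a Hadoop Streaming job in Python. Everything specific--the field layout, the campaign identifier, the paths--is hypothetical, not any vendor's actual pipeline; the point is that the expensive scan of raw clickstream happens in MapReduce, and only the small per-session result set moves into the SQL environment.

#!/usr/bin/env python
# campaign_sessions.py -- sketch of the MapReduce filtering step as a
# Hadoop Streaming job. All specifics (field layout, campaign ID, paths)
# are hypothetical illustrations.
#
# Run with, e.g.:
#   hadoop jar hadoop-streaming.jar \
#       -input /logs/clickstream -output /tmp/campaign_sessions \
#       -mapper 'campaign_sessions.py map' \
#       -reducer 'campaign_sessions.py reduce' \
#       -file campaign_sessions.py
import sys

CAMPAIGN = "fall_flash_sale"  # hypothetical campaign identifier

def mapper():
    # Emit (session_id, member_id) for every hit tied to the campaign.
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) < 5:
            continue  # skip malformed log lines
        timestamp, session_id, member_id, url, campaign_id = fields[:5]
        if campaign_id == CAMPAIGN:
            print("%s\t%s" % (session_id, member_id))

def reducer():
    # Streaming sorts by key, so a session's hits arrive together;
    # collapse them into one (session, member, hit_count) row.
    current, member, hits = None, None, 0
    for line in sys.stdin:
        session_id, member_id = line.rstrip("\n").split("\t")
        if session_id != current:
            if current is not None:
                print("%s\t%s\t%d" % (current, member, hits))
            current, member, hits = session_id, member_id, 0
        hits += 1
    if current is not None:
        print("%s\t%s\t%d" % (current, member, hits))

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()

The output--one row per qualifying session--is small enough to bulk-load into a warehouse table for segmentation or predictive modeling with ordinary SQL tools.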

Online retailer Ideeli is applying this blended approach, using Hadoop to store and process large volumes of Web log clickstream and email campaign data and using Pentaho for BI.

The company sets up members-only "flash sale" sites where it sells small quantities of high-fashion items, fueled by email and social media promotions. The sales typically last a day or two before the inventory is gone, and the boutique is taken offline. Ideeli studies Web traffic to understand which of its 5 million members are responding to a campaign, the traits of lookers versus buyers, and so on.
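The lookers-versus-buyers question is exactly the kind of analysis that lands on the SQL/BI side of the divide. A minimal sketch, assuming session summaries have already been extracted from Hadoop into a flat file with hypothetical columns:

# Sketch: split campaign responders into lookers vs. buyers.
# Assumes session summaries were already pulled out of Hadoop into a
# CSV with hypothetical columns: member_id, page_views, purchased (0/1).
import pandas as pd

sessions = pd.read_csv("campaign_sessions.csv")
buyers = sessions[sessions["purchased"] == 1]
lookers = sessions[sessions["purchased"] == 0]

print("conversion: %.1f%%" % (100.0 * len(buyers) / len(sessions)))
print("median page views -- buyers: %.0f, lookers: %.0f"
      % (buyers["page_views"].median(), lookers["page_views"].median()))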

The trouble with an all-Hadoop approach, Ideeli found, was that Apache Hive--the data summarization, query, and analysis tool that runs on top of Hadoop--was too slow, taking several minutes to handle demanding queries, says Paul Zanis, director of data services at Ideeli. The choice of Pentaho for BI is perhaps no surprise, given Pentaho's Hadoop support, which includes designing MapReduce jobs, extracting data from Hadoop, and running scheduled reporting and ad hoc analysis against Hadoop data.

Ideeli is still building the data warehouse it needs to support the new approach, but the idea is to use Pentaho's data-integration software to extract and transform end-of-day batch loads of clickstream and campaign data. From there, Pentaho's OLAP capabilities will automatically generate new cubes for rapid analysis.

"Once that's in place, we'll be able to explore high-level, summarized data within seconds versus trying to run a Hive query, which would take several minutes," Zanis says.

But Hadoop and other NoSQL environments share a limitation today: scarce expertise. Schools, vendors, and companies have spent decades teaching SQL, but Hadoop software distributions have only been available since 2009. Efforts like EMC's Hadoop initiatives are aimed at making it easier to deploy and manage big-data-oriented relational and Hadoop environments side by side, but you'll still need Hadoop expertise to deploy and manage that separate environment. Until these platforms gain larger pools of expertise, data management pros will have to find ways to deliver results through fast and familiar tools.

Comments
Doug Laney,
User Rank: Apprentice
11/29/2011 | 3:56:41 AM
re: 3 Big Data Challenges: Expert Advice
Great to see the 3Vs framework for big data catching on. Anyone interested in the original 2001 Gartner (then Meta Group) research paper I published positing the 3Vs, entitled "Three Dimensional Data Challenges," feel free to reach me. -Doug Laney, VP Research, Gartner
HM,
User Rank: Strategist
10/19/2011 | 12:38:57 PM
re: 3 Big Data Challenges: Expert Advice
I noticed that you haven't mentioned the HPCC offering from LexisNexis Risk Solutions. Unlike Hadoop distributions, which have only been available since 2009, HPCC is a mature platform, and it provides a data delivery engine together with a data transformation and linking system equivalent to Hadoop. The main advantages over other alternatives are the real-time delivery of data queries and the extremely powerful programming model of the ECL language.
Check them out at: www.hpccsystems.com
PulpTechie,
User Rank: Apprentice
10/18/2011 | 3:13:05 PM
re: 3 Big Data Challenges: Expert Advice
Interesting read. Thanks for sharing.