Software // Information Management
Commentary
8/27/2008 08:10 PM
Roger Smith

Microsoft's SQL Strategy For Massive Data Sets

Cloud computing service providers like Microsoft, Google, and Yahoo are all hard at work on a new generation of parallel data processing tools that will make it easier for each company to store and analyze enormous data sets such as search logs and click streams.

One of the more interesting papers presented at this week's VLDB (Very Large Data Base) conference in Auckland, New Zealand, "Scope: Easy and Efficient Parallel Processing of Massive Data Sets" (PDF), describes a parallel data processing tool developed by Microsoft Research that's used daily inside Microsoft over petabytes of data on large clusters of thousands of commodity servers, including those Microsoft will use to equip the new $500 million data center the Redmond, Wash.-based company is building in West Des Moines, Iowa.

Microsoft's Scope is similar to Yahoo's Pig, a higher-level language on top of Yahoo's Hadoop distributed software framework, and to Google's Sawzall, a higher-level language on top of the MapReduce framework and the Google File System. According to the VLDB08 paper's authors, Ronnie Chaiken, Bob Jenkins, Per-Ake Larson, Bill Ramsey, Darren Shakib, Simon Weaver, and Jingren Zhou, whereas Pig and Sawzall promote a more functional or mathematical programming style, Scope looks much more like SQL.

"SCOPE has a strong resemblance to SQL -- an intentional design choice. . . Users familiar with SQL require little or no training to use Scope. Like SQL, data is modeled as a set of rows composed of typed columns. Every rowset has a well-defined schema ... It allows users to focus on the data transformations required to solve the problem at hand and hides the complexity of the underlying platform and implementation details."

The language is high-level and declarative so that the Scope compiler and optimizer can optimize Scope scripts and improve them over time. All the hardware and implementation details are transparent to users.
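
To get a feel for the style, here is a minimal sketch of a Scope script in the spirit of the paper's examples; the log file, column names, and threshold are illustrative, not taken from the paper:

    // Extract typed rows from a click-stream log.
    clicks = EXTRACT user:string, url:string, latency:int
             FROM "clicks.log"
             USING LogExtractor;

    // Ordinary SQL-style filtering over the extracted rowset.
    slow = SELECT user, url, latency
           FROM clicks
           WHERE latency > 5000;

    OUTPUT slow TO "slow_clicks.result";

Note that nothing in the script says how the work is split up; the declarative form is what lets the compiler and optimizer make those decisions.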

According to the paper, Scope is highly extensible. Users can easily create customized operators, including:

"• extractors (for parsing and constructing rows from a file), • processors (for row-wise processing), • reducers (for group-wise processing), and • combiners (for combining rows from two inputs) ...

[all of which] allow users to solve problems that cannot easily be expressed in traditional SQL."
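
A script that chains such custom operators together might look like the following sketch; HtmlExtractor, TokenizeProcessor, and CountReducer are hypothetical operator names, not from the paper:

    // Custom extractor: parses raw crawl data into (url, content) rows.
    pages = EXTRACT url:string, content:string
            FROM "crawl.dat"
            USING HtmlExtractor;

    // Custom processor: row-wise transformation; assumed here to emit
    // one (token) row per word found in each page's content.
    tokens = PROCESS pages
             USING TokenizeProcessor;

    // Custom reducer: group-wise aggregation, one count per distinct token.
    counts = REDUCE tokens ON token
             USING CountReducer;

    OUTPUT counts TO "token_counts.result";

The operators themselves are implemented as C# classes against Scope's extensibility interfaces; the script simply names them after the USING keyword.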

All companies that operate Internet-scale services need to store and process massive data sets, such as search logs, Web content collected by crawlers, and click streams collected from a variety of Web services. Google, Yahoo, and Microsoft have each developed their own systems that support parallel computations over large (multiple petabyte) data sets on clusters of computers.

Google popularized the MapReduce programming model, largely taken from the map and reduce functions commonly used in a functional or mathematical style of programming. Yahoo also has a software stack designed for distributed processing of massive data sets. Users write applications in a language called Pig Latin, a dataflow language that uses a nested data model. A Pig Latin program is compiled by the Pig system into a sequence of MapReduce operators that are executed using Hadoop, an open-source implementation of MapReduce.

When Pigs Have Wings

According to the Microsoft paper, a MapReduce application written in C++ takes many more lines of code than the corresponding application expressed in Scope; the paper gives an example that requires 70 lines of C++ code but only six lines of Scope code.
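
For a sense of what that compression looks like, a Scope script along the lines of the paper's query-count example (counting how often each query string occurs in a search log and keeping only the frequent ones) fits in roughly six lines; a sketch, with file and extractor names meant only as illustration:

    SELECT query, COUNT(*) AS count
    FROM "search.log" USING LogExtractor
    GROUP BY query
    HAVING count > 1000
    ORDER BY count DESC;
    OUTPUT TO "qcount.result";

Because the script is declarative, the Scope compiler and optimizer -- not the programmer -- decide how to partition and parallelize the aggregation across the cluster.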

Analysis of massive data sets is becoming increasingly valuable for businesses like Microsoft, which use it to support new features, improve service quality, and detect changes in patterns over time that can flag fraudulent activity. Scope (Structured Computations Optimized for Parallel Execution) is a new language for large-scale data analysis under development at Microsoft. It has the advantage of intentionally building on end-user knowledge of relational data and SQL, with simplifications that ought to make it easier for the company to take advantage of the parallel execution environment the software giant continues to build out.
