SAS Taps Hadoop For Big Data Analytics
SAS users can use new links to draw data from and execute commands within key Apache Hadoop components. But that's just the start of SAS' big data plans.
Where many vendors treat Hadoop as a connection point to their database, SAS is an analytics and business intelligence vendor that doesn't view any one database or big data platform as the de facto center of analysis. Thus, the new connection between the SAS Access Module and Apache's Hive data warehousing component makes Hadoop one more example in a long list of sources that also includes Oracle, DB2, SQL Server, Teradata, Teradata Aster, Sybase, Netezza, EMC Greenplum, and MySQL.
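The point of treating Hive as just one more SQL-accessible source can be sketched generically: the same query interface spans heterogeneous back ends, so an analytic routine doesn't care which engine it's talking to. Here is a minimal Python illustration using the standard DB-API, with in-memory sqlite3 databases standing in for Hive, Oracle, and the rest (SAS's real access layer is its SAS Access engine, which is not shown or emulated here):

```python
import sqlite3

# Two stand-in "databases" -- in a real deployment these would be
# Hive, Oracle, Teradata, etc., each behind its own access engine.
def make_source(rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    return conn

warehouse = make_source([("east", 100.0), ("west", 250.0)])
hive_like = make_source([("east", 40.0), ("south", 60.0)])

# One analytic routine runs unchanged against every source,
# because each exposes the same SQL query surface.
def total_sales(conn):
    return conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]

print(total_sales(warehouse))  # 350.0
print(total_sales(hive_like))  # 100.0
```

The design point is that adding Hadoop to the source list changes where the data lives, not how the analytics are written.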
Everybody can connect to databases; the point here is bringing a wide range of sources to bear to feed SAS analytics. In addition, the SAS Data Integration Studio provides a drag-and-drop, visual environment that can execute multi-step data-integration jobs, and those jobs can call on commands executed within Hadoop: MapReduce, Apache Pig (a high-level dataflow language that compiles to MapReduce jobs), Hive, and HDFS commands are all cases in point.
"If you're doing ETL to prep data for analytics, one step might call a Pig script that would execute a map job, a next step could apply SAS code that uses those results, and a last step might reformat the data and load it into the data warehouse," explained Mark Troester, SAS IT/CIO strategist, in an interview with InformationWeek.
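Troester's three-step job can be sketched as a simple pipeline. The following Python stand-ins are purely illustrative (the real steps would be a Pig script running as a MapReduce job in Hadoop, SAS code, and a warehouse load; none of these function names are SAS or Hadoop APIs):

```python
# Hypothetical three-step ETL job mirroring the quoted example:
# (1) a Pig-style aggregate step, (2) an analytic step applied to
# those results, (3) a reformat-and-load step into the warehouse.

raw_events = [("page_a", 1), ("page_b", 1), ("page_a", 1), ("page_a", 1)]

def pig_style_aggregate(events):
    # Step 1: group-and-count -- the kind of work a Pig script
    # would push down to a MapReduce job.
    counts = {}
    for key, n in events:
        counts[key] = counts.get(key, 0) + n
    return counts

def analytic_step(counts):
    # Step 2: derive a share-of-total score from the aggregates
    # (a stand-in for the SAS code that uses the Pig results).
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def load_step(scores):
    # Step 3: reformat rows for loading into the data warehouse.
    return sorted((k, round(s, 2)) for k, s in scores.items())

warehouse_rows = load_step(analytic_step(pig_style_aggregate(raw_events)))
print(warehouse_rows)  # [('page_a', 0.75), ('page_b', 0.25)]
```

Each step consumes the previous step's output, which is exactly the hand-off Data Integration Studio orchestrates between Hadoop-side and SAS-side work.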
SAS analysts can apply user-defined functions (commonly known as UDFs) to execute Hadoop commands, but the vendor is also working on ways--expected to bear fruit later this year--to execute SAS analytics within Hadoop's distributed processing environment. That will enable SAS users to exploit Hadoop as an analytic processing platform, with benefits including Hadoop's low cost, high scalability, distributed processing power, and ability to handle diverse data types without conforming to a predefined schema.
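The processing model SAS would be targeting is the classic map/shuffle/reduce pass. A toy pure-Python version makes the shape of the computation concrete (this is only the abstract model; the actual mechanism for running SAS analytics inside Hadoop had not been released at the time of writing):

```python
from collections import defaultdict

# A toy map/shuffle/reduce word count. In Hadoop, map_phase would run
# in parallel across data nodes, the framework would shuffle keys,
# and reduce_phase would aggregate each key's values.

def map_phase(records):
    for line in records:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

data = ["big data", "big analytics"]
result = reduce_phase(shuffle(map_phase(data)))
print(result)  # counts per word, e.g. 'big' -> 2
```

Because the map step needs no predefined schema, raw and semi-structured data can be processed in place, which is the cost and flexibility advantage the article describes.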
It's a sure bet that IBM is working on ways to execute SPSS analytics within its Hadoop-based IBM InfoSphere BigInsights platform, so the race is on to see which vendor makes the earliest and most accessible use of Hadoop.
SAS says its software will work with any Apache open source distribution, so Cloudera, Hortonworks, and EMC Greenplum HD are obvious fits. IBM's mostly open source BigInsights distribution should be just as compatible with what SAS has and what it's planning to add. MapR M5 and the related Greenplum MR distribution might also be a fit.
Like other providers of data-integration technology (Informatica and Talend among them), SAS says it offers a fuller data lifecycle management story than database vendors typically provide. That's a reference to SAS Information Management portfolio capabilities, including data quality, master data management, data governance, analytics management, and decision management.
So instead of just getting Hadoop data into a database, SAS brings it into the process of creating and deploying models, integrating a model's results into an operational business system, and supporting real-time analytical decisions within transactional systems.
In the data-governance vein, SAS can apply the same data-lineage and impact-analysis capabilities to Hadoop that it applies to other sources. Data-lineage lets you track down the original source of a particular data point, a feature that's useful in regulatory and compliance scenarios.
"If a financial service needs to prove that it made the right decisions related to risk or unbiased lending practices, they could use our data-lineage capabilities to go back and show where the data came from and exactly how the analytical routine worked," said Troester.
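At its core, lineage tracking means every derived value keeps a record of the upstream records that produced it, so an auditor can walk back to the original sources. A minimal sketch, with illustrative record identifiers that are not SAS constructs:

```python
# A minimal lineage ledger: each derived record id maps to the
# source record ids that produced it. Names are hypothetical.

lineage = {}  # derived record id -> list of source record ids

def derive(out_id, source_ids, combine, values):
    # Compute a derived value and record its provenance.
    lineage[out_id] = list(source_ids)
    return combine(values)

risk_score = derive(
    "risk_score:cust42",
    ["hdfs:/loans/cust42", "oracle:payments/cust42"],
    sum,
    [0.2, 0.3],
)

def trace(record_id):
    # Answer the auditor's question: where did this value come from?
    return lineage.get(record_id, [record_id])

print(trace("risk_score:cust42"))
# ['hdfs:/loans/cust42', 'oracle:payments/cust42']
```

The regulatory value is that the trace works the same whether a source record lives in HDFS or in a relational warehouse.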
Impact-analysis features help you anticipate the extent, cost, and difficulty of the data-source changes that would follow if a given analytic model or algorithm were changed. It's another example of a deeper capability that goes beyond a simple connector to Hadoop. Indeed, Troester says that in SAS' vision, Hadoop is just one more potential source of big data.
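Impact analysis is essentially graph reachability over dependencies: given "depends on" edges between sources, models, and reports, find everything downstream of a change. A hedged sketch with made-up asset names:

```python
from collections import deque

# Impact analysis as reachability: depends_on maps each asset to the
# upstream items it consumes. All names here are illustrative.

depends_on = {
    "churn_model":    ["hive:weblogs", "oracle:customers"],
    "risk_model":     ["oracle:customers"],
    "exec_dashboard": ["churn_model", "risk_model"],
}

def impacted_by(changed):
    # Invert the edges, then breadth-first search from the changed item.
    downstream = {}
    for node, sources in depends_on.items():
        for s in sources:
            downstream.setdefault(s, []).append(node)
    seen, queue = set(), deque([changed])
    while queue:
        for nxt in downstream.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(impacted_by("oracle:customers"))
# ['churn_model', 'exec_dashboard', 'risk_model']
```

The same traversal answers the cost question in either direction: change a source and see which models break, or change a model and see which feeds it touches.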