SAS users can use new links to draw data from and execute commands within key Apache Hadoop components. But that's just the start of SAS' big data plans.
Analytics vendor SAS on Tuesday announced a SAS Enterprise Data Integration Server update that adds Apache Hadoop to its long list of supported data sources. But that's just the start of a comprehensive approach SAS has in the works for big data analysis.
Where many vendors treat Hadoop as a connection point to their database, SAS is an analytics and business intelligence vendor that doesn't view any one database or big data platform as the de facto center of analysis. Thus, the new connection between the SAS Access Module and Apache's Hive data warehousing component makes Hadoop one more example in a long list of sources that also includes Oracle, DB2, SQL Server, Teradata, Teradata Aster, Sybase, Netezza, EMC Greenplum, and MySQL.
Any vendor can connect to databases; the point here is bringing a range of sources to bear to feed SAS analytics. In addition, SAS Data Integration Studio provides a drag-and-drop visual environment that can execute multi-step data-integration jobs, and those jobs can call on commands executed within Hadoop: MapReduce, Apache Pig (a high-level language for writing MapReduce jobs), Hive, and HDFS commands are all cases in point.
"If you're doing ETL to prep data for analytics, one step might call a Pig script that would execute a map job, a next step could apply SAS code that uses those results, and a last step might reformat the data and load it into the data warehouse," explained Mark Troester, SAS IT/CIO strategist, in an interview with InformationWeek.
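The three-step flow Troester describes can be sketched in miniature. This is a hypothetical illustration, not SAS or Pig code: the function names (`extract_with_pig`, `score_with_sas`, `load_warehouse`) are invented stand-ins for what would, in practice, be a Pig script launching a MapReduce job, SAS code applying analytics to its results, and a final reformat-and-load into the warehouse.

```python
# Minimal sketch of the multi-step ETL job Troester describes.
# All names are hypothetical stand-ins, not real SAS or Pig APIs.

def extract_with_pig(raw_logs):
    """Step 1: stand-in for a Pig script running a MapReduce job,
    e.g. counting raw click events per user."""
    counts = {}
    for user, _event in raw_logs:
        counts[user] = counts.get(user, 0) + 1
    return counts

def score_with_sas(counts):
    """Step 2: stand-in for SAS code that applies analytics to the
    MapReduce results -- here, segmenting heavy vs. light users."""
    return {user: ("heavy" if n >= 3 else "light")
            for user, n in counts.items()}

def load_warehouse(scores, warehouse):
    """Step 3: reformat the scored records and load them into the
    data warehouse (a plain list stands in for the warehouse table)."""
    for user, segment in sorted(scores.items()):
        warehouse.append({"user": user, "segment": segment})
    return warehouse

raw = [("alice", "click"), ("bob", "click"),
       ("alice", "view"), ("alice", "buy")]
table = load_warehouse(score_with_sas(extract_with_pig(raw)), [])
```

The value of chaining the steps in one job, rather than hand-running each tool, is that the orchestration layer can pass each step's output directly to the next.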
SAS analysts can apply user-defined functions (commonly known as UDFs) to execute Hadoop commands, but the vendor is also working on ways--expected to bear fruit later this year--to execute SAS analytics within Hadoop's distributed processing environment. That will enable SAS users to exploit Hadoop as an analytic processing platform, with benefits including Hadoop's low cost, high scalability, distributed processing power, and ability to handle diverse data types without conforming to a predefined schema.
It's a sure bet that IBM is working on ways to execute SPSS analytics within its Hadoop-based IBM InfoSphere BigInsights platform, so the race is on to see which vendor makes the earliest and most accessible use of Hadoop.
SAS says its software will work with any Apache open source distribution, so Cloudera, Hortonworks, and EMC Greenplum HD are obvious fits. IBM's mostly open source BigInsights distribution should be just as compatible with what SAS has and what it's planning to add. MapR M5 and the related Greenplum MR distribution might also be a fit.
Like other providers of data-integration technology (Informatica and Talend among them), SAS says it offers a fuller data lifecycle management story than database vendors typically provide. That's a reference to SAS Information Management portfolio capabilities, including data quality, master data management, data governance, analytics management, and decision management.
So instead of just getting Hadoop data into a database, SAS brings it into the process of creating and deploying models, integrating a model's results into operational business systems, and supporting real-time analytical decisions within transactional systems.
In the data-governance vein, SAS can apply the same data-lineage and impact-analysis capabilities to Hadoop that it applies to other sources. Data lineage lets you track down the original source of a particular data point, a feature that's useful in regulatory and compliance scenarios.
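The lineage idea itself is simple to picture: each derived field records which upstream fields it was built from, so an auditor can walk back to the original sources. The graph and field names below are invented for this sketch; SAS's actual lineage metadata is far richer than a lookup table.

```python
# Hypothetical data-lineage graph: derived field -> upstream fields.
# Field names are invented for illustration.
lineage = {
    "risk_score":      ["credit_history", "income_band"],
    "income_band":     ["reported_income"],
    "credit_history":  [],   # original source field
    "reported_income": [],   # original source field
}

def trace_sources(field, graph):
    """Return the set of original (source) fields behind a derived field."""
    parents = graph.get(field, [])
    if not parents:
        return {field}
    sources = set()
    for parent in parents:
        sources |= trace_sources(parent, graph)
    return sources
```

Walking the graph from `risk_score` surfaces every original input behind it, which is exactly the kind of answer a compliance reviewer asks for.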
"If a financial service needs to prove that it made the right decisions related to risk or unbiased lending practices, they could use our data-lineage capabilities to go back and show where the data came from and exactly how the analytical routine worked," said Troester.
Impact-analysis features help you anticipate the scope, cost, and difficulty of the data-source changes that would follow if a given analytic model or algorithm were changed. It's another example of a deeper capability that goes beyond a simple connector to Hadoop. Indeed, Troester says that in SAS' vision, Hadoop is just one more potential source of big data.