SAS Prepares Hadoop-Powered In-Memory BI Platform
SAS says high-performance product for business intelligence will bring big-data scale, in-memory speed, and advanced analytic power to ordinary business users.
SAS executives said the planned hardware-ready software package will combine the scalability of the Hadoop Distributed File System (HDFS) and the in-memory processing speed of a RAM-intensive clustered blade server environment. The platform is said to offer rapid analytic insight into the types of routine, ad-hoc questions business analysts typically explore.
Plenty of BI products can deliver red, yellow, and green key performance indicators (KPIs) on dashboard displays, but the new platform will go further, said Greg Hodges, SAS's product management director for BI products, in an exclusive interview with InformationWeek. "This platform will be able to tell the user why the KPI is red by applying advanced analytics in a way that's easy for business analysts to digest," Hodges said.
[ Want more on Hadoop? Read 12 Hadoop Vendors To Watch In 2012. ]
For example, the platform will be able to find patterns and correlations among data, uncovering root causes without requiring PhD-level analysts, Hodges said. The key to ease of use is an ad-hoc data exploration interface that lets users drag and drop data sets onto a palette for analysis. The product then automatically chooses the most appropriate chart or visualization depending on the sources and data types selected, Hodges said.
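SAS has not published how that automatic chart selection works, but the idea of mapping the types of the selected columns to a visualization can be sketched roughly as follows. All of the rules and names here are assumptions for illustration, not SAS's actual logic:

```python
# Hypothetical illustration of type-driven chart selection.
# The mapping rules below are assumptions, not SAS's published behavior.

def choose_visualization(columns):
    """Pick a chart type from the types of the selected columns.

    `columns` maps column name -> one of "numeric", "categorical", "datetime".
    """
    types = sorted(columns.values())
    if types == ["datetime", "numeric"]:
        return "line chart"          # a measure trending over time
    if types == ["categorical", "numeric"]:
        return "bar chart"           # a measure broken down by category
    if types == ["numeric", "numeric"]:
        return "scatter plot"        # correlation between two measures
    if types == ["numeric"]:
        return "histogram"           # distribution of a single measure
    return "table"                   # fall back to a plain table

print(choose_visualization({"date": "datetime", "revenue": "numeric"}))
# line chart
```

The point of such a scheme is that the user only names the data; the tool infers a sensible default presentation from the column types.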
The new high-performance BI platform is expected to be released in the first half of 2012. SAS declined to offer competitive comparisons, but the product would seem to answer in-memory competition from SAP HANA and Oracle Exalytics, as well as the in-memory-powered data-visualization and data-exploration capabilities of the likes of Tableau Software and Tibco Spotfire.
SAS stressed that it will not sell hardware. Rather, the software will be ready to run on commodity blade servers from any hardware vendor. Administrators will have a Web-based interface through which they will be able to point at existing data sources including relational databases and SAS repositories. From there the data is copied into HDFS, which serves strictly as a high-scale data storage layer. When data is requested, it's moved from HDFS into memory on each of the blades in the cluster. SAS calculations are performed in distributed fashion using SAS grid data-processing technology. No separate database will be required, according to Hodges.
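The flow Hodges describes, staging data in HDFS purely as storage, then loading it into memory on each blade and computing in distributed fashion, follows the familiar partial-aggregation pattern. The sketch below simulates it in miniature; the function names, the round-robin partitioning, and the mean calculation are all assumptions for illustration, not SAS's actual implementation:

```python
# Minimal sketch of the described flow: source rows are staged into
# per-blade blocks (standing in for HDFS), each blade computes a partial
# aggregate over its in-memory block, and the partials are combined.
# All names and partitioning choices here are illustrative assumptions.

def stage_to_hdfs(source_rows, n_blades):
    """Stand-in for copying source data into HDFS:
    split rows round-robin into one block per blade."""
    blocks = [[] for _ in range(n_blades)]
    for i, row in enumerate(source_rows):
        blocks[i % n_blades].append(row)
    return blocks

def blade_partial(block):
    """Each blade computes a partial (sum, count) over its block."""
    return (sum(block), len(block))

def cluster_mean(blocks):
    """Combine per-blade partials into the cluster-wide mean."""
    partials = [blade_partial(b) for b in blocks]
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return total / count

rows = [10, 20, 30, 40, 50, 60]
blocks = stage_to_hdfs(rows, n_blades=3)
print(cluster_mean(blocks))  # 35.0
```

Because each blade only ever ships a small partial result, the combine step stays cheap no matter how large the blocks are, which is what makes the in-memory cluster approach scale.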
SAS declined to divulge product names or details on the Hadoop software running behind the scenes. The new BI platform will extend the SAS high-performance product line, which already includes the SAS High-Performance Computing analytics platform, a modeling/analytics workbench that runs on Teradata and EMC Greenplum data warehousing platforms. The line also includes retail big-data price and assortment planning applications that can run on commodity hardware.