As today's announcements from SAS suggest, the future will see analytic applications increasingly pre-integrated and pre-tuned to run on state-of-the-art hardware.
The in-memory financial risk and retail applications announced today by SAS barely scratch the surface of a big-time trend that is changing the face of analysis and decision support. When speed is the name of the game, the only way to go is to handle the processing in solid-state memory. Thus we're seeing all sorts of announcements that are headed in one direction: marrying analytics with fast hardware architectures.
"Everything is moving away from rotational storage toward solid-state, in-memory processing," observes Forrester Analyst James Kobielus. "SAS is simply responding to the never-ending requirements of its huge customer base for faster analytics, more efficient analytics, more real-time analytics and more flexible execution of predictive models."The in-memory-powered applications announced today by SAS, which are due by the end of the year, are just the latest examples fitting a big trend. SAS was already headed in this direction by way of DIY grid offerings and in-database processing partnerships with data warehousing vendors. Here are a few more proof points:
SAP last month announced plans for a High-Performance Analytic Appliance combining in-memory processing and column-store compression capabilities. The appliance is due by year-end 2010, though it's unclear whether that means a detailed product announcement or an actual product release.
Last year Teradata announced Blurr, a solid-state appliance also known as the Teradata Extreme Performance Appliance 4555. There are no spinning disks and the name pretty much says it all.
Oracle introduced Sun Oracle Exadata V2 in 2009. The update delivers faster query speeds by way of Sun's solid-state-memory F5100 Flash Array.
IBM's Smart Analytic System, introduced last year, is all about tuning software to run in optimized fashion on modern hardware. The platform is available with optional data-integration and business-intelligence modules, and there are also industry-specific analytic applications either released or in the works.
Netezza announced plans for advanced i-Class analytic capabilities this week, and rival Aster Data Systems also announced amped up embedded in-database analytics this week. (SAS partners with Netezza, Aster and Teradata on in-database approaches. Tapan Patel of SAS compares and contrasts in-database and in-memory approaches here.)
Long story short, the future will see analytic applications increasingly pre-integrated and pre-tuned to run on state-of-the-art hardware so you can take advantage of brute-force processing power, solid-state memory and massively parallel processing. It's the key to getting fast insight in a world in which complex analyses and big data are increasingly the norm.