Software // Information Management
News | 12/15/2008 12:06 PM

In-Database Analytics: A Passing Lane for Complex Analysis

What once took one company three to four weeks now takes four to eight hours thanks to in-database computation. Here's what Netezza, Teradata, Greenplum and Aster Data Systems are doing to make it happen.

A next-generation computational approach is earning front-line operational relevance for data warehouses, long a resource reserved for back-office, strategic data analysis. Emerging in-database analytics exploits the programmability and parallel-processing capabilities of database engines from Teradata, Netezza, Greenplum, and Aster Data Systems. That programmability lets application developers move calculations into the data warehouse itself, avoiding the data movement that slows response time. Coupled with the performance and scalability advances of shared-nothing, massively parallel processing (MPP) database platforms, database-embedded calculations answer growing demand for high-throughput, operational analytics in areas such as fraud detection, credit scoring, and risk management.
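
To make the data-movement argument concrete, here is a minimal sketch in Python, using the standard library's sqlite3 module as a stand-in for an MPP warehouse engine. The transactions table, its columns, and the query are invented for illustration, not drawn from any vendor's platform.

```python
import sqlite3

# Illustrative sketch only: sqlite3 stands in for an MPP warehouse engine,
# and the "transactions" table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [(1, 120.0), (1, 80.0), (2, 45.5), (2, 300.0)],
)

# The slow pattern the article describes: pull every row to the client,
# then compute there. Data movement dominates the runtime at scale.
rows = conn.execute("SELECT account_id, amount FROM transactions").fetchall()
totals = {}
for account_id, amount in rows:
    totals[account_id] = totals.get(account_id, 0.0) + amount

# The in-database pattern: push the calculation to the engine as SQL,
# so only the small result set crosses the wire.
in_db = conn.execute(
    "SELECT account_id, SUM(amount) FROM transactions GROUP BY account_id"
).fetchall()

assert sorted(totals.items()) == sorted(in_db)
```

On a shared-nothing platform the GROUP BY would run in parallel across nodes, each scanning only its own partition, which is what lets the in-database form scale where the pull-and-compute form stalls.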

Data-warehouse appliance vendor Netezza released its in-database analytics capabilities last May, and in September the company announced five partner-developed applications that rely on in-database computations to accelerate analytics. "Netezza's [on-stream programmability] enabled us to create applications that were not possible before," says Netezza partner Arun Gollapudi, CEO of Systech Solutions. "Our engine for Profit Analytics generates and calls user-defined functions to compute complex functions based on a set of business rules. The resulting data mart build takes four to eight hours as compared to three to four weeks with traditional approaches." Gollapudi adds that even complex and multiple "what-if" scenarios can now be modeled and tested.
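
Gollapudi's description of generating and calling user-defined functions can be sketched in miniature. The example below is a hedged illustration only: sqlite3's create_function() stands in for a warehouse UDF facility, and the profit_score rule, the orders table, and its columns are hypothetical.

```python
import sqlite3

def profit_score(revenue, cost, risk_weight):
    """A toy business rule, evaluated inside the engine once per row."""
    margin = revenue - cost
    return margin * (1.0 - risk_weight)

conn = sqlite3.connect(":memory:")
# Register the rule as a UDF so SQL statements can call it by name.
conn.create_function("profit_score", 3, profit_score)

conn.execute(
    "CREATE TABLE orders (order_id INTEGER, revenue REAL, cost REAL, risk_weight REAL)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [(1, 500.0, 320.0, 0.1), (2, 900.0, 870.0, 0.4)],
)

# The UDF is invoked from SQL, so the rule runs where the data lives;
# on an MPP platform each node would apply it to its own partition.
for row in conn.execute(
    "SELECT order_id, profit_score(revenue, cost, risk_weight) FROM orders"
):
    print(row)
```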

Netezza on-stream analytics is the basis for a "modeling server" from partner RateIntegration. The offering enables interactive development of custom, rules-based data transformations, directly executed as Netezza user-defined functions. "One of our telephony carrier-customers provides continuous real-time margin alerts to analysts around the globe, 7x24, based on online analysis of rated call data records from their global network," says Bert Dempsey, vice president for product management at RateIntegration. "Every call in the network is now incorporated into the analysis within 30 to 60 minutes of call completion."

As a result, business-critical pricing and margin analyses, as well as usage-based micro-segmentation of the subscriber base, can move online "instead of being costly, slow, and difficult offline analysis workflows," Dempsey adds.
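
A rules-based margin alert of the kind Dempsey describes might look like the following sketch, again with sqlite3 standing in for the warehouse engine; the cdr table, its per-call figures, and the 15% alert threshold are all assumptions made for illustration.

```python
import sqlite3

# Hedged sketch of a margin-alert rule over rated call detail records.
# The cdr table, the revenue and cost figures, and the 15% threshold
# are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cdr (call_id INTEGER, rated_revenue REAL, carrier_cost REAL)"
)
conn.executemany(
    "INSERT INTO cdr VALUES (?, ?, ?)",
    [(1, 0.50, 0.30), (2, 0.40, 0.38), (3, 1.20, 1.25)],
)

# The transformation runs as SQL inside the engine: compute per-call
# margin and emit only the calls that breach the threshold, so analysts
# receive a small alert stream rather than the full record feed.
alerts = conn.execute(
    """
    SELECT call_id,
           rated_revenue - carrier_cost AS margin
    FROM cdr
    WHERE (rated_revenue - carrier_cost) / rated_revenue < 0.15
    """
).fetchall()
print(alerts)  # calls 2 and 3 breach the margin threshold
```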
