(Practitioner Peter Green of Pfizer asked important questions about where to handle data management before developing the "thin BI" application discussed in this story.)
If you can get your processing power from the database, the crux of the question for BI system buyers is: why would you buy yet more expensive, high-powered servers to run a BI platform? And why take the time and effort to move data into yet another environment?
An Aster Data-Tableau partnership announced last week provides an example of such a combination. Aster Data offers an MPP database with a strong and growing in-database analytics story. Tableau provides a popular data visualization environment that is both fast and intuitive for business users. When you put the two together, Aster handles the analytics and querying inside the database while Tableau does the slicing, dicing, data visualization and reporting.
Interestingly, Tableau's recent 6.0 release gives you the option of using a built-in, in-memory database, but Tableau has wisely maintained the flexibility to use its software as a lightweight front-end visualization tool. Here, lightweight is a good thing: minimal processing power and infrastructure required.
With today's databases running on powerful appliances and hardware configurations, it only makes sense to move the processing-intensive query, scoring, and other analytic crunching steps into the database. That's why the likes of Aster Data, IBM Netezza, Teradata and others are moving into in-database analytics.
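The idea can be sketched in a few lines. This is a minimal illustration using SQLite as a stand-in for an MPP database such as Aster Data (the table and column names are hypothetical): the processing-intensive aggregation happens inside the database via SQL, so only a handful of summary rows reach the front end, instead of every raw row crossing the wire to be crunched in the BI tier.

```python
# Sketch of in-database analytics: let the database do the crunching.
# SQLite stands in for an MPP appliance; "sales" is a hypothetical table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 250.0), ("west", 75.0), ("west", 125.0)],
)

# In-database approach: the database computes the aggregates and returns
# one small summary row per region for the visualization layer.
in_db = conn.execute(
    "SELECT region, SUM(amount), AVG(amount) "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(in_db)  # [('east', 350.0, 175.0), ('west', 200.0, 100.0)]

# Client-side equivalent: pull every raw row out first, then aggregate
# in the application tier -- the pattern in-database analytics avoids.
totals = {}
for region, amount in conn.execute("SELECT region, amount FROM sales"):
    totals.setdefault(region, []).append(amount)
client_side = sorted((r, sum(a), sum(a) / len(a)) for r, a in totals.items())
assert client_side == in_db
```

Both paths produce the same numbers; the difference is where the work (and the data movement) happens, which is exactly the trade-off the appliance vendors are exploiting.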
If we add in the in-memory computing trend -- something SAP, Kognitio, SAS, Microsoft and others are now pursuing -- then BI vendors (even those with their own in-memory products) could focus strictly on usability and let the database support querying and massive data exploration.
We're not quite there yet. But we're seeing the promise of a simplified environment with less redundancy, less cost, better performance and greater usability for all.