Vendors have been talking a lot lately about making business intelligence work faster. SAP has rolled out a BI Accelerator; Oracle, its Daily Business Intelligence; and SAS has announced that the latest version of its BI suite runs on dual-core processors. We wondered: how far do such measures actually go toward speeding up BI?
Not too far, according to Forrester Research analyst Keith Gile. "You can have all the cores you want, but if you have a poorly modeled database, your response time is going to be garbage," he points out.
SAP's BI Accelerator pushes querying back into the SAP R/3 environment; Oracle's solution similarly sends BI queries to the OLTP system. Both approaches are inherently inefficient. "Those environments were not optimized for asking queries," Gile says.
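Gile's point about modeling can be seen in miniature with any relational database. The sketch below is purely illustrative (it uses SQLite, not any vendor's product, and the `orders` table is a made-up example): an analytical query against a table modeled only for transactions forces a full scan, while modeling for the query, here simply indexing the filtered column, lets the planner use an index search instead, regardless of the hardware underneath.

```python
import sqlite3

# In-memory database standing in for a transactional (OLTP) store.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("east" if i % 100 == 0 else "west", float(i)) for i in range(10_000)],
)

# A typical BI-style aggregate query.
query = "SELECT SUM(amount) FROM orders WHERE region = 'east'"

# With no index on the filter column, the planner must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]
print(plan_before)  # a full-table SCAN of orders

# Remodel for the query: index the column the analytical filter uses.
conn.execute("CREATE INDEX idx_region ON orders(region)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]
print(plan_after)  # an index SEARCH using idx_region
```

On a 10,000-row toy table the difference is invisible; on the multimillion-row tables BI queries actually hit, the scan-versus-search distinction is the gap no number of processor cores closes.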
On the other hand, such technology could help software and staff make decisions on the fly, and there's a growing trend toward real-time business intelligence. The trouble is that real-time BI is hard to achieve: the transaction data must be extremely fresh, and it must reach the decision-maker quickly. For many companies, keeping all their kiosks, branch stations, transaction systems, and so on well integrated and up to date is virtually impossible. Real-time BI depends on a broad set of technologies, including ETL, EII, data replication, Web services, and adapters. Only if all of these layers can take advantage of faster hardware might performance improve.
"But you don't want to use a Ferrari to pull your boat up a mountain," Gile says. "Just throwing hardware at the problem is chasing the wrong thing. It's much better to make sure your data model is optimized, make sure your ETL process is optimized, make sure you reduce latency, and make sure you properly clean and model the data." --Penny Crosman