Real time, near real time, batch. Organizations need actionable insights faster than ever before to stay competitive, reduce risks, meet customer expectations, and capitalize on time-sensitive opportunities.
CIOs have been pummeled with requests for real-time analytics because people in the organization think they need it -- in marketing, IT, security, fraud prevention, customer support, and other areas -- and some of them actually do need it. In the not-so-distant past, very few reasons justified the expense of real-time analytics, but with the cloud, a new generation of solutions, and open source projects like Apache Hadoop and Spark, the economics have changed. As a result, the scope of the use cases is expanding.
Whether to choose real time, near real time, or batch "depends on the use case and how important it is to get an up-to-the-second response. It's all about the response," said John Bates, CMO and former CTO for intelligent business operations and big data at Software AG, in an interview. "Reports that used to be available at the end of the month or in a week are now available intraday, and then you're getting into 5, 10, 15 minutes. That's fine for people who want dashboards, but if you're doing high-frequency trading or trying to stop a security or compliance threat before it causes damage, it's critical to receive the lowest latency response."
While it's clear that the time-to-insights window is collapsing, it's less clear what individuals or companies mean when they talk about real time and near real time, since the definition can vary depending on the need, the industry, and an individual's point of view. Real time is often defined in microseconds, milliseconds, or seconds, and near real time in seconds, minutes, or hours -- although the definitions can vary even more than that. More important than a universal definition of the categories is the business need, viewed in terms of cost and benefits (usually capitalizing on opportunities, minimizing risks, and satisfying customers).
"We talk about 'just in time' to help [customers understand their] cycles and how they perceive that changing, because it goes to the type of investment," said Keith Collins, CIO at analytics and business intelligence (BI) solution provider SAS, in an interview. "Is it the speed of making the decision, or how fast you want to look at your historical basis? Most people aren't going to change their models quickly."
Getting there isn't automatic: there are hurdles to overcome, such as adopting new technologies, making architectural adjustments, and grappling with data integration issues. Accelerating insights may also require business process changes, some of which may be met with resistance by users. The progression to real time and near real time is often gradual: first a speed improvement of one or two orders of magnitude, such as from hours to minutes or seconds, and then further refinement as the business need dictates.
"[Companies] understand the need to analyze real-time and historical data together to identify system or application-level patterns as they occur, as well as to generate meaningful business impact and deliver value to their own customer base," said Ankur Goyal, vice president of engineering at MemSQL, in an interview. "Because end users have come to expect short load times, personalization, and updates in real time, it is vital to replace legacy architectures with a real-time data pipeline to capture, process, analyze, and serve massive amounts of data to millions of users."
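The capture, process, analyze, and serve stages Goyal describes can be illustrated in miniature. The sketch below is a hypothetical, in-memory stand-in -- production pipelines use systems such as Kafka or Spark Streaming -- that counts events per key over a rolling time window, the kind of low-latency aggregation a fraud or security monitor relies on. The class name and thresholds are illustrative, not drawn from any vendor's product.

```python
from collections import deque
from time import time
from typing import Optional

class SlidingWindowCounter:
    """Counts events per key over a rolling time window -- a toy
    stand-in for the 'process/analyze' stages of a streaming pipeline."""

    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.events = deque()  # (timestamp, key) pairs, oldest first

    def capture(self, key: str, ts: Optional[float] = None) -> None:
        """Ingest one event (the 'capture' stage)."""
        self.events.append((time() if ts is None else ts, key))

    def counts(self, now: Optional[float] = None) -> dict:
        """Evict events older than the window, then return per-key
        counts (the 'serve' stage)."""
        now = time() if now is None else now
        while self.events and self.events[0][0] < now - self.window:
            self.events.popleft()
        result: dict = {}
        for _, key in self.events:
            result[key] = result.get(key, 0) + 1
        return result

# Example: watch for repeated login failures inside a 60-second window,
# the kind of check a security monitor needs answered in real time.
w = SlidingWindowCounter(window_seconds=60)
for t in (0, 10, 20, 30, 90):
    w.capture("login_failure", ts=t)
print(w.counts(now=95))  # only the event at t=90 is still in the window
```

The design point is that the answer is computed as events arrive rather than in a nightly batch job, which is what collapses the time-to-insight from hours to seconds.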
In today's dynamic business environment, companies must be more nimble than ever. Here's how the time element is playing out in analytics across enterprises and industries.

Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to publications and sites ranging from SD Times to the Economist Intelligence Unit.