Why did Software AG buy Apama and Tibco buy StreamBase? Perhaps the Internet of Things is giving complex event processing a second chance.
You hear so much about "real-time" performance, you would think it's a pervasive capability. The fact is, real real-time analytic performance, as in millisecond-latency analysis of data in movement, isn't all that common. That's why the acquisitions of Apama and StreamBase, complex event processing (CEP) technologies purchased last week by Software AG and Tibco, respectively, kind of stood out.
Back in 2008, CEP looked like the next big thing in information management. Once the exclusive domain of Wall Street titans and intelligence agencies, CEP was poised to go mainstream, moving off trading floors and real-time threat detection and into transportation logistics, customer experience management and factory floor automation. Or so we all thought.
Like a lot of people, I bought into the promise hook, line and sinker. But then a little thing called the global financial crisis put a chill on technology investment. Suddenly words like "complex" and the promise of exotic performance sounded expensive and excessive. CEP (like SOA) seemed to fall off the up-and-coming technology roadmap, soon to be replaced by cloud, mobile, social, consumerization and other trends promising faster and cheaper ways of doing things.
CEP was still humming along on Wall Street, where there was still a healthy profit to be had through a millisecond advantage in insight, and in the intelligence community, which was still looking for bad guys like Osama bin Laden. But CEP wasn't much talked about, and I wasn't getting calls about new applications and customers.
Flash forward to last week's acquisitions of StreamBase and Apama. Terms of the StreamBase deal were not disclosed, but one industry insider tells me Tibco got a "firesale" price for the small, independent company. Progress Software sold off Apama so it could "focus on providing leading cloud and mobile application development technologies," according to a company press release.
There are signs that the big data trend may be breathing new life into CEP.
The Internet of Things is an obvious play for CEP because all those sensors, log files and machine-to-machine connections will spin out a high-volume, highly variable and high-velocity stream of information.
CEP is a step faster than in-memory technologies such as SAP HANA because it lets you detect patterns while the data is still in motion -- that is, while the transactions and events are still happening. You can also commit the real-time analysis to memory for context and further analysis, but whether it lands on fast RAM or slow disk, stored data is after-the-fact history. With apologies to SAP, CEP is "real, real-time," and in stock trading and intelligence, a millisecond response-time advantage makes a difference.
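To make the "data in motion" idea concrete, here's a minimal sketch of the kind of rule a CEP engine evaluates as each event arrives, before anything is written to a database. This is illustrative Python, not any vendor's API; the event shape, threshold and window values are hypothetical.

```python
from collections import deque
import time

class SlidingWindowDetector:
    """CEP-style rule: fire an alert when at least `threshold` events
    matching `predicate` arrive within a `window_seconds` time window,
    evaluated on the fly as events stream in."""

    def __init__(self, predicate, threshold, window_seconds):
        self.predicate = predicate
        self.threshold = threshold
        self.window = window_seconds
        self.timestamps = deque()  # arrival times of matching events

    def on_event(self, event, now=None):
        now = time.time() if now is None else now
        if not self.predicate(event):
            return False
        self.timestamps.append(now)
        # Evict matches that have slid out of the time window.
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        return len(self.timestamps) >= self.threshold

# Hypothetical usage: alert on three large trades within five seconds.
detector = SlidingWindowDetector(lambda e: e["size"] > 10_000,
                                 threshold=3, window_seconds=5.0)
events = [{"size": 15_000}, {"size": 500},
          {"size": 12_000}, {"size": 20_000}]
alerts = [detector.on_event(e, now=t) for t, e in enumerate(events)]
# alerts -> [False, False, False, True]: only the fourth event
# completes the three-matches-in-window pattern.
```

Real engines like StreamBase or Apama express such rules declaratively and run them at far higher volumes, but the core idea is the same: the pattern match happens in the stream, not in a query against stored history.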
Tibco already had a CEP engine called BusinessEvents, but it says StreamBase's analytical capabilities for data streams will help it "address a growing number of use cases for data in motion -- in financial services and beyond," according to a release on the deal. The beyond part seems to be about providing "an event-based alternative to batch-centric big data architectures" -- a reference to things like Hadoop and high-scale analytic databases.
Software AG is being even more explicit about its plans for Apama, saying it will "enable customers to fully design, test, monitor and control the industrial Internet."
It's pretty clear that Tibco and Software AG are better positioned to do something with StreamBase and Apama than were the previous owners, and stepping up investment is the obvious next move. For one thing, there are still plenty of legacy competitors out there, including IBM InfoSphere Streams and WebSphere Business Events, Informatica Complex Event Processing, Oracle Complex Event Processing and SAP Sybase Event Stream Processor. Microsoft and HP also have CEP offerings, though you don't hear much about them.
If Tibco and Software AG are really planning to go after big data opportunities, and not just bolster their financial services businesses, they're going to run into a long list of stream-processing upstarts, most of them emerging from the big data community. Commercial options include HStreaming and SQLstream, while open-source projects include the Storm stream-processing framework (notably used by Twitter to make sense of the Twittersphere).
Internet giants Google, Facebook and Yahoo have their own variations on CEP to monitor and make sense of their large, fast-moving networks, and they seem to be contributing some of what they learn to open source projects.
So once again we're seeing seemingly promising opportunities for CEP, but it remains to be seen if we'll see the sort of broad adoption across industries that many expected back in 2008. Big data is inflating a market bubble all its own. It's hard to say whether we'll see a soft landing or a hard crash when businesses start looking for solid evidence of returns on their big data investments.