Complex Event Processing Struggles for Market Definition
Complex Event Processing (CEP) seemed like a no-brainer for broad-market acceptance a couple of years back. Relational data warehouses and conventional analytics have not kept up with the explosive growth in real-time data volumes and the perceived demand for real-time analytics. CEP promised to fill the gap: technology developed for extreme high-volume, low-latency processing demands. Yet two years on, CEP is still struggling for market definition.
Complex Event Processing (CEP) seemed like a no-brainer for broad-market acceptance when I first wrote about a key constituent technology a couple of years back. Relational data warehouses and conventional analytics have not kept up with the explosive growth in real-time data volumes and the perceived demand for real-time analytics. CEP promised to fill the gap: technology developed for the extreme high-volume, low-latency processing demands of capital-market algorithmic trading and communications networks, compatible with emerging service-oriented architectures, and applicable to a broad spectrum of security, logistics, and click-stream challenges. Further, CEP is supported by a vibrant, diverse community of academic and industrial researchers. IBM, Oracle, and other established companies are doing very significant work, and the field has spawned half-a-dozen start-ups. Yet two years on, CEP is still struggling for market definition outside of capital markets.
Last week I caught up with a couple of long-established CEP vendors, Vhayu Technologies and Progress Software, whose respective Velocity and Apama algorithmic-trading products date back to the late '90s. For good measure, I also checked in with StreamBase, a more recent entrant, notable (if for no other reason) for links to DBMS luminaries Michael Stonebraker and Jennifer Widom. The timing coincided with a Gartner CEP event, co-located with their Orlando business process management (BPM) event.
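To make the idea concrete: at its core, CEP means watching a stream of events and flagging patterns as they happen rather than querying data after the fact. A minimal sketch of the flavor of logic an algorithmic-trading application might run, here a sliding-window price-spike detector (this is an illustrative example in plain Python, not any vendor's API; the function name, threshold, and data are hypothetical):

```python
from collections import deque

def detect_spikes(ticks, window=5, threshold=0.02):
    """Flag ticks whose price jumps more than `threshold` (a fraction,
    e.g. 0.02 = 2%) above the minimum price seen in the last `window` ticks.
    A real CEP engine would express this as a continuous query over a
    live stream; here we iterate over a finished list for clarity."""
    recent = deque(maxlen=window)  # sliding window of recent prices
    alerts = []
    for symbol, price in ticks:
        if recent and price > min(recent) * (1 + threshold):
            alerts.append((symbol, price))
        recent.append(price)
    return alerts

# Hypothetical tick data: a sudden move from ~100 to 103 trips the rule.
ticks = [("ACME", 100.0), ("ACME", 100.1), ("ACME", 100.2),
         ("ACME", 103.0), ("ACME", 103.1)]
print(detect_spikes(ticks))  # → [('ACME', 103.0), ('ACME', 103.1)]
```

Production CEP engines differ mainly in doing this sort of matching declaratively, continuously, and at rates of hundreds of thousands of events per second, but the window-plus-condition pattern is the essential shape.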
Vhayu's PR firm enticed me with commentary on central Gartner topics, business activity monitoring (BAM) and event-driven architecture (EDA) among them. Yet Vhayu Vice President John Coulter quickly clarified that his company is strong in capital markets and has no plans to pursue other applications. Vhayu started in equities trading and has expanded into fixed-income securities, options, futures, and foreign exchange. Coulter said that handling different asset classes takes a lot of time and that customers are looking for market-data expertise, not just technology. He added that IT spending by financial-services firms is high and that his company's revenues have doubled yearly for the last four years. Vhayu will continue to focus on a market it knows well.
Apama is the product of research at Cambridge University and has similar roots in capital markets. Supported by Progress Software's 2005 acquisition of the company, Apama has moved over the last year into "mainstream applications such as casinos, retail, and telecommunications network management" as well as BAM. Nonetheless, Vice President John Bates described a vertical strategy similar to Vhayu's: start with capital markets, with equities in particular, and extend to other forms of securities. Progress Apama is taking this approach one step further, branching into customers' risk management and compliance operations and offering solutions for regulators as well as for traders. Bates said that 80% of his company's CEP business is in finance. "Other domains are beginning to understand CEP": only beginning. The company's biggest competitor in all domains is build-it-yourself rather than any other CEP vendor.
Bates says that Apama relies on the same core platform for all applications. "The various edge pieces in the platform enable us to specialize." These "edge pieces" include bi-directional adapters that take in data and emit actions and reusable business modules called SmartBlocks. Interestingly, Progress and Vhayu count four or five shared customers who exploit Apama's business modeling capabilities and Vhayu's capacity to analyze both real-time and high-volume historical data.
StreamBase has more recent academic origins, yet Vice President Bill Hobbib offers the same customer figures as Progress's John Bates: 80% in capital markets, most of which involves some aspect of trading, some of which involves compliance and risk. Hobbib expects that proportion to drop to 70% a year from now, even though the potential market outside financial applications is huge.
StreamBase's competition a year ago was 98% home-grown systems, according to Hobbib; the current figure is down to 50%. Otherwise, StreamBase most often competes with Apama, Vhayu, Coral8, and BEA, a mix of specialized and generalized vendors, supporting Hobbib's contention that "unequivocally yes, a generalized platform [like StreamBase's] can do specialized [tasks such as] algorithmic trading really well."
The bottom line is that while CEP supports near-instantaneous processing, broad-market acceptance will take a while. CEP vendors will compete in that broad market with a dizzying array of better-established data processing and analysis solutions that support some level of event awareness and also target BAM, click-stream analysis, logistics, manufacturing, and the like. The future I see for CEP is as part of a solution that integrates the technology with complementary software rather than, in most domains, as a free-standing software category.
I plan to look closer at this question in a subsequent blog entry.
Seth Grimes is an analytics strategist with Washington, DC-based Alta Plana Corporation.