Uncover Patterns In Processes
Complex-event processing monitors the full set of events that make up a business process, and it can help companies comply with regulations
July 29, 2005
"All that information is very reportable, very auditable, rather than searching for it through a set of separate system reports," Sullivan says.
In one case, the system caught a $75,000 duplicate payment. Reconciling such discrepancies used to be the equivalent of a full-time staff person at American Electric. With Oversight performing checks on the systems, it's now a half-time job, Sullivan says. Such precautions also reduce the costs of turning suspected duplicate payments over to an overpayment-recovery firm. Sullivan expects those recovery expenses will decline from $300,000 a year in the past to between $100,000 and $150,000 this year.
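The duplicate-payment check described above can be sketched in a few lines. This is a minimal illustration, not Oversight's actual logic; the record fields (`vendor`, `invoice`, `amount`) are assumptions for the example:

```python
from collections import defaultdict

def find_duplicate_payments(payments):
    """Group payments by (vendor, invoice, amount); any group with
    more than one record is a suspected duplicate payment."""
    groups = defaultdict(list)
    for p in payments:
        key = (p["vendor"], p["invoice"], p["amount"])
        groups[key].append(p)
    return [g for g in groups.values() if len(g) > 1]

payments = [
    {"vendor": "Acme Corp", "invoice": "INV-1001", "amount": 75000.00},
    {"vendor": "Acme Corp", "invoice": "INV-1001", "amount": 75000.00},
    {"vendor": "Beta LLC",  "invoice": "INV-2002", "amount": 1200.00},
]
suspects = find_duplicate_payments(payments)
# One suspect group: the two identical $75,000 Acme payments
```

A production system would also catch near-duplicates (slightly different invoice numbers, same amount and date), which is where recovery firms earn their fees.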
The Oversight system also provides greater assurance of meeting Sarbanes-Oxley and other regulations, Sullivan says. In a large IT organization, many staff members might understand the rules of database and accounting systems, and one could manipulate them for personal gain. "With Oversight, I have a system that detects what's going on independent of the accounting or database system. I'm in a much better position if someone in my technology group is manipulating the data," he says.
Complex-event processing also can be conducted using business-process-oriented middleware such as IBM's WebSphere Business Integration Server or Tibco Software Inc.'s BusinessEvents.
To manage its annuity accounts, Guardian Life Insurance Co. of America uses the Transcend system, originally sold by TriMark Technologies Inc.; PeopleSoft acquired TriMark in 1999 and later discontinued the product. Transcend administers individual annuities by drawing on multiple databases and back-end systems. It performs dozens of checks on new accounts, such as when an annuity is sold by a broker, and connects to a customer's personal-account interface known as My Account Manager.
Transcend uses WebSphere, including Integration Server, to respond to individual customer requests, conduct fund transfers, and allow customers to set triggers that automatically trade an equity when the price is right, says Shelley McIntyre, VP of business technology. Transcend includes an event engine that can immediately apply new rules and execute transactions when clients juggle their portfolios and set new investment goals.
When a stock is approaching a target price, Transcend doesn't just prepare to execute the transaction. It checks to make sure the money is available in the client's account and sends an alert to the customer that a transaction is about to happen. It lets the customer say, "Oops, I changed my mind," says Rob McIsaac, senior business-systems officer for equity and Park Avenue Securities at Guardian.
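That pre-trade flow, check funds, notify the customer, then wait for a possible cancellation, can be sketched as an event handler. This is an illustrative sketch, not Transcend's implementation; the account and order fields and the `alert` callback are hypothetical:

```python
def on_price_near_target(account, order, alert):
    """Run pre-trade checks when a watched stock nears its target price.
    Block the order if funds are short; otherwise warn the customer
    before the trade executes, leaving a window to cancel."""
    cost = order["shares"] * order["limit_price"]
    if account["cash"] < cost:
        alert(f"Order for {order['symbol']} blocked: insufficient funds")
        return "blocked"
    alert(f"About to buy {order['shares']} {order['symbol']} at "
          f"{order['limit_price']}; reply to cancel")
    return "pending_confirmation"  # executes unless the customer opts out

alerts = []
account = {"cash": 10000.00}
order = {"symbol": "XYZ", "shares": 100, "limit_price": 50.00}
status = on_price_near_target(account, order, alerts.append)
# status is "pending_confirmation"; one alert queued to the customer
```

The point of the pattern is that the alert is itself an event in the process, not an afterthought: the cancellation window is part of the transaction's lifecycle.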
Complex-event processing also is needed in the financial-services industry when customers want to act on a target stock price with the least possible delay. With information flowing into it, a standard relational database can detect when the price of a given stock has reached a prescribed level. But it's difficult for it to detect the instant when the same stock has traded at the same price three times in the last hour, triggering a major trade by an investor, says Michael Stonebraker, a key figure in the development of relational database technology and now chief technology officer of startup StreamBase Systems Inc.
The StreamBase processing engine, which analyzes streams of data in real time, adds a time window to a stream of data and analyzes aggregate data over a given period. The result is complex-event processing closer to real time than data-warehouse systems and other archival systems can perform, Stonebraker says.
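Stonebraker's example, the same stock trading at the same price three times within an hour, shows why a time window matters. A minimal sketch of that kind of windowed detector (not StreamBase's engine; the class and thresholds are illustrative) might look like this:

```python
from collections import defaultdict, deque

class RepeatedTradeDetector:
    """Fire when the same stock trades at the same price `count` times
    within a sliding `window` of seconds -- a pattern that is awkward
    to express as a one-shot relational query."""
    def __init__(self, count=3, window=3600):
        self.count = count
        self.window = window
        self.seen = defaultdict(deque)  # (symbol, price) -> timestamps

    def on_trade(self, symbol, price, ts):
        q = self.seen[(symbol, price)]
        q.append(ts)
        while q and ts - q[0] > self.window:  # evict stale trades
            q.popleft()
        return len(q) >= self.count  # True => trigger the rule

d = RepeatedTradeDetector()
events = [("IBM", 82.50, 0), ("IBM", 82.50, 1200), ("IBM", 82.50, 2400)]
fired = [d.on_trade(*e) for e in events]
# fired -> [False, False, True]: the third matching trade within the
# hour trips the detector
```

A relational database sees each insert in isolation; the stream engine keeps just enough recent state per key to evaluate the pattern as each event arrives.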
Such stream-processing engines will find greater use watching for patterns and key information in streams of radio-frequency identification data and other forms of sensor feedback in supply chains, he says. "We see a sea change coming in microsensor data. Everything will be sensor-tagged, producing a fire-hose [stream] of data."
Stanford's Luckham says stream-processing engines are one means of complex-event processing, but "events created in a distributed enterprise don't come in a nice, orderly stream." Other means of complex-event processing will have to appear, he says. Luckham himself invented a complex-event processing language, Rapide, but it was never commercialized and hasn't been updated for five years. Another declarative, high-level language will have to emerge to handle complex business processes, he says. A business process expressed in such a language would be directly translatable into executable software code.
Business Process Execution Language, backed by IBM and Microsoft, has been adopted as a standard by the Organization for the Advancement of Structured Information Standards. But BPEL deals almost exclusively with turning business processes into Web services, says Jeanne Baker, president of the Business Process Management Initiative, a user and vendor consortium that's merging with the Object Management Group. With its focus on Web services, BPEL has no means of expressing business logic or allowing human actions in a business process, which makes it "challenged" for complex-event processing, Baker says.
"A true execution language would be a first step," she says. Such a language might lead to business-process-modeling tools with rigorous diagrams and precise notations, like software modeling's Unified Modeling Language.
For complex-event processing to become a reality, "we need a bridge between the business-process analyst and the IT staff," to tie it into the software infrastructure, she says.
Business-process automation specialists such as Luckham and Baker say IT vendors are moving the state of the art of complex-event processing technology forward. "But are we moving rapidly enough to keep up with demand?" Baker asks. "No, we're not."
Continue to the sidebar:
Process By Design: Let Users Develop The Workflow
About the Author(s)
Editor at Large, Cloud
Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week. He is a graduate of Syracuse University where he obtained a bachelor's degree in journalism. He joined the publication in 2003.