Much of the discussion around complex event processing (CEP) focuses on situation-detection applications: watch for something to happen, and alert me when it does. A common but often overlooked use for CEP technology is delivering real-time insight to allow for better and more responsive decision-making. This goes beyond alerting to helping me understand the current state of my business (or my investments or whatever I'm trying to manage) and is a form of business activity monitoring (BAM). It may very well include an alerting component to tell me when something happens, but the main goal is to derive insight from all incoming data and deliver it to users in a way that enables better decision-making.
One example of where this is being used is in the investment community, where there is a desire to see aggregated position information related to the holdings in a portfolio along with profit-and-loss calculations - in real time. That last piece is the twist: traditional position-keeping and portfolio management systems are implemented on relational databases. Relational databases are great for real-time (online) transaction processing, but they aren't designed for real-time data analysis tasks. Therefore, in a traditional position-keeping system, individual transactions are applied to the database in real time, but the reports that aggregate the transactions to compute net positions and then apply market prices to compute current value and gain/loss are run overnight or at set intervals. A traditional relational database simply cannot keep up with an incoming stream of transactions and another incoming stream of market prices to continuously recompute the values within the portfolio. For CEP engines that are implemented on a relational model, however, this form of real-time continuous computation is exactly what they are designed to do.
So what is CEP exactly? To put it simply, it's a technology for easily implementing applications that analyze moving data to deliver immediate insight and enable instantaneous response to changing conditions. At its core, a CEP implementation is a programmable computation engine that is implemented as an event-driven system. It operates on incoming events that can come from any number of sources. We use the term event for simplicity because what is really arriving is some information about an event that just took place (or is taking place). In the example of a real-time portfolio valuation application, incoming events consist of transactions that add to or subtract from the portfolio holdings and changes in the market prices of securities. These events flow into the CEP engine in real time, and as each event arrives, it passes through the logic contained in the data model that the CEP engine is running, immediately updating any and all result sets that are affected by this new piece of information. Changes to the result sets are then streamed out from the CEP engine to another application or to a dashboard, or used in some other way either to provide high-level information to a user or to trigger an immediate automated response. If the CEP engine also allows the data sets to be queried on demand, it will maintain all current information so that the information is immediately available when a user or an application needs to check the current state of affairs.
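The portfolio valuation example can be sketched in a few lines of code. The following is an illustrative toy, not any vendor's product: two event streams (trades and price ticks) continuously update net positions and mark-to-market gain/loss, and the current state is available on demand. All names here are hypothetical.

```python
# Minimal sketch of event-driven portfolio valuation: each incoming
# event immediately updates the affected results, rather than waiting
# for an overnight batch report.
from collections import defaultdict

class PortfolioEngine:
    def __init__(self):
        self.positions = defaultdict(int)    # symbol -> net quantity
        self.cost = defaultdict(float)       # symbol -> total cost basis
        self.prices = {}                     # symbol -> last market price

    def on_trade(self, symbol, quantity, price):
        """A transaction event adds to or subtracts from a holding."""
        self.positions[symbol] += quantity
        self.cost[symbol] += quantity * price

    def on_price(self, symbol, price):
        """A market-data event moves the mark for that security."""
        self.prices[symbol] = price

    def snapshot(self):
        """Current value and gain/loss, queryable on demand."""
        result = {}
        for symbol, qty in self.positions.items():
            value = qty * self.prices.get(symbol, 0.0)
            result[symbol] = {"position": qty,
                              "value": value,
                              "pnl": value - self.cost[symbol]}
        return result

engine = PortfolioEngine()
engine.on_trade("IBM", 100, 90.0)       # buy 100 shares at 90
engine.on_price("IBM", 95.0)            # price tick revalues the holding
print(engine.snapshot()["IBM"]["pnl"])  # 500.0
```

A real CEP engine does this across hundreds of thousands of events per second and streams the changed result rows out to subscribers, but the shape of the computation is the same.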
It's easy to see how this can be extended to other types of data analysis tasks, to go from reports that tell you how things were to reports that tell you how things are right now. What reports do you regularly run against data in a relational database or data warehouse to aid in your decision-making? How often do you get that report? How old is the data in the report? Would your business benefit if you, or your customers, had access to that information in real time? If the answer to the last question is yes, CEP could very well be the technology that lets you improve the response time of your business.
BAM refers to a class of applications that present a user with a real-time view into their company's business activities such as processes, transactions and operations that span the entire enterprise. This lets the user monitor what's going on around him or her and make timely, informed decisions.
BAM tools have been around for a while, so you might be asking what an emerging technology like CEP has to do with these tried-and-true business monitoring applications. Considering that both technologies can be used to collect information to provide a user with real-time insight, at first glance, they appear competitive. However, CEP and BAM are quite complementary.
Simply stated, BAM is the visible piece of the iceberg while CEP is what's hiding under the surface. Put a little more eloquently, CEP provides the computational engine that can absorb massive amounts of data at very high rates and apply complex analysis logic in real time, producing high-level information that is useful to the user.
BAM tools generally include nice data visualization capabilities along with simple tools for summarizing incoming data. To date, however, they have not had computational engines that can do the heavy lifting. Among other things, while BAM tools present real-time information to the user, there's a wide range in just what is meant by real time. Up to this point, the latency requirements for BAM users have typically been on the order of 15 minutes. However, Gartner predicts that by 2012 that window will nearly slam shut to a compulsory time frame of less than one minute.1
In addition to the ability to execute with near-zero latency, CEP also differs from traditional BAM tools in the level of complexity it can handle. Most BAM products can do simple summaries of event data, display alerts and graphically depict the state of affairs. They don't have tools to perform complex analysis of the incoming data to do things like real-time correlation, rule-based filtering or the application of complex calculations. Finally, there's the issue of scale: how much data can the technology absorb and analyze, in real time, without getting overwhelmed? High-end CEP engines are designed to process hundreds of thousands of messages per second - far more data than a human can comprehend - which is in part what's behind the need for complex analytics: to distill this flood of data into meaningful insight. The increasing need for lower latency, coupled with users' need to absorb and analyze more data and the ability to run complex analytics that derive insight from low-level events, will drive BAM applications in the direction of CEP technology.
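To make the distinction concrete, the kind of rule-based filtering and cross-stream correlation described above might look like the following. This is a hypothetical sketch with made-up thresholds, not any product's API: it flags a symbol when a large trade and a sharp price move on the same symbol occur within a short time window.

```python
# Illustrative correlation of two event streams: a rule-based filter
# drops small trades, then a correlation rule matches each remaining
# trade against price moves on the same symbol within a time window.
def correlate(trades, moves, window=60):
    """trades: (timestamp, symbol, quantity)
    moves:  (timestamp, symbol, pct_change)"""
    alerts = []
    for t_ts, t_sym, qty in trades:
        if abs(qty) < 10_000:          # filter rule: large trades only
            continue
        for m_ts, m_sym, pct in moves:
            # correlation rule: same symbol, sharp move, inside the window
            if m_sym == t_sym and abs(m_ts - t_ts) <= window and abs(pct) >= 2.0:
                alerts.append((t_sym, qty, pct))
    return alerts

print(correlate([(0, "XYZ", 50_000)], [(30, "XYZ", -3.5)]))
# [('XYZ', 50000, -3.5)]
```

A CEP engine evaluates rules like these continuously as events arrive rather than over stored lists, but the logic it applies is of this shape - and it is exactly the kind of logic most BAM dashboards cannot express on their own.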
If CEP is the brain that will underpin the BAM applications of tomorrow, does that mean the current BAM products on the market will disappear? Not likely. The current field of BAM tools provides powerful capabilities around the user interface - facilities to present the information that the CEP engine computes and enable the end user to navigate and digest this information. CEP products to date have not begun to offer the type of data presentation facilities that are present in many of the BAM products on the market. With the integration of a CEP engine and BAM tools, you combine the real-time data absorption and analysis capabilities with strong presentation facilities to interface with the decision-makers.
1. Bill Gassman, "Gartner Study Reveals Business Activity Monitoring's Growing Value," Gartner, April 18, 2006.