Larry would like to thank Tim Perry for his contributions to this month's article.

The demand for real-time event analysis is constantly increasing. As consumers, we have high expectations that the businesses we deal with can identify and react to our transactions and issues instantaneously. Consumers and businesses alike are starting to assume that their suppliers are monitoring their activity and will proactively come to their rescue. Organizations are trying to fulfill these demands by building applications that provide real-time, aggregated transaction information that is systematically actionable and repeatable and can handle the unexpected occurrences that will inevitably arise.

To achieve this, we focus on evaluating and reacting to the data generated by a specific series of events. The traditional delays in getting data from transactional applications into a reporting or analysis repository are becoming less acceptable given these real-time demands. We acknowledge that real time means different things to different organizations; real time might be twice daily in some cases. Thus, a variety of solutions have been developed over the past several years to address these demands. Solutions such as business process management (BPM), business rules engines (BRE), operational business intelligence (OBI), DW 2.0, BI 2.0, business activity monitoring (BAM) and the new kids on the block, event stream processing (ESP) and complex event processing (CEP), have been promoted, implemented with varying levels of success and tested for market traction.

Most of these solutions have client success stories and some industry awareness, but few, if any, have attracted the kind of attention you would expect from the real-time data revolution this could be. Data warehousing and business intelligence (BI) have always been an alphabet soup of technologies because there has never been a truly holistic solution. Data warehouses in a box and virtual data warehouses failed to capture attention, so we have integrated database, extract, transform and load (ETL), online analytical processing (OLAP), reporting and analytics technologies to compensate. It seems to me that the new set of BI requirements - and, for that matter, customer intelligence requirements - will wind up with similar integration challenges.

As this list of requirements, and the acronyms that address them, continues to grow, I question whether these solutions are really revolutionary in their own right or whether they are primarily differentiated by marketing slogans used to create new products, increase market buzz and ultimately confuse corporate IT organizations into additional purchases.

ESP and CEP solutions, while not exactly the same, both tout a nonlinear, dynamic, closed-loop approach: they monitor streams of data as those streams pass anywhere within the IT infrastructure, seek out patterns of predefined significance and react to those patterns. More distinctly, CEP is often described as a superset of ESP, because CEP claims to handle the interrelationship of events over any period of time rather than following the ESP method of event > condition > action. Both of these solutions bring with them new software requirements and even suggest the need for a new architectural approach - event-driven architecture (EDA).
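
To make that distinction concrete, consider the following minimal Python sketch. It is purely illustrative - every name in it is invented, and it is not tied to any vendor's product - but it contrasts an ESP-style rule, which tests each event in isolation, with a CEP-style rule, which correlates related events across a time window:

    from collections import defaultdict, deque
    from dataclasses import dataclass, field
    import time

    @dataclass
    class Event:
        kind: str      # e.g., "login_failed" or "withdrawal"
        account: str
        amount: float = 0.0
        ts: float = field(default_factory=time.time)

    # ESP style: each event is tested on its own (event > condition > action).
    def esp_rule(event):
        if event.kind == "withdrawal" and event.amount > 10_000:
            print(f"ESP alert: large withdrawal on {event.account}")

    # CEP style: correlate several events over a sliding time window.
    class CepRule:
        """Flag a withdrawal preceded by 3+ failed logins within 5 minutes."""
        WINDOW = 300  # seconds

        def __init__(self):
            self.failures = defaultdict(deque)  # account -> failure timestamps

        def on_event(self, event):
            q = self.failures[event.account]
            while q and event.ts - q[0] > self.WINDOW:  # expire old failures
                q.popleft()
            if event.kind == "login_failed":
                q.append(event.ts)
            elif event.kind == "withdrawal" and len(q) >= 3:
                print(f"CEP alert: suspicious withdrawal on {event.account}")

    cep = CepRule()
    for e in [Event("login_failed", "acct-1"),
              Event("login_failed", "acct-1"),
              Event("login_failed", "acct-1"),
              Event("withdrawal", "acct-1", amount=500.0)]:
        esp_rule(e)      # never fires: no single event is significant
        cep.on_event(e)  # fires on the final withdrawal

The ESP rule forgets each event as soon as it has been evaluated; the CEP rule's value lies in the state it keeps across events, which is why CEP is pitched as the superset.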

The fact is, we have been capturing, processing, reporting and acting on events for quite some time. Campaign management tools have offered event-based campaigns for a while, even if only something as basic as sending new communications to responders and nonresponders. Are these new products helping the situation or causing new complexity? With complexity usually comes higher cost and lower true ROI.

It seems to me that for a true real-time, event-based analytical solution, you'll need the following components (a rough sketch of how they might fit together follows the list):

  • A real-time streaming infrastructure - very similar to enterprise application integration (EAI).
  • The ability to process real-time data into interesting information - very similar to ETL.
  • An operational data store (ODS) or other database to store real-time and near real-time information.
  • A visualization engine that scales to hundreds or thousands of users - similar to current reporting and dashboard tools or the new dashboard/BI appliances.
  • A rules engine that can automatically react to specific instances.
  • An application integration infrastructure that can integrate real-time analytics into operational systems - similar to any packaged or custom application.
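
As a rough illustration of how these components relate, here is a minimal, hypothetical Python sketch. Every name in it is invented for illustration: an in-memory SQLite table stands in for the ODS, and plain functions stand in for the EAI bus, the ETL step and the rules engine:

    import json
    import sqlite3

    def transform(raw):
        """ETL-like step: turn a raw message into interesting information."""
        record = json.loads(raw)
        record["large_order"] = record.get("amount", 0) > 1_000
        return record

    def react(record):
        """Rules-engine step: automatically respond to specific instances."""
        if record["large_order"]:
            print(f"notify the account team: order {record['order_id']}")

    # ODS stand-in (an in-memory SQLite table, purely for illustration).
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (order_id TEXT, amount REAL, large_order INT)")

    def handle(raw):
        record = transform(raw)                       # process into information
        db.execute("INSERT INTO orders VALUES (?, ?, ?)",
                   (record["order_id"], record["amount"],
                    record["large_order"]))           # store in the ODS
        react(record)                                 # rules engine reacts
        # A visualization engine would query the orders table from here.

    # The streaming/EAI layer would invoke handle() for each arriving message:
    handle('{"order_id": "A-17", "amount": 2500}')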

So do you need a new piece of software or a brand-new architecture?
There is obvious benefit in these various technology categories. Before buying into any of these paradigms, you should probably ask a few questions:

  1. Do I already have competencies in real-time messaging and streaming? If you do, you may not need a new application. If you don't, these products may shorten the learning curve.
  2. Can my reporting infrastructure handle operational BI, scaling to hundreds or thousands of users? If it cannot, these tools may be able to scale without forcing you to be a performance guru.
  3. Can users easily identify or specify events to track? If they can't, these tools may help you identify and monitor events without IT involvement.
  4. What does real time mean to me? How fast do I need to make decisions? Do I have the people or the processes to react in real time?

Remember, because most of us aren't ready to completely automate decision-making with rules technology, a real-time decision-making infrastructure is only as good as an organization's ability to analyze, process and react to the information. 
