As human beings, we intuitively understand the importance of events. But in an IT environment, understanding business events and analyzing what actions should be taken as a result of a certain series of events are new developments. The logical pathways that lead an application to interpret meaning are still being formed, and the understanding of how certain activities should ensue is growing.
A more complete understanding of event streams comes at the intersection of real-time event-driven architecture and the traditional world of business intelligence (BI). It's the border between the two territories that many businesses are just beginning to cross, and it's clear that BI is going to play a key role in predictive business. It does require a completeness of vision and a level of usability, however, that few solutions offer today.
While event processing concentrates on looking for recognized patterns in the real-time event stream, BI tends to focus on historical data, which has often been likened to driving a car by looking in the rear-view mirror. Bring those two worlds together, though, and you have a powerful new capacity for exploratory analysis in real time. Make that functionality easy enough for business users to pick up and play with, and for perhaps the first time, you are handing the tools to the people who really understand what's happening and can therefore react accordingly and plan for what may happen next.
Traditional BI products have often been used as a data source, with business users running reports and then dumping the data into Excel for further analysis in an environment with which they feel comfortable. Once they are given an intuitive environment that quickly spots patterns and relationships, and once they learn more about the application they want to monitor, they will be in a position to interact directly with the data and build the rules themselves.
The next generation of business management must focus on this vision of not only reacting to today's pressures but also understanding what's coming next and planning accordingly. Some fundamental flaws exist in the approaches businesses are using to build applications that spot patterns in the events around, and activities of, their customers, employees and suppliers. These flaws stem from the inability to combine the right historical data with real-time events, and from the fact that most BI tools lack the rich and rapid discovery capabilities needed by business users.
Next-generation BI meshes well with the concept of predictive business because it can improve the quality and richness of what ought to be monitored, integrated and connected, and of the patterns that IT and business systems should be seeking. Additionally, when monitoring shows that particular conditions have arisen, it can be used to discover and diagnose why they are occurring and to offer a path to corrective and preventive action. Used correctly, BI can increase our understanding of what has happened and then feed back into the model to form a virtuous circle.
Consider the example of a fraud application for a bank. An event-driven approach today will monitor activity, and if it spots a set of activities that matches a known pattern of fraudulent behavior, it will trigger an exception process to manually determine whether the activity was in some way illegal. The trouble is that fraudsters quickly change their operating patterns to stay ahead of this kind of detection. And as fraudulent activity increases, so does the number of instances that require manual sorting at the end of the trading period. This quickly becomes prohibitive.
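The pattern-matching step described above can be sketched as a sliding-window scan over an event stream. The fraud signature here (three or more foreign ATM withdrawals from one account within ten minutes), the event field names and the thresholds are all illustrative assumptions, not a real bank's rules.

```python
from collections import deque

WINDOW_SECONDS = 600   # assumed window size
THRESHOLD = 3          # assumed match count

def detect(events):
    """Scan a time-ordered event stream and flag accounts whose
    recent activity matches the hypothetical fraud signature."""
    recent = {}   # account -> deque of timestamps of matching events
    alerts = []
    for e in events:
        if e["type"] != "atm_withdrawal" or not e["foreign"]:
            continue
        q = recent.setdefault(e["account"], deque())
        q.append(e["ts"])
        # Drop events that have fallen outside the time window.
        while q and e["ts"] - q[0] > WINDOW_SECONDS:
            q.popleft()
        if len(q) >= THRESHOLD:
            alerts.append((e["account"], e["ts"]))
    return alerts
```

A static signature like this illustrates the weakness the article points out: once fraudsters change their behavior, the hard-coded pattern stops matching until someone rewrites it.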
With a much deeper level of functionality, an application can go one step further, but that requires the following key elements:
- BI that takes the data and maps different relationships to spot particular scenarios,
- An integration infrastructure where multiple sources of data can be pulled together, including traditional data warehouses,
- A complex event processing architecture to look for sophisticated patterns of activities,
- A business process management layer to kick off both machine and human-oriented exception processes, and
- Rich analytics capabilities to empower end users to ask and answer almost any question.
BI plays a vital role in uncovering new insights, opportunities and risks across various data sources, and then in determining which of the resulting models prove to be of value. Downstream of the application, it pools data from multiple sources to see what has actually happened.
Analytic applications are also able to determine which scenarios are worth modelling and, by integrating with many different data sources, can come up with a set of hypotheses that look for relationships between data and the patterns that might indicate fraudulent activity. That analysis can be fed into a series of rules running in production, which are evaluated as trading activity happens. When certain conditions are met, the system can trigger a number of exception processes. One of these processes might initiate a series of analysis sessions that examine particular situations to determine whether they were actually fraudulent and then further inform the model.
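The rules-plus-exception-process loop described above can be sketched minimally: each rule is a predicate over an event, and a match kicks off an exception process (represented here by a callback). The rule names, thresholds and event fields are invented for illustration.

```python
def make_monitor(rules):
    """rules: list of (name, predicate, handler) triples.
    Returns a function that evaluates every rule against an
    incoming event and fires the handler for each match."""
    def on_event(event):
        fired = []
        for name, predicate, handler in rules:
            if predicate(event):
                handler(event)   # e.g. open a case or start a human review
                fired.append(name)
        return fired
    return on_event

# Hypothetical exception processes: here they just record a case.
cases = []
rules = [
    ("large_trade", lambda e: e["amount"] > 10_000,
     lambda e: cases.append(("review", e["id"]))),
    ("after_hours", lambda e: not 9 <= e["hour"] < 17,
     lambda e: cases.append(("flag", e["id"]))),
]
monitor = make_monitor(rules)
```

Findings from the downstream analysis sessions would feed back in simply by replacing or re-parameterizing entries in the `rules` list, which is the virtuous circle the article describes.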
As the business environment becomes increasingly complex, CIOs are facing a host of serious concerns. Disaster management, once a relatively low priority, has become a necessity. Effective security management has become more and more compelling as destructive viruses and malicious intrusions have multiplied. In addition, enterprises are presented with ever-shorter deadlines for project completion and challenged by the universal scarcity of skilled personnel. The integration of new and legacy applications is still important, but it has been joined by the need to integrate partners, suppliers and clients. Fixing problems immediately is still a goal, but companies are now looking to fix the problem before it happens. Predictive business builds on the ability of real-time business applications to rapidly identify a situation, make a quick decision and then take action, helping companies to improve their ability to service customers and drive profits.
The roadblock for businesses wanting to achieve predictive business is that traditional BI applications are rarely up to the job. They tend to rely on data warehouses or data cubes, and routinely run programs at night to crunch data so that users can easily and quickly access it in preformatted reports online. This is fine in a production reporting environment, where users drill into the data along predefined, logical drill-down paths. Using a sales example, the user might want to look first at national sales, then examine region, territory and representative sales, and finally drill into an individual's performance. If at any point in that analysis the user wanted to know what was affecting sales or to consider other variables that weren't prepared for this report, however, he would need to go back to the IT department and ask them to add data for analysis or configure a new data cube. This often causes delays of a few hours at best, and frequently of days or weeks.
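The fixed drill-down path in the sales example amounts to pre-aggregating records along a rigid hierarchy. A minimal sketch, with invented field names and figures, shows both how the rollup works and why it is rigid: any question outside the chosen hierarchy requires building a new aggregation.

```python
from collections import defaultdict

def rollup(records, levels):
    """Aggregate sales at every prefix of the drill-down path,
    e.g. (region,), (region, territory), (region, territory, rep)."""
    totals = defaultdict(float)
    for r in records:
        key = ()
        for level in levels:
            key = key + (r[level],)
            totals[key] += r["sales"]
    return totals

# Illustrative records only.
records = [
    {"region": "West", "territory": "CA", "rep": "Lee", "sales": 100.0},
    {"region": "West", "territory": "CA", "rep": "Kim", "sales": 50.0},
    {"region": "East", "territory": "NY", "rep": "Ito", "sales": 75.0},
]
totals = rollup(records, ["region", "territory", "rep"])
```

A user can drill from `totals[("West",)]` down to `totals[("West", "CA", "Lee")]`, but a question cutting across a dimension that was never aggregated (say, sales by product line) means going back to IT for a new cube, which is exactly the delay the article describes.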
Traditional BI systems are architected this way because they assume that users are interested in a certain set of predefined variables and are looking at known issues and questions. In a more complex environment, however, users don't necessarily know the questions they want to ask or the data sources they will want to interrogate. They want to be able to pull different data sources into their analytic application on the fly and easily visualize the data that the system is presenting to them. And they want to do it fast. The predictive business wants to put these tools into the hands of business users in their day-to-day work, without recourse to the IT department whenever they want to add a new data source. In doing so, businesses can move beyond real-time business and become a predictive enterprise, making pre-emptive decisions regarding potential threats and opportunities and staying ahead of the curve.
IRS Spotlights Credit Card Fraud
When you're working through 12 million credit card records to determine which might be fraudulent, you need some way of qualifying that data into a more manageable quantity for further analysis. That was the challenge faced by Eastport Analytics in its work with the IRS on its Offshore Credit Card Program. The company required a new kind of BI tool that could demonstrate a repeatable process and fit it into a legal framework. "In essence, the analysis becomes evidence that can be used to substantiate a case," says Jonathan Adams, principal of Eastport Analytics.
Eastport uses a range of tools in its lab, but found that a BI tool with a visual data analytics application proved particularly helpful in bridging the gap between BI and business process management. Adams says: "We deliver insight, context and domain knowledge. Now our analysts can quickly roll around in the data in an exploratory phase where we're trying to determine the right combination of those three elements for any given problem."
Eastport created around 40 different guides to capture the analytical knowledge of its analysts and look for specific kinds of behavior. Because the guides already exist, whenever Eastport receives new datasets from the IRS it no longer needs to create new templates or pay an engineer to write SQL code. In addition, analysis now takes three to five days as opposed to two to three weeks.