We live in real time, minute by minute. News is no longer delayed by days; it is streamed in real time. We bank online and check our real-time balances. We book flights with real-time visibility of seat availability, and we select the seat we want online in real time. All these transactions generate data - lots of data.
To allow us to adapt our business models to today's real-time world, software applications are now built using event-driven technologies. Data moves around in real time over service-oriented architectures (SOAs), using loosely coupled and highly interoperable services that promote standardized application integration.
Yet business intelligence (BI) has not changed in concept since the invention of the relational database and the SQL query - until the advent of BI 2.0.
BI 2.0 is a term that encapsulates several important new concepts about the way that we use and exploit information in businesses, organizations and government. The term is also intrinsically linked with real-time and event-driven BI but is really about the application of these technologies to business processes.
At the heart of this architecture are events, typically carried as XML messages. Ultimately, most modern processes are themselves actioned by events. Consequently, when you think about how to add intelligence to modern processes, the humble SQL query looks far from ideal.
The traditional data warehouse has enabled significant advances in our use of information, but its underlying architectural approach is now being questioned. Its architecture limits our ability to optimize every business process by embedding BI capabilities within. We need to look to event-driven, continuous in-process analytics to replace batch-driven reporting on processes after the fact.
In short, how can we build smarter business processes that give our organizations competitive advantage? How can we build the intelligent business?
The Client/Server Legacy
The BI tools most organizations use today were designed to solve a problem that arose in the early 1990s with the spread of the relational database. As more information was stored in databases, simply extracting it became a chore for IT departments because most users weren't interested in becoming experts in writing SQL queries. Getting the data out of databases truly became an end in itself and drove the rise of BI as we know it. Consequently, BI tools today focus on the presentation of data.
As it turns out, though, extracting data that is hours or days old and publishing it into reports, while useful, doesn't provide clear guidance on what users should do right now to improve business performance. As a result, at many companies, BI users don't even review the reports that are sent to them - they relegate them to reference documents. This is often expressed by users who complain that the information arrives too late to be really useful.
Strikingly, this is the antithesis of the real-time, actionable intelligence that many organizations need in order to provide the quality of service customers demand. In most industries, such information is simply a day late and a dollar short. In retail, for example, three to four percent of potential revenue is forgone because items are not adequately stocked at all times. The store manager is sent a stock report, but it arrives the next morning, after the close of business and too late to replenish the shelves.
Faster data warehouse queries or prettier dashboard reports - the focus of BI system improvements until now - clearly do not begin to solve the problem because they do not get to the heart of the architectural issue. It is undeniably the case that by the time data has been entered into the data warehouse and then extracted, it is out of date. This isn't a problem for some applications, but it is terminal for those that must run off real-time or near real-time knowledge.
A common misconception is that real-time data isn't needed because there is no way that operations teams could analyze it. This is applying BI 1.0 thinking; simply delivering more reports faster doesn't solve the problem. What's needed is a way to put relevant insight into the hands of operations staff in time to make a difference to day-to-day operations.
Reports are not the optimal deliverable of BI systems. Reports need analyzing and interpreting before any decisions can be made, and there is evidence that users don't look at them until they already know they have a problem. Rather than reporting on the effectiveness of a process after the fact, BI should be used within the process as a way of routing workflow automatically, based on what a customer is doing. To do this, you must not only capture data in real time but also analyze and interpret it.
This is essentially event-driven BI - analyzing up-to-the-minute data in the context of historic information - so that actions can be initiated automatically. The data warehouse isn't good at this. Perhaps it is simply being asked to support functions it was not designed to handle.
BI Services Arrive
Over the past few years, companies have started to present their data warehouses as Web services for use by other applications and processes connected by SOA or middleware such as an enterprise service bus (ESB). A fundamental limitation to this approach is that the data warehouse is the wrong place to look for intelligence about the performance of a current process. Real-time process state data, so relevant to this in-process intelligence, is unlikely to be in the data warehouse anyway.
Even layering a BI dashboard onto the data warehouse is inadequate for many operational tasks because it relies on a user noticing a problem in out-of-date data. Dashboards aggregate and average. They remove detail and context and present only a view of the past. Decisions require detail and must be made in the present.
It's clear that data warehouses will remain, but their role can be clarified as the system of record rather than the only place BI is done. Reporting and presentation of historical data will continue to happen there - that is what the warehouse was designed for. Given the challenges of moving to a real-time data warehouse, however, the information required to support and drive daily operational decisions must come from a different approach - one that avoids the latency introduced by the extract, transform, load and query cycle.
The Vision for BI 2.0
If the goal of BI 2.0 is to reduce latency - to cut the time between when an event occurs and when an action is taken - in order to improve business performance, existing BI architectures will struggle.
With BI 2.0, data isn't stored in a database or extracted for analysis; BI 2.0 uses event-stream processing. As the name implies, this approach processes streams of events in memory, either in parallel with actual business processes or as a process step itself.
Typically, this means looking for scenarios of events, such as patterns and combinations of events in succession, which are significant for the business problem at hand. The outputs of these systems are usually real-time metrics and alerts and the initiation of immediate actions in other applications. The effect is that analysis processes are automated and don't rely on human action, but can call for human action where it is required.
BI 2.0 gets data directly from middleware, the natural place to turn for real-time data. Standard middleware can easily create streams of events for analysis, which is performed in memory. When these real-time events are compared to past performance, problems and opportunities can be readily and automatically identified.
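As a minimal sketch of what in-memory event-stream processing can look like, the following hypothetical Python example processes a stream of events one by one and flags a significant scenario - several failed payments from the same customer in quick succession. The event fields, window size, and threshold are illustrative assumptions, not any specific product's API.

```python
from collections import defaultdict, deque

# Hypothetical sliding-window pattern detector: flag any customer with
# three or more "payment_failed" events inside a 60-second window.
WINDOW_SECONDS = 60
THRESHOLD = 3

failures = defaultdict(deque)  # customer_id -> timestamps of recent failures

def on_event(event):
    """Process one event as it arrives; return an alert dict or None."""
    if event["type"] != "payment_failed":
        return None
    ts = event["ts"]
    window = failures[event["customer_id"]]
    window.append(ts)
    # Drop timestamps that have fallen out of the time window.
    while window and ts - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= THRESHOLD:
        return {"customer_id": event["customer_id"], "failures": len(window)}
    return None

# Simulated event stream; in practice these would arrive from middleware.
stream = [
    {"type": "payment_failed", "customer_id": "c1", "ts": 0},
    {"type": "payment_ok",     "customer_id": "c2", "ts": 10},
    {"type": "payment_failed", "customer_id": "c1", "ts": 20},
    {"type": "payment_failed", "customer_id": "c1", "ts": 45},
]

alerts = [a for a in map(on_event, stream) if a]
print(alerts)  # one alert for customer c1
```

Because state lives in memory and each event is handled as it arrives, the alert fires the moment the third failure occurs, rather than in a report the next day.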
To make a difference to the bottom line, businesses need to make processes smarter. This means either building decision-making capability into automated processes or providing operations staff with actionable information and changing day-to-day standard operating procedures so that they are data driven. The solution is to combine the messaging technologies underpinning transactional systems, business process management and SOA with event-driven, real-time BI technologies. These fit together naturally; you can think of real-time BI as analysis services in an SOA world.
BI 2.0 needs to work with both well-defined and loosely defined processes. In fact, the majority of processes today are not explicitly modeled using business process management - business users often can't describe their processes accurately enough - and yet these operational processes still need intelligence.
BI 2.0 is driven by this need for intelligent processes and has the following characteristics:
- Event driven. Automated processes are driven by events; therefore, it is implicit that in order to create smarter processes, businesses need to be able to analyze and interpret events. This means analyzing data, event by event, either in parallel with the business process or as an implicit process step.
- Real time. This is essential in an event-driven world. Without it, it is hard to build in BI capabilities as a process step and nearly impossible to automate actions. By comparison, batch processes are informational - they report on the effectiveness of a process but cannot be part of the process itself unless time is not critical. Any application that involves trading, dynamic pricing, demand sensing, security, risk, fraud, replenishment or any form of interaction with a customer is a time-critical process and requires real-time processing.
- Automate analysis. In order to automate day-to-day operational decision-making, organizations need to be able to do more than simply present data on a dashboard or in a report. The challenge is turning real-time data into something actionable. In short, businesses need to be able to automatically interpret data, dynamically, in real time. What this means in practice is the ability to compare each individual event with what would normally be expected based on past or predicted future performance. BI 2.0 products, therefore, must understand what normal looks like at both individual and aggregate levels and be able to compare individual events to this automatically.
- Forward looking. Understanding the impact of any given event on an organization needs to be forward looking. For example, questions such as "Will my shipment arrive on time?" and "Is the system going to break today?" require forward-looking interpretations. This capability adds immediate value to operations teams that have a rolling, forward-looking perspective of what their performance is likely to be at the end of the day, week or month.
- Process oriented. Embedding BI within a process to make the process inherently smarter requires that BI 2.0 products be process oriented. This doesn't mean the process must have been modeled with a business process management tool. Actions can be optimized based on the outcome of a particular process, but the process itself may or may not be explicitly defined.
- Scalable. Scalability is naturally a cornerstone of BI 2.0 because it is based on event-driven architectures. This is critical because event streams can be unpredictable and occur in very high volumes. For example, a retailer may want to build a demand-sensing application to track the sales of every top-selling item for every store. The retailer may have 30,000 unique items being sold in 1,000 stores, creating 30 million store/item combinations that need tracking and may be selling 10 million items per day. Dealing with this scale is run of the mill for BI 2.0. In fact, this scalability itself enables new classes of applications that would never have been possible using traditional BI applications.
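The characteristics above - comparing each event to what normal looks like for its store/item combination, and projecting forward to end of day - can be sketched in a few lines of Python. The rates, store hours, and function names here are illustrative assumptions for the retail demand-sensing example, not a real system's interface.

```python
# Illustrative per-(store, item) demand sensing: each sale event is compared
# with that combination's historical hourly rate, and a simple linear
# projection estimates end-of-day volume. All numbers are assumptions.

# Historical "normal": expected units sold per hour for each (store, item).
expected_per_hour = {
    ("store_1", "item_A"): 5.0,
    ("store_1", "item_B"): 2.0,
}

sold_today = {}  # (store, item) -> units sold so far today

def on_sale(store, item, qty, hour_of_day):
    """Update running totals and flag combinations running hot or cold."""
    key = (store, item)
    sold_today[key] = sold_today.get(key, 0) + qty
    expected_so_far = expected_per_hour.get(key, 0.0) * hour_of_day
    if expected_so_far == 0:
        return None
    ratio = sold_today[key] / expected_so_far
    if ratio > 1.5:
        return (key, "running hot", ratio)
    if ratio < 0.5:
        return (key, "running cold", ratio)
    return None

def projected_end_of_day(store, item, hour_of_day, hours_open=12):
    """Forward-looking estimate: extrapolate today's rate to closing time."""
    sold = sold_today.get((store, item), 0)
    return sold / hour_of_day * hours_open

# By hour 4, store_1 has sold 40 units of item_A against an expected 20.
alert = on_sale("store_1", "item_A", 40, hour_of_day=4)
forecast = projected_end_of_day("store_1", "item_A", hour_of_day=4)
print(alert, forecast)
```

At the scale described above, the per-key state is just a dictionary keyed by 30 million store/item combinations - a memory-resident structure rather than a warehouse query - which is why event-driven architectures handle these volumes routinely.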
Real-Time, Event-Driven BI
BI 2.0 represents both a bold new vision and a fundamental shift in the way businesses can use information. It extends the definition of BI beyond the traditional data warehouse and query tools to include dynamic in-process and automated decision-making.
In the past, organizations have been forced to rely on out-of-date information and to attempt to fix problems long after they occur. BI 2.0 changes that because it allows BI capabilities to be built into processes themselves - in short, it lets companies create smarter processes.
When BI steps up to identifying problems and initiating corrective actions, not just presenting data, it has definitely evolved. It is ever closer to providing really useful information that can make a difference to the bottom line. Isn't this what BI was supposed to be all along?