Shari would like to thank Soumendra Mohanty for his contribution to this month’s column.
Last month, I argued that the concept of business intelligence (BI) 2.0 promised much but, without the right approach, could fall at the first hurdle: effective data management. Once that hurdle is cleared, what should a company expect to see from a full BI 2.0 program, and what are the essential components required for BI 2.0 to be realized?

The most talked-about impact stems from the fact that BI 2.0 is event driven and real time. The data generated is analyzed at the very moment the business event happens. This data arrives in the form of event streams, messages and alerts. It could include capital market transactions, claims registration, fraud detection and anti-money laundering, product shelf cycle time in larger supermarkets or other information enabling a vast range of business insights. With traditional BI solutions, there is typically a gap between knowing something and doing something about it at the moment the event happens, and this gap is largely attributed to “data latency.” While significant improvement in data latency is a much talked-about feature of BI 2.0, it also enables decision agility by being both individual data-centric and aggregate data-centric. By this I mean that while processing transactional data, BI 2.0 proposes to analyze each individual transactional data point against expected results at an aggregated level and to provide automated alerts so that remedial actions can be taken.
For example, most key performance indicators (KPIs) and metrics give a highly aggregated view of the organization and monitor performance targets based exclusively on past data. However, to understand the gap between aggregated performance measures reported on past data and the execution excellence of the processes behind current individual data, a more granular, real-time view into business processes is required. In a quote-to-ship scenario, sales KPIs may indicate an ever-increasing pipeline while, for some reason, the company struggles to ship quickly enough. These two business functions may be handling issues in seemingly disconnected areas, but at an aggregate level, a customer satisfaction KPI may be negatively affected.
If we are to analyze this scenario, traditional analysis will take longer to pinpoint the problem area because historical data provides clues only about past performance. To understand the reason, there is a need to drill down into the various business processes and examine each activity in those processes. This requires the capability to troubleshoot in real time so that remedial actions can be taken quickly. To do this, the individual data points flowing through the business processes must be continuously analyzed against thresholds set at an aggregate level, with predictive models raising alerts when those thresholds are violated or about to be violated.
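A minimal sketch of this continuous threshold check might look like the following Python. The threshold, the window size and the naive one-step trend extrapolation are all illustrative assumptions for this column, not any particular product's method:

```python
from collections import deque

class ThresholdMonitor:
    """Watches a stream of individual data points against an aggregate
    threshold, alerting on violations and warning when the recent trend
    suggests a violation is imminent. All names and numbers here are
    illustrative, not from any specific BI product."""

    def __init__(self, threshold, window=5):
        self.threshold = threshold
        self.recent = deque(maxlen=window)  # rolling window of recent values

    def observe(self, value):
        self.recent.append(value)
        if value > self.threshold:
            return "ALERT: threshold violated"
        # Naive "predictive" check: extrapolate the recent trend one step.
        if len(self.recent) >= 2:
            per_step = (self.recent[-1] - self.recent[0]) / (len(self.recent) - 1)
            if value + per_step > self.threshold:
                return "WARNING: threshold about to be violated"
        return None
```

In practice the trend extrapolation would be replaced by a real predictive model, but the shape of the loop is the same: every individual data point is scored as it arrives, not batched for later reporting.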
One thing that excites many is that BI 2.0 is forward-looking. It draws inferences from the current situation and enables the data to be understood in the context of its likely future business impact. This requires a series of decision trees and business rules to support real-time decision-making as scenarios change. Injecting real-time business process visibility is just part of the puzzle; users also want to measure the processes associated with those metrics. It is a real challenge to know the financial impact of a process if you don’t understand what’s happening inside it. You can’t fix what you can’t see, and you can’t improve what you can’t measure. To gain insight, the data and the business context need to be preserved, as one event leads to another in the business chain.
Take another example. Picture a company that has just launched a strategic campaign. During the first few days, sales went up as expected, but it was also noticed that few people from the newly targeted customer base bought the products. While the main KPI was “sales figures,” the associated component, “expand customer base,” was left to be analyzed later. The results may be a success from a sales perspective but are certainly a failure for the campaign’s goal of increasing the customer base. Analyzing this event against the campaign in real time would have alerted the business process owners to devise a new mode of communication to that customer base the very next day. From a BI 2.0 perspective, the key benefit realized is the ability to analyze KPIs and tweak in-flight data and/or historical transactions in association with the underlying processes. BI 2.0 can provide predictive insight by analyzing seemingly disparate data and events in real time.
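As a hypothetical illustration of checking the headline KPI and its associated component together in real time rather than weeks apart (the targets, parameter names and alert messages below are invented for this example):

```python
def campaign_check(daily_sales, new_customer_sales, sales_target, new_share_target):
    """Evaluate one day of a campaign: the headline KPI (sales) and its
    associated component (share of sales from newly targeted customers)
    are scored together, so a campaign can succeed on one and fail on
    the other without the failure going unnoticed."""
    alerts = []
    if daily_sales < sales_target:
        alerts.append("sales below target")
    # Guard against a zero-sales day before computing the share.
    share = new_customer_sales / daily_sales if daily_sales else 0.0
    if share < new_share_target:
        alerts.append("campaign not reaching new customer base")
    return alerts
```

Run daily (or per transaction), this would have flagged the scenario above on day one: sales on target, but the new-customer share far below it.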
BI 2.0 is also process-oriented: the data, the event and the business context are all observed together. To achieve this, each data attribute and its value should be both intelligent and automatically aligned to changing business contexts. At a base level, traditional BI tools can interrogate historical process statistics, such as execution completion times, as well as more business-minded values, such as process cost. Advanced process insights might incorporate operational dashboards or alerts so that managers can respond in near real time when certain thresholds are reached. BI 2.0 brings a further level of sophistication by having “in-process” predictive models correlate business metrics and KPIs across both corporate and process data for wider and more meaningful business insights.
Traditional BI approaches and tools have already proven that they can analyze trends and historical information from individual applications and can provide analytical tools that calculate business performance indicators. However, a byproduct of traditional BI is latency: a delay, often anywhere from a week to a month or even longer, between the time data is captured and the time it becomes available to business users. Today, users make business decisions based not just on historical data but also on immediate, midterm and long-term data sources, all combined into a process that can identify related conditions across multiple business applications, analyze the data, send alerts (via the methods of your choice) and update each information repository as required - all in near real time.
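That near-real-time loop (identify related conditions across applications, analyze, alert and update each repository) could be sketched as follows; the callback signatures and the use of plain lists as stand-ins for information repositories are assumptions made purely for illustration:

```python
def process_events(events, correlate, notify, repositories):
    """Drive the monitor-analyze-alert-update loop described above.

    events:       iterable of (app_name, record) pairs arriving from
                  multiple business applications.
    correlate:    function (app_name, record, repositories) -> alert
                  message or None; this is where cross-application
                  analysis and predictive scoring would live.
    notify:       callback invoked with each alert (email, dashboard,
                  message queue - whatever the business chooses).
    repositories: dict mapping app_name to a list standing in for that
                  application's information repository."""
    for app, record in events:
        repositories.setdefault(app, []).append(record)  # update the store
        alert = correlate(app, record, repositories)      # cross-app analysis
        if alert:
            notify(alert)
```

The point of the sketch is the ordering: the repository is updated and the correlation runs as each event arrives, so the analysis never waits on a weekly or monthly load.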
BI 2.0 has a lot of potential to help enterprises become intelligent enterprises. A few years back, informed decision-making and cross-functional data integration were the most prevalent themes among companies. Even today, daily BI, automated decision-making and real-time analytics are in the early adoption stage. There is no doubt that these themes will become mainstream; the issues are how and when. How should enterprises approach BI 2.0 implementations, and when is the right time to do so?
Discerning the business opportunities appropriate to BI 2.0 is difficult but will get easier as we learn from more successful examples. Over time, BI 2.0 will be accepted as common practice because the demands of the global economy require ever better ways of doing business. In our world, change may be the only constant; in our business environment, continuous improvement may be the only viable goal. Implementing BI 2.0 is not an option; it will increasingly become a requirement.