Operational analysis is part of the next generation of business intelligence (BI) processes and software. It applies business performance management (BPM) thinking more broadly, to organizational processes of all kinds. Implementing operational analysis effectively means drawing on lessons from operations research and process improvement.

With the supporting tools and processes available today, BPM has made the financial disciplines of budgeting, planning and measuring performance against budget a successful and well-supported exercise.

BPM builds on the BI processes of data integration, data warehousing and data analysis. One of the tenets of operational analysis is expanding BI/BPM within the organization, rather than reserving it for upper-level management and business analysts. That expansion provides people with insights directly relevant to their positions in the organization, so they can act on them in a timely manner. Business activity monitoring (BAM) shares the process viewpoint of operational analysis, but emphasizes real-time views of process metrics, with triggers and alerts sent or presented so that people or systems can react to correct process errors or improve processes in flight.
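
As a minimal sketch of the BAM pattern just described, the following Python fragment checks a stream of process metric readings against thresholds and raises an alert when one is breached. The metric names, thresholds and readings are invented for illustration; a real system would consume events from a queue and route alerts to people or systems.

    # BAM-style trigger sketch: watch process metric readings and alert
    # when one crosses its threshold. All names and numbers are invented.
    THRESHOLDS = {
        "er_wait_minutes": 30,   # alert if ER waiting time exceeds 30 minutes
        "open_orders": 500,      # alert if the order backlog exceeds 500
    }

    def check_reading(metric, value):
        """Return an alert message if the reading breaches its threshold."""
        limit = THRESHOLDS.get(metric)
        if limit is not None and value > limit:
            return f"ALERT: {metric} = {value} exceeds limit {limit}"
        return None

    # In a real deployment, readings would arrive from an event stream;
    # here they are hard-coded for illustration.
    for metric, value in [("er_wait_minutes", 42), ("open_orders", 120)]:
        alert = check_reading(metric, value)
        if alert:
            print(alert)  # in practice: notify staff or trigger a workflow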

In a broader sense, operational analysis can be seen as supporting operations research, a long-established discipline. Here is a timeline of its antecedents:

  • Charles Babbage's analysis of mail handling costs, which contributed to the Penny Post, in the first half of the 1800s;
  • Scientific management from the early 1900s, which led to sales engineering, marketing as a discipline, and time-and-motion analysis;
  • Total quality management in the 1970s and 1980s;
  • Business process reengineering in the late 1980s and early 1990s; and
  • Six Sigma, originated by Motorola and popularized by General Electric, and its offshoot, Lean Six Sigma.

Process improvement methodologies point to what should be done to analyze, measure and improve processes. Operational analysis, with its BI underpinnings, allows process data to be captured and analyzed productively and embeds process metrics in the organization. Process improvement projects often require the generation and integration of new and existing process data; without automation, that means falling back on BI via spreadsheets, which is unproductive and erratic in its results.

An example of process improvement with automation is a Six Sigma initiative at a hospital that aimed to improve the flow of stroke patients entering the emergency room. The drug tissue plasminogen activator (tPA) is highly effective at improving some stroke patients’ recovery, but it has severe side effects, so a strict diagnosis process, along with the availability of appropriate staff and equipment, is required before tPA can be administered. tPA can also only be administered within a short time of the stroke occurring, so the process has to be quick.

The hospital was seeing stroke patients but was not administering tPA, because of what appeared to be process problems. To diagnose the problem, the process was mapped and tracked, and metrics and goals were established. Data capture from a variety of systems and spreadsheets, data analysis and daily reporting identified blockages in the admission and diagnosis process for stroke patients, and provided a feedback loop for monitoring and improving the ongoing process.
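
As a hedged sketch of the kind of analysis involved: given timestamped events captured for each patient, compute the elapsed time in a process stage and flag the cases that exceeded the goal. The field names, times and 60-minute goal below are assumptions for illustration, not the hospital's actual data.

    from datetime import datetime

    # Timestamped process events per patient, as they might be captured
    # from admission and lab systems. All names and times are invented.
    events = [
        {"patient": "A", "arrived": "2010-03-01 08:00", "diagnosed": "2010-03-01 08:40"},
        {"patient": "B", "arrived": "2010-03-01 09:10", "diagnosed": "2010-03-01 10:45"},
    ]

    GOAL_MINUTES = 60  # assumed goal for admission-to-diagnosis time
    FMT = "%Y-%m-%d %H:%M"

    for e in events:
        elapsed = (datetime.strptime(e["diagnosed"], FMT)
                   - datetime.strptime(e["arrived"], FMT)).total_seconds() / 60
        status = "OK" if elapsed <= GOAL_MINUTES else "BLOCKED"
        print(f'{e["patient"]}: {elapsed:.0f} min ({status})')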

Without automation, the manual data capture and analysis of the problem would have been difficult, error prone and inefficient (days of effort to produce one week’s worth of results). After improvements, the staff could look at summary and detailed information on individual events and patient histories in order to identify and address process blockages associated with administering tPA.

This example of operational analysis bears a strong resemblance to a finance-focused BPM process, with metrics and goals (budget and key performance indicators (KPIs)) tracked against process performance (actuals) and presented in a way that is understandable and accessible to staff.
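
That budget-versus-actuals analogy can be made concrete. A minimal sketch, with invented KPI names and numbers: each KPI carries a goal, and each period's actual is compared against it, just as a finance BPM process compares actuals to budget.

    # Track process KPIs against goals, the way finance BPM tracks actuals
    # against budget. All KPI names and numbers are invented.
    kpis = {
        "pct_eligible_patients_given_tpa": {"goal": 90, "actual": 74},
        "avg_admission_to_diagnosis_min":  {"goal": 60, "actual": 82},
    }

    for name, k in kpis.items():
        variance = k["actual"] - k["goal"]
        print(f"{name}: goal={k['goal']} actual={k['actual']} variance={variance:+}")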

While the example covered a subprocess of the broader emergency room admission and diagnosis process, the approach to analyzing the tPA/stroke process could easily be replicated for that broader process, with higher-level metrics but the same methodology, data and tools.

So how can you get started with operational analysis? The answer is in the same realm as “how do I get BI/BPM off the ground?”

A big-bang approach, in which a process improvement methodology is imposed across the whole organization, can work, but it is risky and often yields mixed results. Training in the methodology is required, and applying it successfully on a limited scale first is usually the better approach. Early successes build support within the organization for adopting the methodology more broadly. It is imperative that these exercises be driven by the people within the organization who own the processes. They:

  • Know their organization and processes,
  • Have current process issues, so they know where to start,
  • Can quantify the benefits of process improvement,
  • Are affected by the process change,
  • Can drive process change, given their level of responsibility, and
  • Can devote budget to process change.

The benefits of the process improvement have to be quantifiable, and they form the basis of the KPIs that are to be tracked. In the example, improving the tPA/stroke process was justified by:

  • Better patient outcomes (less time in hospital, better quality of life compared to non-tPA patients) and
  • Certification in stroke care by professional bodies and insurers, leading to more patients and higher revenue.

A sales example is improving lead quality to the point where involving a salesperson increases close rates by a significant, quantifiable percentage. Without projected, quantifiable improvements or support from the right level of process owner, process changes and improvements do not happen.
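
To make "quantifiable" concrete for the sales example, here is a back-of-the-envelope calculation; every number is an assumption for illustration.

    # Back-of-the-envelope justification for the lead-quality example.
    # Every number here is an invented assumption.
    leads_per_month = 400
    close_rate_before = 0.05   # 5 percent of leads close today
    close_rate_after = 0.08    # projected 8 percent with better-qualified leads
    revenue_per_deal = 20_000

    extra_deals = leads_per_month * (close_rate_after - close_rate_before)
    print(f"Extra deals/month: {extra_deals:.0f}, "
          f"extra revenue/month: ${extra_deals * revenue_per_deal:,.0f}")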

The nature of a process improvement initiative is iterative. The analyze-measure-review-repeat cycle is a variation on a BI truism: BI is a journey, not a destination. The cycle repeats because the organization:

  • Discovers opportunities for improvement, or additional metrics, beyond those originally envisaged;
  • Cannot make every process change in one step; and
  • Needs further changes and improvements as it and its surrounding environment change.

The audience for operational analysis tools is larger than is typical in a BI/BPM environment. That audience is also likely to be less technically experienced than a business analyst or BI user, so the complaint that “BI is hard to use” has to be addressed. Operational analysis presentation tools should therefore include:

  • Easy-to-use functions for the audience:
    • simple, point-and-click operation;
    • relevant information pushed to the audience: dashboards, alerts, generated reports;
  • Self-service. Because processes evolve, the audience needs to be able to create their own views of the information and make them part of the overall information environment for others to use;
  • Data exploration. Allow investigation of the details behind metrics, such as detailed transaction views behind summary information;
  • Personalized and secure access. The system is likely to be used by people in different groups and roles, so their views of the information have to be controlled, both to filter out irrelevant information and to withhold information they are not allowed to see (see the sketch after this list);
  • Low cost per person, due to the broad audience.
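
A minimal sketch of the personalized, secure views mentioned above, assuming a simple mapping of users to the departments whose metrics they may see; the users, departments and figures are invented.

    # Personalized, secured views: each viewer sees only the rows for the
    # departments they are permitted to see. All data here is invented.
    metrics = [
        {"department": "emergency", "metric": "admission_to_diagnosis_min", "value": 55},
        {"department": "radiology", "metric": "scan_turnaround_min", "value": 35},
    ]

    permissions = {"jsmith": {"emergency"}, "klee": {"radiology"}}

    def view_for(user):
        """Return only the metric rows this user is allowed to see."""
        allowed = permissions.get(user, set())
        return [row for row in metrics if row["department"] in allowed]

    print(view_for("jsmith"))  # only the emergency department row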

The presentation tools of operational analysis must be supported by the availability of integrated, process-related data. That data integration effort is typical of BI implementations - often 50 percent to 80 percent of the total technical implementation effort.

Process data will come from a variety of systems and data sources, and some of it may be at least partially manual, held in documents or spreadsheets. To deliver the complete breadth of process metrics, data that does not currently exist may need to be captured as part of the process. A data integration environment that can evolve as processes change and additional metrics are established is a must.
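
A hedged sketch of that integration layer, consolidating extracts from two different sources (a system export and a manually maintained spreadsheet) into one table tagged by source. The contents and column names are invented; a real implementation would use the organization's ETL tooling against live extracts.

    import io
    import pandas as pd

    # Stand-ins for real extracts: one from a transactional system, one
    # from a manually maintained spreadsheet. Contents are invented.
    system_extract = io.StringIO(
        "patient,admission_to_diagnosis_min\nA,40\nB,95\n")
    ward_spreadsheet = io.StringIO(
        "patient,admission_to_diagnosis_min\nC,55\n")

    # Tag each extract with its source, then combine into one table.
    combined = pd.concat(
        [pd.read_csv(system_extract).assign(source="admissions_system"),
         pd.read_csv(ward_spreadsheet).assign(source="ward_spreadsheet")],
        ignore_index=True)
    print(combined)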

How timely does operational analysis information need to be? That depends on the process and on how easily it can be changed. Data from disparate sources may not all be available at once, so a consolidated set of metrics refreshed minute by minute may be impossible. The key question is: “How long does it take for people to analyze, make changes to the process or fix errors?”

For many processes, a daily or weekly update is sufficient. The data refresh cannot be left too long, though, as people want to see the effect of their changes on the process metrics as soon as possible. Comparing metrics across time is vital for tracking the effect of process change.
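
A small sketch of that across-time comparison, with invented weekly figures; the point is simply that each refresh is stored with its period so that change can be measured.

    # Compare a KPI across refresh periods to see the effect of process
    # changes. The weekly values are invented for illustration.
    weekly_avg_minutes = {
        "2010-W10": 82,
        "2010-W11": 71,  # after the first round of process changes
        "2010-W12": 58,
    }

    weeks = sorted(weekly_avg_minutes)
    for prev, cur in zip(weeks, weeks[1:]):
        change = weekly_avg_minutes[cur] - weekly_avg_minutes[prev]
        print(f"{cur}: {weekly_avg_minutes[cur]} min ({change:+} vs {prev})")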

Operational analysis is a powerful tool for organizations to monitor themselves and implement effective process change. Successful operational analysis implementations in your organization can be helped by an awareness of the theoretical work and practical experience of operations research and process improvement methodologies.

While operational analysis environments will leverage existing BI and BPM tools and experience, new challenges come from serving a broader audience that needs easy-to-use tools, and from a data environment that is more diverse and quickly evolving than is typical for BI.
