I believe that the scorecard and dashboard components of commercial performance management software should have predefined key performance indicators (KPIs). However, for the integrated software component that reports measurements, I believe the vendor’s software should deliberately come with a limited rather than a comprehensive selection of the KPIs commonly used in each type of industry. The purpose of providing standard KPIs should be only to jump-start an organization’s construction of its scorecard/dashboard system.

 

The reason for not providing a comprehensive, exhaustive list of industry-specific measures is that caution is needed whenever an organization identifies its measures. Measures drive employee behavior. Caution is needed for two major reasons:

 

  • Measures should be tailored to an organization’s unique needs.
  • Organizations should understand the basic concepts that differentiate scorecards from dashboards and KPIs from performance indicators (PIs).

Scorecards and Dashboards Serve Different Purposes

 

I wrote about this topic in my December 2005 column titled “Distinguishing Between Signal and Noise - KPIs Versus PIs.” I expand on that column with this one.

 

The two terms – scorecards and dashboards – tend to confuse; they are often used interchangeably, but each brings a different set of capabilities. The sources of the confusion are:

 

  • Both represent a way to track results.
  • Both use traffic lights, dials, sliders and other visual aids.
  • Both have targets, thresholds and alert messages.
  • Both provide linkage or drill down to other metrics and reports.

The difference comes from the context in which they are applied. To provide some history: as busy executives and managers struggled to keep up with the amount of information being thrust at them, the concept of traffic lighting was applied to virtually any and all types of reporting. As technology improved, more bells and whistles were added – the ability to link to other reports and to drill down to finer levels of detail. The common denominator was the speed with which users could focus on something that required action or further investigation. The terminology evolved to reflect how technology vendors described the widgets that provided this capability – dashboards. As a consequence, the terms dashboard and scorecard are now used interchangeably.

 

Figure 1 illustrates the difference between scorecards and dashboards using a taxonomy. Scorecards and dashboards are not contradictory; they are used for different purposes.

 

Figure 1: Difference between Scorecards and Dashboards Using a Taxonomy

 

At the top portion of the figure is the realm of scorecards. Scorecards are intended to be strategic. They align the behavior of employees and partners with the strategic objectives formulated by the executive team. In contrast, dashboards, at the bottom portion of the figure, are intended to be operational.

 

Some refer to dashboards as “dumb” reporting and scorecards as “intelligent” reporting. The reason is that dashboards are primarily for data visualization; they display what is happening during a time period. Most organizations begin by identifying what they are already measuring and construct a dashboard dial from there. However, dashboards do not communicate why something matters, why someone should care about the reported measure or what the impact may be if an undesirable, declining measure continues. In short, dashboards report what you can measure.

 

In contrast, a scorecard does provide the information lacking in dashboards. A scorecard additionally answers questions by providing deeper analysis, drill-down capabilities, traffic light alert messaging, and forecasting that infers performance potential in order to set motivational targets. Scorecards do not start with the existing data; rather, they begin by identifying which strategic projects to complete and which core processes to improve and excel at. The selection and validation of the correct or best KPIs is a constant debate. Statistical correlation analysis of the interactions among KPIs can determine the degree of influence and “lift” that various cascaded KPIs have on the higher-level, enterprisewide KPIs – hence this analysis validates or improves the KPI selection. In addition, this type of analysis can automatically uncover previously unknown statistical relationships that may suggest cause-and-effect linkages and can be used for predictive power. In short, scorecards report what you should measure.
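
As a concrete illustration of that kind of analysis, the short sketch below correlates a few cascaded KPIs with a higher-level enterprisewide KPI. It is only a minimal sketch under assumed data: the KPI names, the twelve months of figures, and the 0.5 screening threshold are all hypothetical, and it assumes the pandas library is available.

    import pandas as pd

    # Twelve months of hypothetical measurements: three cascaded KPIs and one
    # higher-level, enterprisewide KPI they are intended to influence.
    kpis = pd.DataFrame({
        "on_time_delivery_pct":      [88, 90, 91, 89, 93, 94, 95, 96, 94, 97, 96, 98],
        "employee_training_hours":   [10, 12, 11, 14, 15, 15, 18, 17, 19, 20, 21, 22],
        "new_product_milestones":    [1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 5, 6],
        "enterprise_revenue_growth": [1.0, 1.2, 1.5, 1.4, 1.9, 2.1, 2.4,
                                      2.6, 2.5, 2.9, 3.0, 3.3],
    })

    # Pearson correlation of each cascaded KPI with the enterprisewide KPI.
    # A strong correlation suggests the KPI has "lift"; a weak one flags a
    # candidate for replacement in the scorecard.
    target = "enterprise_revenue_growth"
    correlations = (
        kpis.corr()[target]
            .drop(target)
            .sort_values(key=lambda s: s.abs(), ascending=False)
    )
    print(correlations)

    weak = correlations[correlations.abs() < 0.5]
    if not weak.empty:
        print("KPIs to revisit:", ", ".join(weak.index))

In practice the same idea extends to regression or more formal causal analysis, but even a simple correlation screen like this can surface cascaded KPIs that do not appear to move the enterprisewide measure.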

 

Here are some guidelines for understanding the differences:

  • Scorecards chart progress toward strategic objectives. A scorecard displays periodic snapshots of performance associated with an organization’s strategic objectives and plans.1 It measures organizational activity at a summary level against predefined targets to see if performance is within acceptable ranges. Its selection of KPIs helps executives communicate strategy to employees and focuses users on the highest-priority projects, initiatives, actions and tasks required to execute plans. The adjective “key” differentiates KPIs from the PIs reported in dashboards.

    Scorecard KPIs ideally should be derived from a strategy map rather than just a list of important measures that the executives have requested to be reported. Regardless of whether the four perspectives suggested by Kaplan and Norton are used or some variant, scorecard KPIs should have cause-and-effect linkages (e.g., statistical correlations). Directionally, from the employee-centric innovation, learning and growth perspective upward, the KPIs should reveal the cumulative build from potential to realized economic value.

     

    There are two key distinctions of scorecards: 1) each KPI must have a predefined target measure; and 2) the KPIs should include both project-based KPIs (e.g., milestones, progress percentage of completion) and process-based KPIs (e.g., percent on-time delivery against customer promise dates).

     

  • Dashboards monitor and measure processes. A dashboard, on the other hand, is operational; it typically reports information more frequently than a scorecard, and usually with measures (PIs) rather than KPIs.2 Each dashboard measure is reported with little regard to its relationship to other dashboard measures, and dashboard measures do not directly reflect the context of strategic objectives.

     

    This information can be more real-time in nature, like an automobile dashboard that lets drivers check their current speed, fuel level and engine temperature at a glance. It follows that a dashboard should ideally be linked directly to systems that capture events as they happen, and it should warn users through alerts or exception notifications when performance against any number of metrics deviates from the norm or what is expected.
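
The exception alerting just described can be sketched in a few lines. The following is only an illustrative assumption of how a dashboard might compare each PI reading against upper and lower threshold limits; the PI names, readings and limits are invented for the example.

    from dataclasses import dataclass

    @dataclass
    class PerformanceIndicator:
        name: str
        value: float          # latest reading captured from an operational system
        lower_limit: float    # lower bound of the acceptable range
        upper_limit: float    # upper bound of the acceptable range

        def breaches(self) -> bool:
            """True when the latest reading falls outside its acceptable range."""
            return not (self.lower_limit <= self.value <= self.upper_limit)

    readings = [
        PerformanceIndicator("call_wait_time_seconds", 240.0, 0.0, 180.0),
        PerformanceIndicator("orders_shipped_per_hour", 410.0, 350.0, 600.0),
        PerformanceIndicator("scrap_rate_pct", 1.2, 0.0, 2.0),
    ]

    for pi in readings:
        if pi.breaches():
            # A real dashboard would flip a traffic light, send an email or raise
            # an exception report here rather than print to the console.
            print(f"ALERT: {pi.name} = {pi.value} is outside "
                  f"[{pi.lower_limit}, {pi.upper_limit}]")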

The caution I have for organizations that are paying more attention to their performance measurements involves 1) the linkage of scorecard KPIs to the strategy diagram (often referred to as a strategy map) and to the fiscal budget (as well as rolling financial forecasts); and 2) the linkage of the selected dashboard PIs to the behavior that will ultimately result in achieving or exceeding the KPI targets. Strategy diagrams and the budget appear in Figure 1 and are described below.

 

Scorecards Link the Executives’ Strategy to Operations and Budget

 

A strategy diagram is located in the upper left of Figure 1. The figure denotes that KPIs should be derived from the executives’ strategic objectives and plans. If KPIs are selected independently of the strategy, then they will likely report only what can be measured as opposed to what should be measured. Failure to execute a strategy is one of a CEO’s major concerns. Therefore, KPIs should reflect either mission-critical projects and initiatives or core business processes at which the organization must excel. (Hence the need for both project-based and process-based KPIs.)

 

The budget (and increasingly rolling financial forecasts) should be derived from the required funding of the projects (i.e., the nonrecurring strategy expenses and capital investments) and of the operational processes (i.e., the recurring operational capacity-related expenses that vary with driver volumes, such as customer demand).
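
To illustrate that derivation with invented numbers (a hypothetical sketch, not figures from this column), the budget request is simply the sum of the nonrecurring project funding and the recurring operational expense that varies with a forecast driver volume:

    # Nonrecurring strategy expenses and capital investments for approved projects
    # (hypothetical project names and amounts).
    project_funding = {
        "crm_rollout": 250_000,
        "added_plant_capacity": 1_200_000,
    }

    # Recurring, capacity-related operational expense modeled as
    # forecast driver volume (customer demand) times an assumed unit rate.
    forecast_customer_orders = 90_000
    cost_per_order_processed = 14.50

    operational_expense = forecast_customer_orders * cost_per_order_processed
    budget_request = sum(project_funding.values()) + operational_expense

    print(f"Project funding:      {sum(project_funding.values()):>12,.2f}")
    print(f"Operational expense:  {operational_expense:>12,.2f}")
    print(f"Total budget request: {budget_request:>12,.2f}")

As the driver volumes in a rolling forecast change, the operational portion of the budget changes with them, which is exactly why the budget should be re-derived from the strategy and forecasts rather than fixed once a year.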

 

A strategy is dynamic, never static, as executives appropriately shift directions based on their new insights and observations. Reliably accurate forecasting is critical for both strategy formulation and future resource capacity management. Hence, both the KPIs and the necessary funding to realize the strategic plans will continuously be derived from the “living” strategy diagram.

 

Dashboards Move the Scorecard’s Dials

 

The organization’s traction and torque are reflected in the dashboard’s PI measures – the more frequently reported operational measures. Although some PIs may have predefined targets, PIs serve more to monitor trends across time or results against upper or lower threshold limits. As PIs are monitored and responded to, the corrective actions will contribute to achieving the KPI target levels with actual results.

 

Cause-and-effect relationships between and among measures underlie the entire approach to integrating strategy diagrams (formulation), scorecards (appraisal), dashboards (execution) and fiscal budgets (the fuel).

 

Dashboards and scorecards are not mutually exclusive. In fact, the best dashboards and scorecards merge elements from one another.

 

A simple rule is to use the term “dashboard” when you merely want to keep score, as in a sports event, and to use the term “scorecard” when you want to understand the context of key scores in terms of how they influence the achievement of strategic outcomes. The latter will be fewer in number – they are strategic and carry more weight and influence. The former could number in the hundreds or thousands – you still need a way to quickly focus on those that are unfavorable to their targets for tactical action. However, acting on a single metric in a dashboard is less likely to change strategic outcomes as dramatically as acting on a measure reported in a scorecard.

 

In general, scorecard KPIs are associated with the domain of performance management. In contrast, dashboard PIs are associated with business intelligence (BI). (More information about this topic can be found in my April 2, 2006 column “How Does Business Intelligence and Performance Management Fit Together?")

 

My interest is that organizations successfully implement and sustain an integrated strategic scorecard and operational dashboard system. Hence I advocate that organizations understand the distinctions described here. This is why I caution against simply using an out-of-the-box list of various industries’ common KPIs and PIs – regardless of their source.

 

Reference:

  1. Wayne W. Eckerson. Performance Dashboards. John Wiley & Sons, 2006.
