As business intelligence (BI) continues to earn a high level of corporate investment, executive dashboards are becoming the most visible component of an organization's overall enterprise data warehouse strategy.

Like instruments in an airplane cockpit, dashboards help executives see the direction they are heading, gain critical insights before serious problems occur and receive proactive warning signals for real-time decision-making. Effective dashboards display the current status of critical business performance indicators, viewed in context alongside historical results and strategic goals. As a result, decision-makers see flare-ups that demand immediate attention as well as organizational trends that deserve long-term course correction.

As dashboard metrics become more familiar, decision-makers can experiment with what-if scenarios and watch the projected impact on other metrics through advanced performance modeling. For example, a consumer goods firm could simulate the impact of a temporary price reduction to study consumer reaction to the lower price, potential responses of competitors, expanded production and distribution requirements and other critical factors hidden within the data.

Unlike pilots, however, executives can't spend years learning how to read their dashboards. Prior to implementation, executives and their IT managers must understand how to design the dashboard for competitive advantage.

Want to maximize ROI and fly high in executives' estimation? The following lessons learned highlight common dashboard pitfalls and explain how to avoid them by understanding what to measure, how to approach a project and what to look for in technology.

Measurement Pitfalls

Information overload. Too much information in a dashboard can quickly overwhelm users and obscure critical data points. Suboptimal performance metrics should be isolated and flagged as they move out of an acceptable range. Further information should be provided for detailed research once an issue has been identified. A consumer goods company used a control chart to show forecast accuracy by sales territory. It highlighted areas for improvement, and the control parameters could be tightened as improvements were realized.
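The flagging approach described above can be sketched in a few lines. The territory names, accuracy figures and control limit below are illustrative assumptions, not data from the consumer goods example:

```python
# Minimal sketch: flag sales territories whose forecast accuracy falls
# below an acceptable control limit. All figures are illustrative.
accuracy = {  # forecast accuracy by territory, as a fraction of 1.0
    "Northeast": 0.92, "Southeast": 0.88, "Midwest": 0.91,
    "Southwest": 0.64, "West": 0.90,
}

# The control limit can be tightened as improvements are realized.
LOWER_LIMIT = 0.80

flagged = sorted(t for t, a in accuracy.items() if a < LOWER_LIMIT)
print(flagged)  # only out-of-range territories surface for research
```

Only the flagged territories would be highlighted on the dashboard; the remaining detail stays one click away for detailed research.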

Functional bias. If a pilot is responsible only for maintaining a certain altitude above sea level, the mountain in his flight path is irrelevant. That's why a well-designed dashboard includes cross-functional metrics, regardless of where the funding originates. For example, financial statements focused primarily on historical facts don't provide proactive warnings. Internal operational metrics, customer satisfaction scores and external market data are all valuable in forming a comprehensive view. A typical solution contains major categories that include pure financial, operational efficiency, customer satisfaction and quality metrics.

Hindsight. Corporate culture may focus on historic measurement trends because they are easy to pull from existing systems. While learning from experience is important, opportunity for improvement lies ahead with a focus on leading indicators, such as customer satisfaction, quality returns, order fulfillment backlog and economic projections. A manufacturing company went from displaying "sales year to date" to a forward-looking "daily sales required to achieve target" to allow sales staff to focus on results they influence.
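The manufacturer's shift from a backward-looking to a forward-looking metric is simple arithmetic. The target, actuals and calendar figures below are hypothetical:

```python
# Sketch: convert a static "sales year to date" figure into the
# forward-looking "daily sales required to achieve target."
# All figures are illustrative, not from the article.
annual_target = 12_000_000.0   # annual sales target in dollars
sales_ytd = 7_500_000.0        # sales booked so far this year
selling_days_left = 60         # remaining selling days in the year

daily_required = max(annual_target - sales_ytd, 0) / selling_days_left
print(daily_required)  # a number the sales staff can influence today
```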

Limited perspective. Effective dashboards can't focus purely on current metrics or historical trends. They need to provide a combination of both, compared against strategic goals, to avert imminent dangers and identify chronic trends. Current statistics should be as close to real time as feasible. Waiting for a financial close to get accurate inventory figures restricts performance improvements. Shifting toward daily metric publication has allowed organizations to streamline their periodic financial processes, which can then concentrate on external reporting.

Inconsistent definitions. It sounds like an easy step, but gaining consensus on the definition of each metric can be one of the most challenging parts of a dashboard project. The solution is destined for high visibility, so it is best to negotiate a unified definition rather than publish inconsistent results. Be certain to include a facility, such as a hyperlink from the cockpit to an intranet Web page, where stakeholders can view a precise definition of every final metric.

Irrelevant metrics. Considering that there is a finite number of metrics that can be used effectively, it is unwise to waste valuable space with uncontrollable metrics. Always remember the audience. A cost center manager doesn't have control over employee benefits expenses, so those expenses should be omitted from his dashboard; however, the human resources department may be very interested. Two types of metrics should be included - those that can be directly controlled by the audience and those that may externally influence their operations. Base criteria on normal metric results so extraordinary results can be flagged for immediate attention.

Approach Mistakes

Strategic confusion. Where does the organization want to go, and how will it get there? Dashboards built without first revisiting corporate strategy, goals and mission statement could be counterproductive by influencing unwise decision-making. Any dashboard should echo the strategic direction and priorities of the organization.

Destination unknown. Strategy dictates which direction to fly, but consider benchmarking to develop short- and long-term goals to mark milestones along the way to the desired destination. Business planning translates these goals into challenging yet achievable targets against which to measure actual performance. This approach supports efficient decision-making as actual results are consistently tracked against forward-looking targets in addition to historic performance.

No executive mandate. Without a strong executive champion, a dashboard project is doomed to fail. Other executives and middle managers may ignore the metrics or even undermine their significance in order to make the solution irrelevant. Before proceeding, have the right people on board for political support.

Misalignment. In order to fly the plane efficiently, all propellers need to be turning in the same direction. Given a degree of healthy discord within every organization, competition and disagreement will surface among directors. When portraying the performance of an overall organization in a series of graphics originating in different functional areas, the executive champion should facilitate the alignment of these cross-functional interest groups, including prioritizing metrics, agreeing to standard definitions and determining how they interact.

Big bang. Executive satisfaction will be low if a complete dashboard is promised within a single project phase. The phase will take too much time and requirements will be incomplete as further considerations become apparent after the initial cockpit is delivered. Rather than setting expectations too high and failing to deliver, plan the project to include several iterative phases. This reduces the elapsed time and allows forward movement despite changing requirements.

Incorrect results. When project timelines get tight, thorough testing is often a casualty. However, publishing an incorrect result in a dashboard is solution suicide. Sufficient time must be built into the project plan for data validation by those who understand it best. Communicate a need for this commitment early in the project so resources can be dedicated and scheduled. Facilitate these data validations and use best practices such as acceptance sign-off to ensure airtight quality.

Unguided exploration. Not everyone is comfortable browsing through graphical metrics, researching trends and determining cause-and-effect relationships. Be certain to provide ample training. Focus not only on how the solution toolset works, but also on the source and intent of the metrics being presented; for many users, this may be their first exposure to measures from outside their own functional areas. An apparel retailer provided a family tree graphic showing the origin and transformation of the data supplying its dashboard. Also provide easily accessible online documentation.

Technology Red Flags

Complex user experience. Executives with little time to learn new technology need a simple, concise interface. Leave the complex navigation options to power users tasked with investigating detailed issues. Use Web-based solutions with minimal navigation buttons and single sign-on to make accessing the dashboard pain-free.

Only ad hoc research. Ad hoc research capabilities must be available, but they should be limited to solving exceptional situations. Common issues should be addressed by guided procedures that lead users to successful resolution consistently and efficiently through limited hyperlinks and simple question-and-answer dialogs. The development of these guided procedures identifies opportunities for process improvement. Focus research efforts where they earn the most value and enable new employees to climb the learning curve quickly. The guided procedure should go as far as necessary to ensure issue resolution - this may lead to adjustments in the operational system, such as purchasing or production schedules.

No knowledge retention. Once an issue has been identified and researched, don't allow the knowledge gained to remain locked away in a chain of emails. Provide documentation facilities that attach narrative explanations directly to the dashboard results. Be certain these explanations are accessible and retained by all users. This information also supports future business planning.

Inflexible. Requirements will change throughout the project and indefinitely. Use a technological backbone that facilitates adaptability. Retain detailed information in case metric results need to be recalculated. Use cross-reference tables instead of stamping characteristic relationships in stone. Provide administrative users the ability to maintain these relationships themselves.
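One way to avoid stamping relationships in stone is a cross-reference lookup. The territory-to-region mapping below is a hypothetical example of the kind of data an administrative user could maintain directly:

```python
# Sketch: keep characteristic relationships (e.g., which territories
# roll up to which region) in a cross-reference table rather than in
# code. Names are illustrative; in practice this mapping would live in
# a table that administrative users can edit themselves.
region_xref = {
    "Northeast": "East", "Southeast": "East",
    "Midwest": "Central", "Southwest": "West", "West": "West",
}

def region_of(territory: str) -> str:
    # Unknown territories surface explicitly instead of silently vanishing.
    return region_xref.get(territory, "Unassigned")
```

Changing a rollup then means editing one row in the cross-reference table, not redeploying code.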

Segregated. A standalone system will cost more and provide less satisfactory experiences to users. Consider solutions that tightly integrate with existing systems. Use the same platform to perform as many enterprise performance management tasks as possible - including financial reporting, business planning and scenario modeling to answer what-if questions.

Dead-end audit. Mistrust of results will arise if an audit trail is incomplete. Provide a consistent, easy-to-use audit trail back to detailed transactions. Highlight any data transformations that occur on the way to the data warehouse. Document and automate regular tie-out processes to instill trust in the solution accuracy.

Outdated information. For performance reasons, loading information in batch mode remains a common approach. This may be adequate or necessary for many metrics, but near real-time data load capabilities should be accommodated to support critical performance measures. When these measures range out of bounds, the technical solution should support pushing customized alerts out to decision-makers via pager or email.
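An out-of-bounds check with a pluggable notification channel might look like the sketch below. The function name, measure and thresholds are assumptions for illustration; a real solution would route the alert through the organization's email or paging system:

```python
# Sketch: fire an alert when a critical measure ranges out of bounds.
# notify defaults to print here; in practice it would send email/pages.
def check_measure(name: str, value: float, low: float, high: float,
                  notify=print) -> bool:
    """Return True (and notify) if the value is outside [low, high]."""
    if not low <= value <= high:
        notify(f"ALERT: {name} = {value} is outside [{low}, {high}]")
        return True
    return False

# Example: inventory coverage in days, with an acceptable band of 10-30.
check_measure("inventory_days", 45, 10, 30)
```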

Executive dashboard projects will be judged either as complete successes or complete failures. Tip the scales in your favor by engaging strong executive participation and maintaining ongoing interest through aggressive, iterative realization of a technically flexible solution. With an effective dashboard, organizations can fly safely and efficiently to their strategic destinations.  
