Business intelligence initiatives have been undertaken by organizations across the globe for more than 25 years, yet industry experts estimate that between 60 and 65 percent of BI projects and programs fail to deliver on the requirements of their customers.

The impact of this failure reaches far beyond the project investment, from unrealized revenue to increased operating costs. While the exact reasons for failure are often debated, most agree that a lack of business involvement, long delivery cycles and poor data quality lead the list. After all this time, why do organizations continue to struggle with delivering successful BI? The answer lies in the fact that they do a poor job of defining value for the customer and determining how that value will be delivered given the resource constraints and political complexities found in nearly all organizations.

BI is widely considered an umbrella term for data integration, data warehousing, performance management, reporting and analytics. For the vast majority of BI projects, the road to value definition starts with a program or project charter, a document that defines the high-level requirements and capital justification for the endeavor. In most cases, the capital justification centers on cost savings rather than value generation. This is due to the level of effort required to gather and integrate data across disparate source systems and user-developed data stores.

As organizations mature, the number of applications that collect and store data increases. These systems, often referred to as data silos, usually contain few common unique identifiers to help relate records across systems. They can also capture overlapping data attributes for common organizational entities, such as product and customer. In addition, the data models of these systems are usually highly normalized, which can make them challenging to understand and difficult to extract data from. These factors make cost savings, in the form of reduced labor for data collection, easy targets. Unfortunately, most organizations don't eliminate employees when a BI solution is implemented; those employees simply work on different, hopefully more value-added, activities. From the start, the road to value is based on a flawed assumption and is destined to under-deliver on its proposition.
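To make the silo problem concrete, here is a minimal sketch of why integration is labor-intensive when systems share no unique identifier. The records, field names and matching threshold are all hypothetical; real-world entity resolution is far more involved, but the idea is the same: related records must be inferred from overlapping attributes rather than joined on a key.

```python
# Hypothetical records for the same customer held in two data silos.
# Neither system shares a unique identifier, so integration must rely on
# fuzzy matching of overlapping attributes such as name and email.
from difflib import SequenceMatcher

crm_record = {"cust_id": "C-1047", "name": "Jon Smith", "email": "j.smith@acme.com"}
erp_record = {"acct_no": "88213", "name": "Jonathan Smith", "email": "j.smith@acme.com"}

def likely_same_entity(a, b, threshold=0.8):
    """Score two records on shared attributes; an exact email match wins outright."""
    if a["email"].lower() == b["email"].lower():
        return True
    name_score = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return name_score >= threshold

print(likely_same_entity(crm_record, erp_record))  # True (emails match)
```

Every pair of silos needs its own matching logic like this, which is why "reduced labor for data collection" rarely materializes as projected.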

The next step on the road to value for BI is the design and development of the solution. It is important to keep in mind that value can only be defined by the customer. For BI to deliver on this value definition requires the input of the customer in the requirements, the design of the solution, and the definition of the business rules that turn raw data into actionable information. It's often said that the business understands the problem and IT understands the solution. Unlike most software that is implemented and managed by the IT organization, the vast majority of the data utilized by BI software isn't captured from users directly. The data is entered into business applications, such as ERP and CRM systems and other data sources, and then used by BI software to generate useful information. Often, the data is integrated from numerous disparate operational systems and loaded into a data warehouse, or multiple data marts, to enable business users to consume data from a central repository and reduce the workload on the operational systems. Other times, the data is consumed directly from the disparate source systems, or from replicated copies. Since most of the data emanates from a foreign source, the BI software doesn't inherently know the meaning of the data. The result is data warehouse data models and BI application semantic layers that are just as challenging to understand as the source systems themselves.
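The point that BI software cannot infer meaning on its own can be sketched in a few lines. The status codes and their mapping below are invented for illustration; the key observation is that the mapping is a business rule only the customer can supply, which is exactly why customer involvement in defining business rules is not optional.

```python
# A minimal transform step in an ETL sketch. The raw status codes are
# hypothetical, and the code-to-meaning mapping is a business rule that
# must come from the customer; it cannot be derived from the data alone.
raw_orders = [
    {"order_id": 1, "status": "03", "amount": 250.0},
    {"order_id": 2, "status": "07", "amount": 120.0},
]

# Business rule supplied by the customer, not discoverable from the source.
STATUS_MEANING = {"03": "shipped", "07": "returned"}

def transform(rows):
    """Attach business context so downstream reports show meaning, not codes."""
    return [
        {**row, "status_label": STATUS_MEANING.get(row["status"], "unknown")}
        for row in rows
    ]

warehouse = transform(raw_orders)
print(warehouse[0]["status_label"])  # shipped
```

Without the mapping, the warehouse faithfully stores "03" and "07" and is no easier for a business user to interpret than the source system was.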

How the customer sees their world is different from how IT sees it. This difference is found in the context, which, when applied to data, provides meaning. Without context, data is just data and is of little value to the customer. Without sufficient involvement by the customer in BI projects, it's difficult to deliver value. However, business users are often reluctant to participate for fear of losing their perceived value to their organization. They are also often unavailable to participate due to competing priorities and misaligned goals. In addition, many BI implementation methodologies only involve the users at the beginning and end of projects, rather than throughout the design and development of the solution. When left to IT alone, the design of a BI solution tends to be technically feasible but functionally unusable.

Once the solution has been implemented, the organization is left to manage the existing solution as well as to lead additional iterations. In most cases, a BI department is formed, usually within IT, that possesses technical acumen with the chosen toolsets as well as a strong understanding of the data. These teams often find that they spend more than 80 percent of their time maintaining what's been implemented, leaving less than 20 percent available for new projects, and it only gets worse over time. This stems from poorly developed data integration architectures and a lack of program governance.

The challenge of sourcing data from operational systems and integrating data from multiple disparate sources is formidable, and often laden with technical hurdles. Business users have struggled with this challenge for years, and the natural reaction is to include as much source data in the BI project's scope as possible. However, BI implementers have learned that you can't deliver enterprise BI in one large project: users aren't willing to wait that long, and by the time the solution arrives it no longer meets their requirements. In response to this challenge, enterprise BI projects are now divided into smaller iterations, each delivering a subset of user requirements in a shorter timeframe. While this initially helps solve the challenge of long delivery cycles, most organizations discover that each subsequent iteration adds maintenance overhead. This increased overhead is caused by:

  • Non-scalable data warehouse and application architectures.
  • Variation in DW data models, data integration mapping designs and semantic layers.
  • Metadata duplicated in multiple software repositories.
  • Disparate scheduling.
  • Non-reusable components.
  • Multiple security architectures.
  • “Plug and play” stovepipe solutions.
  • Non-existent data/systems governance.
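The "non-reusable components" and duplicated-metadata items in this list can be illustrated with a hypothetical metric definition. In the sketch below, a net-revenue calculation (the name and fields are invented for illustration) is defined once and reused, rather than each report tool's semantic layer carrying its own, slowly diverging copy.

```python
# Sketch of reuse as an antidote to duplicated metadata: one canonical
# calculation, shared by every report, instead of a copy per tool.
# The metric and its fields are hypothetical.
def net_revenue(row):
    """Single shared definition reused by every report and dashboard."""
    return row["gross_amount"] - row["discounts"] - row["returns"]

sale = {"gross_amount": 1000.0, "discounts": 50.0, "returns": 25.0}
print(net_revenue(sale))  # 925.0
```

When the same logic instead lives in several semantic layers, every change must be found and applied in each copy, which is one concrete way iteration-over-iteration maintenance overhead accumulates.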

The net result is that BI programs spend more time on necessary non-value-added activities and less time delivering value to the organization.

Organizations require a better approach to BI, one that provides a greater focus on defining and delivering value, as well as principles and practices that help them deliver more, in shorter iterations, with their existing resources. However, delivering more, faster, doesn't necessarily mean better. The solution needs to incorporate the business into the design and development process, align across functional areas, and eliminate unnecessary, non-value-added activity. Every BI project and program is an investment that must deliver more value than it costs.

One approach is lean BI. My next article, entitled "How to Implement Lean BI," provides a definition and explanation of how lean BI generates customer value.
