By all accounts, the business intelligence (BI) industry is thriving. Numerous reliable surveys confirm that CIOs consistently rate BI a top priority in their plans. Attendance at industry events and vendor conferences is up 10 to 20 percent, and analytics has been featured in top business publications such as The Harvard Business Review thanks to thought leaders like Tom Davenport. The surge in interest is being fueled by the rapidly changing, technology-driven business landscape. Organizations are striving to get "smarter" by forming a deeper understanding of their extended enterprise. That requires intelligence - BI. After decades of being on the periphery of computing, BI is now right in the thick of it.

However, to capitalize on this opportunity, the BI industry must adapt too. Most BI users today are still consumers of information that has been gathered, manipulated and packaged by others. BI deployment practices, and even the BI tools themselves, stratify usage into arbitrary roles based entirely on unchallenged assumptions, not a foundational theory. While the often-stated goal of BI is to present the right information to the right people at the right time so they can make better decisions, the current practice cannot possibly meet those requirements. Decision-making is a more complicated process than just reviewing information. Though the subject of decision-making in organizations has been studied for decades, BI operates without any formal model of its nature or use, leading to arbitrary and ineffective practices and product design.

Is There a Theory of Business Intelligence?

For an industry focused on the goal of better decision-making, it is incongruous that so little attention is given to understanding decision-making processes in organizations. BI seems to operate without any conceptual foundation, just loose and vague promises about better decisions. There is an extensive academic foundation for the "plumbing" of BI - software engineering, database design and user interfaces - but the design, support and nurturing of BI environments is a green field. Ironically, the field of decision-making in organizations is rich with material, but connecting that body of knowledge specifically to BI has not been explored.

The late Herbert Simon wrote extensively about decision-making.1 Simon divided the process into problem solving and then decision-making as separate parts of the same sequence. In BI, the universal assumption is that data leads an analyst directly to a decision, skipping the problem-solving step completely. Most BI tools and methodologies are designed around this model, but in the actual workflow of analytics, BI is typically used to inform part of the process, not to orchestrate it.

Simon's conclusions have direct bearing on how analytics should be orchestrated in an organization:

"It is work of choosing issues that require attention, setting goals, finding or designing suitable courses of action, and evaluating and choosing among alternative actions. The first three of these activities - fixing agendas, setting goals and designing actions - are usually called problem solving; the last, evaluating and choosing, is usually called decision-making."2

Most BI tools are not designed for the first three steps - choosing issues that require attention, setting goals and finding or designing suitable courses of action. All of these activities involve working both individually and in collaboration.

Simon adds: "The very first steps in the problem-solving process are the least understood. What brings (and should bring) problems to the head of the agenda? And when a problem is identified, how can it be represented in a way that facilitates its solution? The way in which problems are represented has much to do with the quality of the solutions that are found."

First, there is the issue of which problems (or opportunities) get attention. The next step is representing the problem in a way that others can understand it. BI is not arranged this way today. It may reach a wide audience, directly and indirectly, but its use is tiered. Understanding the dynamics of work is extremely difficult, so simplified models are devised that are logical, compact and have an engineered quality. But people don't operate according to engineering concepts. Organizations are still trying to shake off a century of scientific management, or Taylorism - an engineered style of management initiated by Frederick Winslow Taylor in 1911, in which authority is hierarchical and job descriptions are strict.

Sothic Analytics

In ancient Egypt, everything was timed around the annual flooding of the Nile. The priests knew that the year could be measured precisely by the heliacal rising of the star Sirius, but because their calendar was 365 days long with no leap years, it drifted a quarter of a day each year. At that rate, the rising of Sirius would coincide with the calendar's New Year only once every 1,460 years (4 x 365) - the Sothic Cycle. As the sole keepers of this fact, the rulers were able to maintain their power over the people with their "divinely inspired" predictions of the flood. Farmers, on the other hand, had to wait until they were underwater.

Society was rigidly stratified. At the top level were the royalty and high administrative officials. The middle class was the largest. In it were low-ranking bureaucrats, scribes, craftspeople, priests and farmers. Below that were slaves.

The parallels between Egypt 5,000 years ago and modern enterprise BI deployments are almost comical. User roles in BI are consistently depicted with the image of, what else - a pyramid! In many BI implementations, every user of the system is restricted to the data they are allowed to see. For confidential information, privacy regulations or other mandated restrictions, this is a reasonable approach, but in most organizations, the "need to know" restrictions are the result of the pyramid, not logic. The eastern region sales manager cannot see how the western region sales manager is doing with a certain kind of sale and is thus deprived of potentially valuable insight. Or, super users handling complex analysis pass distilled pieces of work down to the lower levels without context, such as the assumptions employed. Not only is this incomplete preparation, it can have unintended effects: it is well known in organizational decision theory that decision-makers often overreact to new information, contrary to Bayes' rule, when it is presented without adequate context. Rather than streamlining the process, this approach can derail it.

The lower levels in the pyramid have a difficult time ever getting the attention of the super users, especially for urgent and one-off problems. Many simply lose interest and go about their business with the tools at hand, especially spreadsheets. And by slotting people into roles, mobility within an organization is reduced, and recruiting replacements at the super-user level becomes urgent and difficult. Most importantly, the pyramid impedes, rather than supports, analytical work and informed decision-making.

Toppling the Pyramid

The pyramid model of BI is completely inadequate for today's world of externalized business, computer-savvy workforces and constant communication. The concepts of hierarchical decision-making and solitary decision-making are simply not tenable in most cases. Problem solving and decision-making happen at every level of today's flattened and distributed organizations. The second word in the phrase business intelligence is, after all, intelligence. What does it mean to provide intelligence to people and operations? How do systems become intelligent? The enemy of intelligent systems and organizations is stasis. Becoming intelligent involves collaboration, sharing and the ability to publish and modify analytical applications, not just data.

All four steps of the decision process elaborated by Herbert Simon and many others can be satisfied with analytics software that allows people at any skill level to perform the required operations. Super users will always have a role, and most people in an organization will never develop skills or interest in pursuits such as stochastic processes or simulation, but framing a problem and building a model can be a very simple process. Today, most BI efforts are driven by data, not by models, and the user interfaces, best practices and training are aligned with this approach. Toppling the pyramid means breaking through the data-only model and finding ways to distribute models and applications that can be used and shared by everyone.

Analytics Requirements

Underneath every useful visualization of information lies a model. Manipulating the presentation requires interacting at the model level, not the data level. For example, a visualization of observations over a period of time may allow the viewer to change the time period. This implies that one aspect of the model - time - is exposed. The extent to which this sort of tool is useful is a function of how much of the underlying model is exposed in a way that makes manipulating its components intuitive, simple, repeatable, understandable and reliable. Some of the qualities of a good, interactive analytical tool include:

No code: Any tool that requires programming or the construction of any code, such as SQL, will never be adopted by the community at large.

No scripting: Even simple scripting, typed or created with dialogs, creates unmanageable maintenance problems as the body of analytical models grows.

No cryptic words or phrases to learn or understand: Words and phrases that seem innocuous to technology developers - dimension, cache, join, etc. - create a barrier for nontechnical people.

Ability to modify existing models while maintaining the dependency between them: When one person takes a model and adjusts it as a separate, derived model, the option must be available to keep the link to the original so that the dependency is maintained as both change over time.

Ability to share models and to perfect them collaboratively: In wiki style, models must be collaborative and not involve keeping local copies of versions.

Ability to selectively expose elements of a model for understandability (to others): A model author should be able to determine which aspects of a model can be manipulated by others and which are frozen (for example, to enforce certain standards or procedures).

Administration of models by stakeholders, not IT or other third parties: Unless analytics are controlled entirely by the users of the tools, there will be latency, and uninformed standards and controls will be applied. Use of analytical technology is a management issue, not an IT issue.

Access to data as needed without delay: Analytics requires access to both routine data, such as the data managed by a data warehouse, and data for episodic and conditional issues, both internal and external. The analytics platform must have a simple process of preparing data for use.

Architectural fit: Analytics tools need to fit the overall IT architecture, with out-of-the-box support for popular and industry-specific data sources; configurability so service providers and users can easily create guided analytic applications without requiring IT development; and security and administrative features such as the ability to distribute controlled data sets to users without burdening IT.
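To make two of the requirements above concrete - selectively exposing model elements and deriving models while preserving the dependency link - the following is a minimal sketch in Python. It is purely illustrative; the class, method names and parameters are hypothetical, not the API of any actual BI product.

```python
# Illustrative sketch (hypothetical API, not a real product): an analytical
# model whose author decides which parameters others may manipulate and which
# are frozen, and whose derived variants keep a live link to their parent.

class Model:
    def __init__(self, name, params, exposed, parent=None):
        self.name = name
        self.params = dict(params)    # e.g. {"time_period": "2024-Q1"}
        self.exposed = set(exposed)   # parameters others are allowed to change
        self.parent = parent          # dependency link to the source model

    def set_param(self, key, value):
        # Only exposed parameters can be manipulated by non-authors;
        # everything else is frozen to enforce standards or procedures.
        if key not in self.exposed:
            raise PermissionError(f"'{key}' is frozen by the model author")
        self.params[key] = value

    def derive(self, name):
        # A derived model starts from the parent's parameters but keeps the
        # link, so the dependency survives as both change over time.
        return Model(name, self.params, self.exposed, parent=self)

    def effective_params(self):
        # Resolve parameters through the dependency chain; local values win.
        base = self.parent.effective_params() if self.parent else {}
        return {**base, **self.params}

base = Model("regional_sales",
             {"time_period": "2024-Q1", "metric": "revenue"},
             exposed={"time_period"})
variant = base.derive("east_region_view")
variant.set_param("time_period", "2024-Q2")   # allowed: exposed
# variant.set_param("metric", "margin")       # would raise: frozen
```

The point of the sketch is the division of control: the author freezes what must not change, consumers adjust what is exposed, and the parent link means a refinement to the base model can later be reconciled into every derived copy instead of orphaning local versions.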

In Blink, Malcolm Gladwell makes the distinction between how people initially react to something and how they may ultimately feel about it.3 Initial reactions to the television shows The Mary Tyler Moore Show and All in the Family were very negative, but as history reveals, people didn't hate the shows; they were just stunned by how different they were. The conclusion is that first impressions shouldn't be taken at face value - they need interpretation. This is the weakness of technology deployments in organizations, especially in the field of BI and analytics, where adoption can be seen as somewhat optional. After the initial rollout and training, people are left to their own devices. A program to move people past first impressions to a more reality-based assessment of the utility of analytics is needed.

Breaking through the BI pyramid by merely suggesting it's the wrong approach is impossible. The solution is to provide the right approach and allow people in organizations to finally be able to do the work that they've been told they should do - act independently and collaboratively, move with swiftness by being informed and leverage the wealth of technology available today to assist them. Technology and service providers must educate themselves in the realities of problem solving and decision-making and start to deal with the situation as it really is, not as their current tools and approaches presume it to be. That requires jettisoning Taylorism and the complex, layered architectures of their products and methodologies and allowing knowledge workers to finally operate at the highest possible level. 


  1. Herbert A. Simon & Associates. Research Briefings 1986: Report of the Research Briefing Panel on Decision Making and Problem Solving. Washington, DC: National Academy Press, 1986.
  2. Simon.
  3. Malcolm Gladwell. Blink. New York: Little Brown, 2005.
