
Trend: Good Enough Isn't

August 01, 2004, 1:00am EDT

As business intelligence (BI) budgets return to pre-9/11 levels, the sentiment of many shops I come in contact with is the same - "We've been living with this functional BI environment for some time, but it's not good enough for our future." Investments are being made in complete or near-complete "do-overs" of BI environments. This does not necessarily signal that the first (or second, or third) round of BI was a failure. Those first attempts were functional for their time and proved the concept of BI. Now that companies have had a taste and the concept is proven, more is required.

As companies begin to build the strategy of information leadership into everything they do, it can become evident that good enough isn't good enough anymore. Companies do not want to be limited by past thinking and the current state of (architectural) affairs. Architectures that were once considered scalable cannot keep up with the data, user, usage and uptime demands now typical of the Fortune 500 BI environment.

Furthermore, if adequate processes were not put in place, expanding the current environment can become difficult, if not impossible. Many implementations go astray because the current staff cannot extend the undocumented BI environments built by previous generations of BI staff. Without processes, users cannot accurately predict what their requirements will yield or when those requirements will be met. User trust in an unpredictable environment, even one with multiple years of development behind it, begins to fade.

All of these factors are upon us in BI today and are leading to fresh approaches and opportunity.

Companies are slowly learning to use all types of data to their advantage, but that does not change the number of hours in the day, knowledge-worker capabilities or the appetite for sorting through data. It does change the volume of data kept. It should also change the data itself, how we access that data and what we ask our BI systems to do.

Historical data keeps growing, with the aging-off of older data becoming a thing of the past. There is also a robust third-party data marketplace that most BI environments have found a way to leverage, bringing hundreds of gigabytes into their data warehouses. Furthermore, as users begin to fully exploit the data already in BI environments, more demand is created for other subject areas to be integrated, other calculations to be made and more specialized summaries to be created - all driving data volume beyond what many companies anticipated when they began their programs.

Getting answers to specific business questions will be a staple of BI environments for some time. As the only means of data access, however, it does not scale to competitive advantage. The trend is exception-based access: engineering the factors that should trigger a business action directly into the BI system saves time and creates consistent processes.
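To make the idea concrete, here is a minimal sketch (in Python against an in-memory SQLite database; the table, columns and 20 percent threshold are hypothetical, not drawn from any particular environment) of the difference: rather than returning every row for a user to scan, the system returns only the rows that should trigger a business action.

```python
import sqlite3

# Hypothetical daily sales fact table; names and thresholds are
# illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE daily_sales (
        region TEXT,
        product TEXT,
        sale_date TEXT,
        revenue REAL,
        plan_revenue REAL
    )
""")
conn.executemany(
    "INSERT INTO daily_sales VALUES (?, ?, ?, ?, ?)",
    [
        ("East", "WidgetA", "2004-07-30", 9500.0, 10000.0),
        ("East", "WidgetB", "2004-07-30", 4200.0, 8000.0),   # badly off plan
        ("West", "WidgetA", "2004-07-30", 11200.0, 10000.0),
    ],
)

# Report-style access would SELECT everything and let the user hunt.
# Exception-based access engineers the action threshold into the query,
# so only rows demanding a business response come back.
SHORTFALL_THRESHOLD = 0.20  # flag anything 20%+ under plan

exceptions = conn.execute(
    """
    SELECT region, product, revenue, plan_revenue,
           1.0 - revenue / plan_revenue AS shortfall
    FROM daily_sales
    WHERE revenue < plan_revenue * (1.0 - ?)
    """,
    (SHORTFALL_THRESHOLD,),
).fetchall()

for region, product, revenue, plan, shortfall in exceptions:
    print(f"ACTION: {region}/{product} is {shortfall:.0%} under plan "
          f"({revenue:,.0f} vs {plan:,.0f})")
```

The point of the design is that the threshold lives in the system, not in a user's head or spreadsheet, which is what makes the resulting process consistent and repeatable.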

The idea of replicating reports created elsewhere into BI, or reporting the basics of sales by product and customer, and declaring BI victory is also gone. Because most shops can do this now, it creates no unique advantage. The storage of massive amounts of data cannot continue to translate into massive amounts of data per report or per user viewing; otherwise, we will never exploit all the data we are storing. Something must change.

We must do zero-based, exception-based thinking around requirements and rely more on our systems to do the exploratory work.

Simply put, we must learn to make meaningful data, rather than raw data, available. We must learn to more fully utilize the basic, inherent capabilities of our software, including the database management systems, with triggers, transformations, alerts, constraints, etc. Having users slog through low-quality data that they then must export to spreadsheets to obtain any real information simply won't make for competitive information advantage. Let the systems do more work. Capturing and incorporating business process knowledge into the systems changes the data that users access in most environments.
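As one hedged illustration of letting the systems do more work (Python with SQLite; the schema, the data-quality rule and the 50,000 threshold are assumptions for the sketch, not a prescribed design), a constraint can reject bad data at load time and a trigger can capture a piece of business-process knowledge by writing an alert the moment an exceptional row arrives:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A constraint encodes a data-quality rule once, in the database,
# instead of in every downstream report or spreadsheet.
conn.executescript("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer TEXT NOT NULL,
        quantity INTEGER NOT NULL CHECK (quantity > 0),
        unit_price REAL NOT NULL CHECK (unit_price >= 0)
    );

    CREATE TABLE order_alerts (
        order_id INTEGER,
        rule TEXT,
        detail TEXT
    );

    -- A trigger captures business-process knowledge: unusually large
    -- orders (threshold assumed here) are flagged the moment they arrive.
    CREATE TRIGGER flag_large_orders AFTER INSERT ON orders
    WHEN NEW.quantity * NEW.unit_price > 50000
    BEGIN
        INSERT INTO order_alerts VALUES (
            NEW.order_id,
            'large_order',
            'value ' || (NEW.quantity * NEW.unit_price) || ' exceeds 50000'
        );
    END;
""")

conn.execute("INSERT INTO orders VALUES (1, 'Acme', 10, 25.0)")
conn.execute("INSERT INTO orders VALUES (2, 'Globex', 3000, 20.0)")  # fires trigger

try:
    # Negative quantity violates the CHECK constraint, so the bad row
    # never reaches users in the first place.
    conn.execute("INSERT INTO orders VALUES (3, 'Initech', -5, 10.0)")
except sqlite3.IntegrityError as err:
    print("rejected at load time:", err)

for row in conn.execute("SELECT * FROM order_alerts"):
    print("alert:", row)
```

Rules enforced this way apply uniformly to every load and every user, instead of being rediscovered in each report or spreadsheet export.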

These are major paradigm shifts in BI. With all these demands, will the fundamental principle of data warehousing - creating redundant copies of data for a different purpose - continue to hold up? I believe the answer is yes for now, as inefficient as it may seem on the surface. Operational systems need a few generations of reengineering before they can play major roles in real-time data integration from the points where data originates. Additionally, as the demand that all data be accessible becomes more real, the weakest operational system defines the overall capabilities of a "virtual" warehouse.

Vendors are responding to these needs with high-speed data warehouse "appliances" and new methods of storing and retrieving data designed to handle potentially tens of terabytes. However, obtaining the data is not even half the battle. Historically in data warehousing, data collection precedes a company's ability to fully utilize that data. Changing how we access data and allowing our systems to automate the discovery of actionable information is all part of the current trend.
