
Making Sense Out of the Future

August 1, 2007, 1:00am EDT

In God we trust; all others bring data. This witticism, attributed to W. Edwards Deming, stands as a motto for a variety of data-driven activities, starting with manufacturing quality and extending from leaders in business intelligence (BI) to laggards. Many conventional business enterprises and industries are collecting substantial, high-quality data about their individual and aggregate customers, products and market dynamics. Those enterprises that actually look at and know how to use the data are playing a different game at a more advanced level than those still relying on gut feel alone.

This is not a new idea. In a survey I did as an industry analyst back in 1999 at Giga Information Group, some 78 percent of enterprise respondents reported they had a data warehouse in production. Today, many of those enterprises are operating a third-generation data warehouse. So while the data is not always represented exactly as it is needed, there is more of it than ever before, and its quality is better than ever. Those enterprises that know how to make sense of the data are gaining business value that was previously out of reach. This making sense is often called analytics, which has reached takeoff speed (see Figure 1).

Figure 1: Importance of Analytic Orientation: High Performers versus Low Performers, 2006

At the same time, Philip Howard has laid down a challenge in a recent article, "Second-Generation BI." Howard calls for a holistic approach to the next generation of BI.1 The premise of Howard's argument is that we are unable to see the forest of business value for the trees of tools and point products. Howard acknowledges that he does not have an answer to the dilemma. If anything, he is a voice crying in the wilderness - make ready the way of next-generation BI! At the risk of mixing the metaphor, he invites suggestions for how to find a way out of the BI jungle, providing only one clue - it might have something to do with architecture. Here is my answer, in one word: innovation.

The messy and dynamic BI market confronting companies in 2007 is a function of dozens of trends. The top three include innovations in: 1) the underlying system capabilities, such as data warehousing appliances; 2) integrating business activity with IT, even though data integration is still incomplete; and 3) surfacing business value through analytics, especially when delivered in a usable, well-designed format.

In spite of the chaotic market dynamics, BI tools and technologies are easy to categorize by means of an architecture that distinguishes front-end, middle and back-end structures (and related functions). Process-centric solutions such as business process monitoring cut across all these layers and add an essential time dimension to the information supply chain, connecting transactional data to BI decision-making. The information supply chain is rendered dynamic by inclusion of functions of transformation such as extract, transform and load (ETL), message brokering or on-the-fly interactive enterprise information integration. What's really new here? In next-generation BI, the latter infrastructure of information transformation traverses architectural layers - forward and backward in a loop - and will be organized as part of an enterprise service bus (ESB) using service-oriented architecture (SOA).2
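To make the information supply chain concrete, consider a minimal sketch in Python. The stage names, field names and sample record below are invented for illustration - they are not drawn from any particular vendor's toolset. The point is simply that extract, transform and load become small, reusable functions that can later be republished as services on the bus:

```python
# A hedged sketch of an information supply chain: extract, transform and
# load written as composable functions. Field names are hypothetical.
from typing import Iterable

Record = dict

def extract(source_rows: Iterable[Record]) -> Iterable[Record]:
    """Pull raw transactional records from an operational source."""
    return (dict(row) for row in source_rows)

def transform(rows: Iterable[Record]) -> Iterable[Record]:
    """Conform source fields to the warehouse's business vocabulary."""
    for row in rows:
        yield {"customer_id": row["cust_no"], "amount_usd": float(row["amt"])}

def load(rows: Iterable[Record], warehouse: list) -> None:
    """Append conformed records to the (here, in-memory) warehouse."""
    warehouse.extend(rows)

warehouse: list = []
load(transform(extract([{"cust_no": "C-17", "amt": "42.50"}])), warehouse)
print(warehouse)  # [{'customer_id': 'C-17', 'amount_usd': 42.5}]
```

Chaining the stages like this is the static picture; exposing each stage as a discoverable service on the ESB is what closes the loop forward and backward across the architectural layers.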

Holistic thinking invites another one-word answer - SOA. In SOA, all functions or services are defined using a description language, and their interfaces are discoverable over a network. The interface is defined in a neutral manner, independent of the hardware platform, the operating system and the programming language in which the service is implemented. One of the most important advantages of SOA is the ability to get away from an isolationist practice in software development, where each department builds its own system without any knowledge of what has already been done. This silo approach leads to inefficient and costly situations where the same functionality is developed, deployed and maintained multiple times. SOA is based on a service portfolio shared across the organization, and it provides a way to efficiently reuse and integrate existing assets.
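As a toy illustration of that pattern - the registry, contract and service names below are invented, not any particular product's API - a service publishes a neutral interface description to a shared registry, and consumers discover and invoke it by name without ever seeing the implementation:

```python
# A hedged, minimal sketch of the SOA pattern: publish a service under a
# platform-neutral contract, then discover it by name at call time.
from typing import Callable, Dict

class ServiceRegistry:
    def __init__(self) -> None:
        self._services: Dict[str, Callable] = {}
        self._contracts: Dict[str, str] = {}

    def publish(self, name: str, contract: str, impl: Callable) -> None:
        """Register an implementation under a neutral interface description."""
        self._contracts[name] = contract
        self._services[name] = impl

    def discover(self, name: str) -> Callable:
        """Look up a service by name; the caller never sees the implementation."""
        return self._services[name]

registry = ServiceRegistry()
registry.publish(
    "CustomerLookup",
    contract="in: customer_id (string) -> out: customer record (map)",
    impl=lambda customer_id: {"customer_id": customer_id, "segment": "gold"},
)

lookup = registry.discover("CustomerLookup")
print(lookup("C-17"))  # {'customer_id': 'C-17', 'segment': 'gold'}
```

Any department can then reuse CustomerLookup rather than building its own copy, which is exactly the escape from the silo approach described above.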

This is a straightforward partitioning of the problem. It is a basic principle of architecture that business value migrates in the direction of the user interface. This gathers traditional query and reporting tools together in a competitive landscape with next-generation, metadata-enabled search technologies as well as data visualization and in-line analytics to deliver the business analyst's "aha!" moment.

For example, enterprises in property and casualty insurance will use analytics to spot advantages in risk profiles. Media-savvy enterprises will use clickstream data warehousing to transform Web clicks into customers, navigating the media divide between content owners, distributors and aggregators. Still others will use analytics to build customer loyalty through careful tracking of transactions and point-based reward systems.
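As a small, hypothetical example of the clickstream case, rolling raw clicks up to a per-channel conversion rate might look like this (the events and field names are illustrative only):

```python
# A toy clickstream rollup: which referral channels turn sessions into buyers.
from collections import defaultdict

clicks = [
    {"session": "s1", "channel": "search", "event": "view"},
    {"session": "s1", "channel": "search", "event": "purchase"},
    {"session": "s2", "channel": "banner", "event": "view"},
]

sessions_by_channel = defaultdict(set)
converted_by_channel = defaultdict(set)
for c in clicks:
    sessions_by_channel[c["channel"]].add(c["session"])
    if c["event"] == "purchase":
        converted_by_channel[c["channel"]].add(c["session"])

for channel, sessions in sessions_by_channel.items():
    rate = len(converted_by_channel[channel]) / len(sessions)
    print(f"{channel}: {rate:.0%} of sessions converted")
```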

Obviously, this list is not complete. However, it adds up to a critical mass of analytic applications across an amazing array of different vertical industries. The conclusion is unavoidable. Those enterprises that are able to exploit the information asymmetries in their market through analytics will enjoy a competitive advantage.

At the back end, data warehousing appliances and balanced appliance-like configurations have traction and will continue the march up market. Enterprises have seen the future of data management - it requires simplification, high performance and added business value.

The data warehousing appliance's emergence has been validated by the market with the March 22, 2007 S-1 filing by the original data warehousing appliance vendor, which has never had a profitable quarter. Nevertheless, startups have gotten traction, turned some prospects into customers and demonstrated that the idea of an open, commodity-based appliance is capable of changing the economics of data warehousing in favor of cost-sensitive buyers. It has done so for buyers across all platforms, even those that are proprietary. This marks the start of the mainstream, middle market, where the technology breaks out into the general-purpose data management market.

At the same time, it is the beginning of the end for special-purpose, proprietary data warehousing systems, which, henceforth, are being renamed legacy appliances. Thanks to database innovations in standard relational technology, going forward, enterprises will need only one kind of database to perform both transactional and BI processes, though it will be common to continue to implement separate instances for reasons of operational efficiency.

What gets this architecture to the next generation is remixing it with an ESB by means of SOA. It is not enough simply to connect everything to the ESB, though that is surely useful and necessary. In order to succeed, SOA must be accompanied by implementations of metadata to rationalize the semantics (the meaning and business context) of the diverse, heterogeneous data stores that are being connected, compared and managed. This is a work in progress, and the future belongs to those vendors that execute on the roadmap.
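At its simplest, such a metadata implementation is a shared mapping that rationalizes how heterogeneous stores name and encode "customer," so that services connected to the bus compare like with like. The source schemas and field names below are hypothetical:

```python
# A minimal sketch of a semantic metadata layer: translate source-specific
# records into one shared business vocabulary. Mappings are hypothetical.
SOURCE_MAPPINGS = {
    "crm":     {"customer_id": "cust_guid", "full_name": "display_name"},
    "billing": {"customer_id": "acct_no",   "full_name": "acct_holder"},
}

def to_canonical(source: str, record: dict) -> dict:
    """Translate a source-specific record into the canonical customer shape."""
    mapping = SOURCE_MAPPINGS[source]
    return {canonical: record[native] for canonical, native in mapping.items()}

print(to_canonical("crm", {"cust_guid": "42", "display_name": "Ada Lovelace"}))
print(to_canonical("billing", {"acct_no": "42", "acct_holder": "Ada Lovelace"}))
```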

One popular megatrend - BI for the masses - will not survive holistic thinking or the deployment of next-generation BI. Be ready for surprises. Simply stated, the masses are not what they used to be. Instead, a diversity of users will make it clear that a next-generation interface is required to deliver on the transformation of raw data into enterprise intelligence. This is the result of a profound realization that generating intelligent information integration at the business level is not a user interface function at all. The real work happens elsewhere - upstream.

The information supply chain will be rendered hyperresponsive to client requirements by means of SOA. Of course, this includes the interface where the value is delivered. A dynamic interface will be able to morph between different classes of users based on profiles, personalization and preferences. Different classes of users will deploy a dynamic, intelligent interface to accommodate their different requirements. Executives will favor scorecards and dashboards, power users will leverage their analytic data cubes and online analytical processing in spite of their latency, specialists will require advanced analytics for predictive analysis, clerk administrators will value standard predefined reporting, and all users will benefit from proactive notification and alerts.
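A toy sketch of that profile-driven interface follows: one rendering path, with the delivered components chosen from the user's role and preferences rather than hard-coded per tool. The roles and component names are illustrative only:

```python
# A hedged sketch of a dynamic, profile-driven BI workspace.
VIEWS_BY_ROLE = {
    "executive":  ["scorecard", "dashboard", "alerts"],
    "power_user": ["olap_cube", "ad_hoc_query", "alerts"],
    "specialist": ["predictive_model_workbench", "alerts"],
    "clerk":      ["standard_reports", "alerts"],
}

def render_workspace(profile: dict) -> list:
    """Morph the workspace to the user's role, preferences and personalization."""
    components = list(VIEWS_BY_ROLE.get(profile["role"], ["standard_reports"]))
    components.extend(profile.get("pinned", []))  # personal preferences
    return components

print(render_workspace({"role": "executive", "pinned": ["regional_sales_map"]}))
```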

The problem of the proliferation of tools will fix itself as the consolidation of the software market continues. Such proliferation is frequently described by economists as the "outsourcing of innovation." As industry analysts have pointed out, the process is not pretty, and the result can be disruptive and discomforting to established vendors. This is good because it keeps everyone on their toes. Of course, the business risk of orphaned products should be an important part of every product purchase decision. Some large enterprises will decide it is worth the risk and increase their total cost of ownership over the long term. This represents a limit to the holistic approach.

No enterprise that I know of has the luxury of "blowing up" its existing BI systems and leaping to the next generation in one giant step. But even if it did, the next-generation BI would be distinguished by features that sound like a wish list in terms of today's technology and business practices. That is how a holistic envisioning of the future shows up.

At the front end, personalization will accommodate different classes of users by means of a dynamic interface that is able to switch business context based on user profile and metadata that reaches back to authoritative data sources. In the middle layer, semantic consistency and dynamic metadata will enable intelligent information integration. At the back end, autonomic computing will facilitate flexibility, administration, maintenance and a low total cost of ownership of data, regardless of where it is persisted (data warehoused); and throughout the system, the ESB will deploy high-performance networks and grid computing to deliver low latency and near real-time answers and updates for business analytics, operational decision-making and basic BI.

Going beyond next-generation BI to "blue sky" BI - that is, BI beyond our wildest imaginings - something like a "semantic chip" will be required to reach holistic BI. No number of database administrators will be able to rationalize all the data and keep it current. The result will drive all the hard work of reconciling the 10 different meanings of customer and product down to the hardware level where results are nearly instantaneous. This implies solving some wicked problems such as generating original and accurate utterances in natural language, which will still provide the most powerful BI media (and interface). There is no transparent and direct path, even in the imagination, between the current BI market and a semantic chip. Getting there will require developments that are still several breakthroughs into the future, but, thanks to innovations in architecture such as SOA, appliance-like data warehouses and business analytics, we can be confident they are coming. 


  1. Philip Howard. "Second-Generation BI." Bloor Research, 18 January 2007.
  2. Lou Agosta. "Data Warehousing Raises the Bar on SOA." May 2006.
