
Reports of the Demise of Meta Data Are Premature

  • March 01 2001, 1:00am EST

Reports of the demise of meta data are premature. It is now old news (September 25, 2000) that the Meta Data Coalition (MDC) closed up shop and merged its work into the Common Warehouse Metamodel (CWM). However, the glass is not half empty; it is half full. This is progress, not a breakdown; nothing went wrong here. No one needs two standards. Meta data consolidation will lead to useful results: end users will see incremental tool and system interoperability, gains in economies of scope, and reductions in coordination costs. The technology is still alive and well (or at least kicking vigorously); and, yes, XML will show up like the cavalry to rescue the tool-interoperability wagon train.

Though this is not a tutorial on the two standards, some context is required. Readers may recall that the two major differences were the repository metamodel and the meta data interchange standard. The CWM exploited the Meta Object Facility (MOF) of the Object Management Group (OMG) instead of the proprietary (i.e., patented) Microsoft Repository Technology Interface Model (RTIM). The CWM made use of an innovative idea, streaming XML Metadata Interchange (XMI), instead of the pre-XML meta data interchange sometimes called the XML Interchange Format (XIF). Otherwise, a strong case can be made that CWM was a superset of the MDC standard and included such elements as the MDC's Open Information Model (OIM), with the OIM's relational schema, data cube and the extract, transform and load (ETL) process. The CWM took the latter and added hierarchical, nonrelational data stores such as IBM's IMS and Hyperion Essbase (arguably already included in the data cube) as well as scheduling processes and business procedures as further forms of meta data. In this context, it should be noted that "superset" means "containing functionally equivalent classes or mechanisms to which the subset can be logically reduced," because, strictly speaking, OIM is now obsolete.

At the same time, meta data technology continues to play catch-up with end-user requirements for all manner of data, application and system integration: architecture, database, workflow, B2B, you name it. Meta data remains a genuinely difficult problem. This is not anything going wrong; this is the complexity of the technology. If we could solve the problem of meta data, then we would know how to make systems interoperate. That requires step-by-step progress, not a single act of breaking open the shrink wrap.

In conversations with Oracle representatives about the advanced meta data services of Oracle Warehouse Builder (OWB), they indicated XMI would be the interchange mechanism going forward. A single step of progress is in the offing as Oracle continues to integrate the acquired legacies (in the better sense of the word) of the OneMeaning Repository and the Carleton ETL suite of tools. In particular, OWB has bridges to Discoverer, Express and Designer. As Oracle further integrates the OneMeaning Repository, bridges to such design tools as PowerDesigner and ERwin will become available, using streaming XML based on XMI to exchange content between tools and local repositories.

Meta data technology is still very much alive; and, as indicated earlier, a particular implementation of XML looks rather like the cavalry riding to the rescue of meta data. The strategy is to use any meta data repository or tool that can encode and decode XMI streams to exchange meta data with other repositories or tools that have the same capability. Thus, there is no need for products to implement the MOF-defined CORBA interfaces or even to "speak" CORBA at all. So yes, XML does represent a practical technology for defining the interoperability of tools.
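To make the encode/decode strategy concrete, the sketch below writes one relational table's meta data into a minimal XMI-style XML stream and reads it back with nothing more than a standard XML parser. The tag and attribute names here are simplified, hypothetical stand-ins, not the normative CWM/XMI vocabulary, which uses the full OMG class names and namespaces.

```python
# Minimal sketch of XMI-style meta data interchange. The element names
# ("Table", "Column") are simplified stand-ins for the CWM Relational
# package, assumed for illustration only.
import xml.etree.ElementTree as ET

def encode_table(name, columns):
    """Encode one relational table's meta data as an XMI-style XML string."""
    root = ET.Element("XMI", {"xmi.version": "1.1"})
    content = ET.SubElement(root, "XMI.content")
    table = ET.SubElement(content, "Table", {"name": name})
    for col_name, col_type in columns:
        ET.SubElement(table, "Column", {"name": col_name, "type": col_type})
    return ET.tostring(root, encoding="unicode")

def decode_tables(xmi_stream):
    """Decode the stream back into {table: [(column, type), ...]} form."""
    root = ET.fromstring(xmi_stream)
    return {
        table.get("name"): [(c.get("name"), c.get("type")) for c in table]
        for table in root.iter("Table")
    }

stream = encode_table("CUSTOMER", [("CUST_ID", "NUMBER"), ("NAME", "VARCHAR2")])
print(decode_tables(stream))
# → {'CUSTOMER': [('CUST_ID', 'NUMBER'), ('NAME', 'VARCHAR2')]}
```

Any two tools that agree on such a shared vocabulary can round-trip meta data this way without sharing a repository or speaking CORBA, which is precisely the appeal of the XMI stream approach described above.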

However, the market is practical, and ETL vendors such as Informatica, Informix/DataStage, ETI and Sagent have stolen a march on the standards-based approach. They have captured an important position in the meta data narrative by sitting at the point where transactional data is mapped to decision-support data warehouses. They, too, are working on XMI-oriented meta data processes. The intensity of that work is gated by such bootstrap effects as the lack of other tools with which to exchange meta data. Naturally, a standards-based approach, such as the one Oracle is announcing, is precisely the sort of catalyst needed to create a network of interoperable tools. In the meantime, end users will continue to be challenged by the prospect that everything needed seems (as usual) to be shipping in the next release.

Even though the standards process is never as quick as required, the CWM specification is arguably complete enough for vendors to implement it. XMI is a method of protecting investments already made in ETL, design and data presentation tools. The recommendation to clients is to grill their tool vendors at the front, middle and back ends about their plans to accommodate such standards-based meta data interchange. The resulting tool interoperability will mean both more end-user options and more productivity. This win/win cycle is the whole point of meta data.
