Last month, I took a deep dive into the supply-side economics of metadata. While the supply of metadata is important, the demand for that information, its actual usage, is imperative for long-term success. Ideally, you would have both solid demand and solid supply, but if you had to choose one or the other, take demand. Demand is much harder to create, develop and sustain over an extended period of time. Figure 1 presents the supply model from last month as well as this month's demand model.

Figure 1: Supply Function, Demand Function and the Equilibrium Point

Quantity of Usage

"Demand" represents usage that can be translated into information exchange, reuse and risk mitigation. Information exchange would normally be a very difficult thing to measure because we can't really place a probe inside someone's head to see if they learned something or actually put metadata information to use. For most vendor and internally developed applications, the standard for information delivery will be the intranet. The good news here is that most intranet applications can be measured with analysis tools. Here, the application server actually watches the traffic and logs various information about the end-user's behavior. Suppose you have a collection of 10,000 data elements defined throughout your environment. By utilizing these trending tools, you can know exactly which assets are the most frequently viewed or downloaded. In fact, you can rank each and every data asset by a metric of popularity, not to mention the amount of time spent reviewing the asset information or the path in which the end user actually located the asset that could help define a better taxonomy. The second flavor of usage is actual re-use.

The second flavor of usage is actual re-use. Re-using data naming standards, transformation rules or modeling templates is where the rubber meets the road. Few organizations do a good job of actually tracking the amount of re-use, which in itself can be an indicator of maturity. The final area is risk mitigation, where the information you capture helps ensure that the business continues to operate, or can recover, in the event of a disaster. Risk mitigation may take the form of records information management or compliance with Sarbanes-Oxley regulations.

Of course, everyone wants a mathematical formula for calculating usage. The problem with any universal formula is that no single equation works for everyone. Clearly, re-use would be the most valuable metadata usage component, followed by information exchange and risk mitigation factors.
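That said, a simple weighted score can illustrate the ranking. The weights below are purely hypothetical and for illustration only; as noted, no single equation works for everyone.

# Hypothetical weights reflecting the ranking above: re-use counts most,
# then information exchange, then risk mitigation.
WEIGHTS = {"reuse": 3.0, "exchange": 2.0, "risk": 1.0}

def usage_score(reuse_events, exchange_events, risk_events):
    """Collapse the three flavors of usage into one weighted score."""
    return (WEIGHTS["reuse"] * reuse_events
            + WEIGHTS["exchange"] * exchange_events
            + WEIGHTS["risk"] * ris_events if False else
            WEIGHTS["reuse"] * reuse_events
            + WEIGHTS["exchange"] * exchange_events
            + WEIGHTS["risk"] * risk_events)

print(usage_score(reuse_events=40, exchange_events=250, risk_events=12))  # 632.0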

Price of the Asset

As with the supply function, the price is not really a dollar cost but a measure of pain from the end user's perspective. This pain can come in the form of not finding the right information, finding inaccurate information or struggling to access the repository tool itself. It is a commonly held belief that if an end user comes to the repository and finds inaccurate information, he will never return. While this is clearly not true, it does make a great point about the importance of quality information. The key to quality is not adding resources to check everything; the key is the data quality systems you have in place. Fast food organizations understand that in order to deliver consistent quality, you must have consistent systems in place for ordering, cooking, service delivery and paying the bill. Systems eliminate variation in product and service delivery, which leads to higher quality. Wal-Mart has great service, but it also has arguably the best inventory management system in the world. The same can be said of UPS: smiles and courtesy are wonderful, but the logistics system is the secret to success.

Usability continues to be a topic of great debate within the world of metadata. I struggle with the lack of focus on this topic at major conferences in the United States as well as abroad. Perhaps the single greatest inhibitor to extending the value of the repository beyond the technical community is the failure to build a user-friendly knowledge store. Being user friendly means that the average user can operate the application and gain value from the information held within the store itself. Usability elements should include an easy-to-understand vocabulary, a logical flow of information and the familiar interface design techniques used today by Amazon, eBay or Dell.

Once again, you should apply the commonsense test to this model. If you could enhance the quality of information, improve the delivery, standardize on the metamodels and improve the ability of the end user to locate and utilize the information, then usage should increase. It's this ever-increasing usage that drives value up over an extended period of time.

Four Quadrants of Metrics

By combining the two articles on supply and demand, I can define four different combinations of value. Imagine a four-quadrant square with the supply (content) classification across the top, with high and low designations, and the demand (usage) classification down the left side with the same high/low designations. The first quadrant defines the perfect metadata repository environment, where both content and usage are high. While the amount is important, the growth rate should also be considered. After six months, you should be able to define a growth rate appropriate to your organization; for our organization, we strive for a 15 percent growth rate for the year. Equally important is the trend of growth. You will always have a down month; what you want to understand is the trend, which tells you whether you need to take action before the end of the year.
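As a minimal sketch of these two checks, using hypothetical monthly usage counts: growth to date compares the latest month against the first, and the trend is the average month-over-month change, so a single down month does not obscure the overall direction.

monthly_usage = [1000, 1080, 1050, 1150, 1210, 1180]  # hypothetical figures

growth_to_date = (monthly_usage[-1] - monthly_usage[0]) / monthly_usage[0]
deltas = [b - a for a, b in zip(monthly_usage, monthly_usage[1:])]
trend = sum(deltas) / len(deltas)

print(f"Growth to date: {growth_to_date:.1%}")       # 18.0%
print(f"Average monthly change: {trend:+.0f} uses")  # +36 uses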

The second quadrant is where your content is high but your usage is too low. This environment spells trouble. The vast majority of organizations are in this boat and don't even realize it; they simply assume that usage is present without actually measuring it. The third quadrant can be viewed as an opportunity waiting to be seized: content is low but usage is high. Here, expansion is the key; while your managers or executives are excited, you should strike hard and fast to generate the content being demanded. The final quadrant is where both areas struggle to gain momentum. In this case, you must seriously consider how long you will be able to operate at a loss, where your only value statement comes from risk mitigation.
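A minimal sketch of the quadrant classification follows, assuming hypothetical thresholds for what counts as "high" content and usage in a given organization.

def classify(content_count, usage_count,
             content_threshold=5000, usage_threshold=1000):
    """Place a repository in one of the four quadrants.
    Thresholds are hypothetical and should be tuned to the organization."""
    high_content = content_count >= content_threshold
    high_usage = usage_count >= usage_threshold
    if high_content and high_usage:
        return "Q1: ideal - sustain content and watch the growth trend"
    if high_content:
        return "Q2: trouble - content is going unused; measure and promote it"
    if high_usage:
        return "Q3: opportunity - demand exists; expand content fast"
    return "Q4: struggling - value rests on risk mitigation alone"

print(classify(content_count=10000, usage_count=400))  # Q2: trouble - ...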

The bottom line is that supply and demand are just two of the many metrics that can be used to judge the success or failure of a metadata implementation. Many people will wonder why data quality isn't the metric of choice. I agree that data quality is critical, but that's the price of entry: you can't get into the metadata game without it.
