The traditional supply curve indicates the lowest price at which suppliers are willing to sell their product; as prices rise, sellers are willing to supply more of it. The seller in the world of metadata is the producer of metadata information: the individuals who work tirelessly to create assets and define the metadata information are the ones who produce the value-add. Traditional supply curves put price on the vertical axis and quantity on the horizontal axis. Price doesn't make much sense in our world, where we provide products and services inside the organization, but we can replace the price component with a cost or value-add one. Does this pass the common-sense test? Let's see: the more value the metadata effort creates, the more willing producers are to turn over the responsibility of metadata management. In a world where business units can choose between building distributed metadata management environments and utilizing a centralized metadata management group, the value-add must be substantial.

How you define the quantity of assets remains debatable. Does a logical model count as one asset or as 250 entities and attributes? Assets can always be deconstructed into smaller reusable subcomponents. Some areas, such as traditional database metadata, can generate hundreds of thousands of elements, while Web services may produce fewer than a hundred. That said, a customer who spends only $100 in your store is still a customer, and an asset is an asset, regardless of its size or value. The key is to be consistent in your accounting methods.
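To make the accounting point concrete, here is a minimal sketch of an element-level counting policy; the asset types, the counts and the inventory structure are hypothetical, invented purely for illustration.

    # Hypothetical asset inventory, counted consistently at the element level.
    # The asset types and counts below are invented for illustration only.
    inventory = {
        "logical_model_entities_and_attributes": 250,  # one model counted at element grain
        "database_columns": 180_000,                   # traditional database metadata
        "web_service_operations": 85,                  # Web services yield far fewer elements
    }

    total_assets = sum(inventory.values())
    print(f"Total assets under management: {total_assets:,}")

Whatever grain you choose, whether entity, attribute or whole model, the essential discipline is to apply the same grain every year so that cost-per-asset trends remain comparable.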

Figure 1: Supply Function, Demand Function and the Equilibrium Point


Changes in value and quantity simply move you up and down the supply curve, while changes in external factors actually shift the curve. These shifts create value-generation opportunities that can substantially lower costs or expand the asset portfolio. Several factors can shift the curve, including other groups' metadata efforts, improved technology, future expectations from projects and expanded metadata standards. Suppose for a moment that we automated the extraction of metadata from the system development life cycle (SDLC). The supply curve would shift to the right because we could manage a larger number of assets with the same cost structure. In addition, the curve may flatten toward horizontal because the incremental cost of each additional asset is lowered as well.
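As a rough illustration of what automated extraction can look like, the sketch below harvests column-level metadata from a relational database using Python's standard sqlite3 module; the "warehouse.db" file and its tables are assumptions made for the example, not part of any particular environment.

    # Minimal sketch: harvest column-level metadata automatically instead of keying it in.
    # Uses only the standard library; "warehouse.db" and its tables are hypothetical.
    import sqlite3

    conn = sqlite3.connect("warehouse.db")
    cursor = conn.cursor()

    assets = []
    cursor.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    for (table,) in cursor.fetchall():
        cursor.execute(f"PRAGMA table_info({table})")          # column metadata per table
        for _cid, column, col_type, *_rest in cursor.fetchall():
            assets.append({"table": table, "column": column, "type": col_type})

    conn.close()
    print(f"Harvested {len(assets)} column-level assets with no manual entry")

Every asset captured this way arrives at essentially zero marginal labor, which is exactly the rightward shift in the supply curve described above.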

Perhaps the open question on the supply side is: how do we define value-add? From the producer's or supplier's point of view, it could be defined by the content value equation. The costs of a metadata implementation fall into several high-level categories: fixed, variable, opportunity and sunk. Fixed costs are ongoing expenditures that do not change regardless of the quantity of assets managed; they might include salaries, office equipment, software or hardware. Variable costs rise and fall with the quantity of assets governed in the environment. Opportunity costs represent what you could have done with the same capital. Sunk costs are capital you have already spent on the effort. The biggest benefits of a centralized implementation come from leveraging sunk costs such as shared infrastructure, common solutions and a deep pool of expertise.
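To see how the categories play out in a forward-looking budget, here is an illustrative cost model; the dollar figures and the per-asset variable rate are assumptions, not numbers from this column.

    # Illustrative forward-looking budget model; every figure here is an assumption.
    FIXED_COSTS = 450_000          # salaries, office equipment, software, hardware
    VARIABLE_COST_PER_ASSET = 8    # stewardship and quality work per governed asset

    def total_cost(assets_managed: int) -> float:
        """Next year's budget: fixed plus variable costs.

        Opportunity costs shape the investment decision and sunk costs are
        already spent, so neither is summed into the forward budget here."""
        return FIXED_COSTS + VARIABLE_COST_PER_ASSET * assets_managed

    print(f"Budget for 12,000 assets: ${total_cost(12_000):,.2f}")

Opportunity and sunk costs still matter when judging the investment as a whole, but in this simple model only the fixed and variable components flow into next year's budget line.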

As stated, the costs are the summation of all expenditures associated with the metadata effort. For simplicity's sake, let's assume that we are in the third year of a multiyear implementation and are producing a budget for the next year. Adding up the entire cost structure, we arrive at a grand total of $550,000. Is that too much to spend on metadata and the utility of asset management? If we also assume a total of 11,750 assets in the collection, then our cost per asset is roughly $46.81. Ideally, we want to drive this cost per asset as low as possible, either by increasing the number of assets or by lowering the overall cost of metadata management. Keep in mind that the costs discussed here are the supplier's costs: what costs would producers incur, or avoid, by doing business with your organization? The Mercury News published an interview with Randy Mott, CIO of HP, in which he commented on lowering the cost of IT. Mott aggressively aims to drive the company's IT costs down from about four percent of revenue to about one and a half percent over the next four years or so.1 That one and a half percent is an excellent number to shoot for in managing the assets of the organization with metadata.
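The per-asset arithmetic is easy to verify; the sketch below uses the same $550,000 budget and 11,750-asset inventory from the example above.

    # Cost per asset for the year-three budget discussed above.
    total_budget = 550_000
    assets_in_collection = 11_750

    cost_per_asset = total_budget / assets_in_collection
    print(f"Cost per asset: ${cost_per_asset:.2f}")   # roughly $46.81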

Our goal should be to drive those costs down or increase our asset inventory without increasing the total cost to the supplier. Figure 2 provides a view of the average cost as the quantity of assets increases.

Figure 2: Cost Function of Metadata Value


Notice that as the quantity of assets increases, the average cost approaches zero. While this may seem impossible, millions of assets across the enterprise could be tracked with a solid metadata program. Here is one of the major components of the value equation that most IT professionals miss: once the methodology, infrastructure and business processes are in place, we can begin to gain exponential benefits from metadata management at only incremental cost. Metadata is similar to building software, music or video games, where the first copy may cost millions in development, marketing and design, but each additional copy costs only pennies to produce. This holds true only up to a point in our world; some element of manual activity is always required, so unlimited expansion is impossible.
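A quick way to see the shape of Figure 2 is to tabulate average cost using the hypothetical fixed and variable figures from the earlier cost-model sketch; with a small per-asset variable cost the average falls steeply and flattens just above that variable cost, and in the limiting case of negligible marginal cost it approaches zero, which is the behavior Figure 2 depicts.

    # Average cost per asset, reusing the hypothetical figures from the cost-model sketch.
    FIXED_COSTS = 450_000
    VARIABLE_COST_PER_ASSET = 8

    def average_cost(assets_managed: int) -> float:
        return (FIXED_COSTS + VARIABLE_COST_PER_ASSET * assets_managed) / assets_managed

    for quantity in (1_000, 10_000, 100_000, 1_000_000):
        print(f"{quantity:>9,} assets -> ${average_cost(quantity):,.2f} per asset")
    # The average falls from $458.00 toward the $8 variable cost as quantity grows.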

Metadata programs should drive the cost per asset as low as possible. Every year, we set a goal of increasing the content inventory by 15 percent and lowering the cost per asset by 10 percent. Clearly, content inventory is only half the equation, and usage must be considered at every stage of the investment. While those targets don't sound all that impressive, a 10 percent annual reduction compounds to roughly a 50 percent drop in cost per asset over seven years.
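The compounding is simple to check; the sketch below applies the two annual targets to the earlier hypothetical starting point of 11,750 assets at $46.81 each.

    # Compound the annual targets from a hypothetical starting point.
    assets = 11_750
    cost_per_asset = 46.81

    for year in range(7):
        assets = round(assets * 1.15)        # grow content 15 percent per year
        cost_per_asset *= 0.90               # cut cost per asset 10 percent per year

    print(f"After seven years: {assets:,} assets at ${cost_per_asset:.2f} each")
    # 0.90 ** 7 is about 0.478, so the cost per asset is cut roughly in half.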

Focusing on the supply of metadata information is only half the story. The demand side of the equation is critical to the long-term success of the metadata program. Having a great supply of metadata information is terrific, but without demand the program will eventually fail. Ironically, having solid demand but low supply creates opportunity. In my next column, I will review the demand equation and the usage value generated by the consumers of metadata information.

Reference:

  1. Wong, Nicole C. "Catching Up with Randall D. 'Randy' Mott: Less Is More for HP's Information Chief." The Mercury News, June 2006.
