
Two Compelling Metadata Strategies

  • February 16, 2006, 1:00am EST

During the past six years, our organization has focused on delivering enterprise metadata under a strategy that has served us well. This month I want to do the "Full Monty" and provide a deep dive into the specifics of that strategy. For many years, I struggled with how to describe our model and often failed to communicate it well. In the December 2005 issue of Harvard Business Review, Geoffrey Moore describes two compelling business strategies pursued by companies such as IBM, Intel and Amazon. The complex solution model creates value by providing a highly customized solution, while the standardized model creates value by serving a large population of users. Figure 1 provides a view of these two strategies, with several modifications to Moore's original diagram that make it easier to apply to the metadata world.

Figure 1: Complex and Standardized Metadata Solutions

Complex/Custom Metadata Solution

This strategy should look very familiar because the vast majority of metadata implementations are designed around this kind of framework. In fact, our original charter was to implement metadata in this very fashion. The complex strategy focuses on a single customer, group, project or program. For example, suppose you are building an enterprise database that will be fed by multiple applications and, in turn, feed several data warehouses. You will focus your metadata effort on this single project as well as on a specific set of users - in most cases, the data stewards, data modelers and high-level business analysts. Moore used a pyramid to illustrate this single-customer focus, with products and services flowing upward. This type of implementation requires a heavy dose of consulting services, which makes sense because the solution is highly complex. After complaining for years about the lack of usability in the repository world, I have come to believe that there must be some economic reason for building metadata systems that are difficult to use. In truth, the difficulty of use is a direct result of the amount of functionality the custom solution requires. You see this in most implementations that focus the repository on integrating the data warehouse and extract, transform and load (ETL) environments. The end result is a highly functional and complex environment that requires heavy support from the metadata organization. Additionally, the product is focused on a single technology space, such as database metadata, which makes integrating this information with other vertically focused repository solutions problematic - especially when the core meta-models have little or no commonality, as is the case today.
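
To make the meta-model commonality problem concrete, here is a minimal, purely hypothetical sketch of the same physical table as two vertically focused repositories might describe it. Every field and value is invented for illustration, and the hand-crafted mapping shows why reconciling the two is problematic.

```python
# Hypothetical example: the same physical table described by two vertically
# focused repositories whose meta-models share almost no common attributes.
# All names and fields below are invented for illustration only.

warehouse_repo_entry = {
    "ObjectType": "RelationalTable",
    "ObjectName": "CUSTOMER_DIM",
    "Steward": "jdoe",
    "SourceSystem": "billing",
}

etl_repo_entry = {
    "asset_kind": "target_table",
    "physical_name": "customer_dim",
    "load_job": "wf_load_customer",
    "owner_email": "jdoe@example.com",
}

def reconcile(warehouse: dict, etl: dict) -> dict:
    """Merge the two views; every attribute mapping must be hand-crafted
    because the meta-models have little or no commonality."""
    return {
        "name": warehouse["ObjectName"].lower(),   # or etl["physical_name"]?
        "owner": warehouse["Steward"],             # vs. etl["owner_email"]
        "lineage": [warehouse["SourceSystem"], etl["load_job"]],
    }
```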

The good news is that this type of solution has a high potential payoff, with a huge return on investment (ROI) if done well. Many organizations have had success with the data warehouse-type metadata solution. However, there is also huge risk associated with this strategy. Not only is the organization dependent on you getting the metadata right, but what happens if the enterprise database effort fails to deliver the bottom-line value promised in the business case? Or, better yet, you nail the project requirements and they move the operations to India. Life in the world of metadata has never been easy. Expanding to other data sources is fairly easy, but bringing in new asset types is nearly impossible. The bottom line is that this strategy requires a tremendous amount of handholding and time focused on a single point of delivery.

Standardized Metadata Solution

What would happen if you took the typical complex implementation strategy and flipped the model upside down? Would it be possible to eliminate the risks, costs, complexities and other issues without creating new ones? The complex model focused on a small set of users and created value by enabling those few users to be more productive. Clearly, the simplified model cannot deliver that same value to those users, but perhaps we can deliver smaller increments of value to a much larger community. That means we can add value exponentially rather than linearly - but how?

The first step is to simplify the product and service offering. You wouldn't want to start cutting products or services without any logic, so turn to your application logs and usage information. Which information components or functional utilities are the least used within your environment? Don't make assumptions, such as that the physical definitions are surely being used by the development community - especially when most developers prefer the data management applications that come with the database system. How often does someone update the conceptual models? How often are the relationships that enable impact analysis updated? Cutting services can be much harder than creating them. Once you have your systems scaled to the point that a usability study doesn't fall apart in the first 10 minutes, you are ready to create lasting value across the board. For dramatic effect, let's use the 80:20 rule: can you cut out the 80 percent of the crap that doesn't add value so you can focus on the 20 percent that does? Metadata groups aren't the only organizations that should be doing this; many other businesses reduce their product lines and focus their attention on specific value points.
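
As a rough illustration of that first step, the sketch below ranks repository features by recorded usage and flags everything outside the most-used 20 percent as a candidate to cut. It assumes the repository can export a usage log with one row per feature hit; the file name and column are hypothetical.

```python
# Hypothetical usage analysis: rank repository features by how often they are
# actually touched, then list the long tail as candidates for removal.
from collections import Counter
import csv

def feature_usage(log_path: str) -> Counter:
    """Count hits per feature from a usage log with a 'feature' column."""
    counts = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["feature"]] += 1
    return counts

def candidates_to_cut(counts: Counter, keep_fraction: float = 0.2) -> list:
    """Keep the most-used keep_fraction of features; the rest are cut candidates."""
    ranked = [feature for feature, _ in counts.most_common()]
    keep_n = max(1, int(len(ranked) * keep_fraction))
    return ranked[keep_n:]

usage = feature_usage("repository_usage.csv")   # assumed log export
print(candidates_to_cut(usage))
```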

When I was growing up in Columbus, Georgia, we had a barbecue restaurant that we loved to visit. Just over the river in Phenix City, Alabama, was a small place called "Chicken Comers." The barbecue and sauce were the best we ever had, but they served only one dish: barbecue, white bread and a can of Coke. Talk about a simplified product line. Chicken Comers understood that complexity begins in the product or service but then infiltrates every facet of the organization. The more complex the metadata offering, the more complex the business processes, meta-model, quality assurance and communications.

The primary customer-facing components of this model are customer self-service and branding. Clearly, the biggest payoff occurs when end users can service their own metadata with the solutions you create. Remember, users are not only producers of metadata but also consumers, so the self-service model may apply to either or both groups. In some cases, only the consumption side can be automated into the self-service business model. A few years ago, Carly Fiorina talked about how businesses will create value not by vertically automating business processes but by horizontally integrating them. The same is true of any metadata implementation or technology deployment: vertical integration creates only limited value, while horizontal or diagonal innovation can create value exponentially.
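
To show what the self-service idea might look like in practice, here is a minimal sketch of a registry in which producers publish their own definitions and consumers look them up directly, with no ticket to the metadata team. The class and field names are assumptions for illustration, not any vendor's API.

```python
# Minimal self-service registry sketch: producers publish, consumers look up.
# Names and fields are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MetadataEntry:
    name: str
    definition: str
    steward: str
    updated: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class SelfServiceRegistry:
    def __init__(self) -> None:
        self._entries: dict[str, MetadataEntry] = {}

    def publish(self, entry: MetadataEntry) -> None:
        """Producer path: stewards register or update their own definitions."""
        self._entries[entry.name] = entry

    def lookup(self, name: str) -> MetadataEntry | None:
        """Consumer path: anyone can resolve a term without a support ticket."""
        return self._entries.get(name)

registry = SelfServiceRegistry()
registry.publish(MetadataEntry("customer_id", "Unique key for a billed customer", "jdoe"))
print(registry.lookup("customer_id"))
```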

What about branding? Branding wasn't very important when you had a small user base and could basically walk each user through the application step by step. Unfortunately, that model doesn't scale very well in today's business environment. Branding is about managing the perceptions of your information, processes and applications. How will users know what's available, who provided the information, how the information is being used, what relationships exist, and where to go to get it if you don't tell them? Communicating the value and utility of the metadata environment is critical to achieving the levels of ROI organizations demand.

This strategy won't work for everyone, but if you want to get a handle on enterprise asset management, the standardized strategy is a great start. Another point of view is that the core infrastructure and standards of the complex model have never matured to the point where we can actually expand their utility. On the other hand, dramatic advances are being made in knowledge management, content management, metadata management, search technologies, the semantic Web and many other areas that will speed the adoption of the standardized model. The more organizations adopt this type of strategy, the greater their business agility. Technology professionals are beginning to see and understand that metadata is the integration point for the next shift in business value.

In many ways, it is difficult to understand how reducing functionality, simplifying the user interface, automating business processes, implementing metrics and integrating the repository into the architecture creates value. The truth is that in the complex model, these activities might not create value and may very well destroy it. On the other hand, roughly 70 percent of metadata implementations fail to meet the expectations of executives, and I can't help but wonder if they simply implemented the wrong strategy. In the end, the strategy you choose will come from reviewing your requirements and deciding how and where the business value of metadata will be created. The good news is that the two strategies will eventually merge, thanks to continuous improvements in usability, standardization of metadata exchange, automation of development tools, and the emergence of governance methodologies such as ITIL.
