Suppose you and your spouse are vacationing in Florida, taking a romantic stroll along the Emerald Coast. The night sky is brightly lit by a full moon, and the waves are gently breaking along the shore. You turn your glance toward your best friend as the two of you enjoy one of the rare moments without the daily pressure of kids and work. As you admire the night sky, you see an airplane gliding along the horizon, the moon and stars shining brightly. Too bad this entire scenario is only a collection of moments in time assembled by our senses. The images you see are really just your mind assembling a perception of reality, a mirage of time passing. So how old are these images that we see?

  • Your Spouse: A few billionths of a second
  • The Waves: Less than a millionth of a second
  • The Distant Airplane: Less than a thousandth of a second
  • The Moon: About a second and a quarter
  • The Stars: Years, or even thousands of years
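
To put rough numbers on this, here is a quick back-of-the-envelope sketch in Python. The distances are my own illustrative guesses, not measurements; only the speed of light is fixed.

```python
# The "age" of each image is just distance divided by the speed of light.
# The distances below are illustrative assumptions, not measurements.
C = 299_792_458  # speed of light, meters per second

objects = {
    "Your spouse":          2,              # a couple of meters away
    "The waves":            50,             # mid-surf
    "The distant airplane": 20_000,         # near the horizon
    "The Moon":             384_400_000,    # average Earth-Moon distance
    "The stars":            4.2 * 9.46e15,  # Proxima Centauri, ~4.2 light-years
}

for name, meters in objects.items():
    print(f"{name}: image is {meters / C:.2e} seconds old")
```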

A couple of thoughts emerge from this concept. Do the stars that you see really exist today? Not necessarily; by the time their light gets here, a star could have burned out and collapsed into a black hole. Take another example: we know that the tides are a result of the moon's gravitational pull. What would happen if I waved a magic wand and the moon instantly disappeared? Would the moon vanish from the sky first, or would the tides begin to fall? This question is really asking which is faster, light or gravity. The answer doesn't matter here, and I will let you look it up if you are curious. The main point is that the mind processes information about the objects we see and maps it onto our three-dimensional world (although some scientists suggest there are as many as 10 dimensions). Every object we see is assumed to exist in the present, at a position defined by height, width and depth, which describes a location we can then act upon. Because most objects are only billionths of a second away, the element of time rarely comes into play, and we can assume that reality is simply what surrounds us and our environment.
What does this have to do with meta data? In many ways, meta data acts as our brain does, assembling images of our environment and painting a picture of reality within the organization. Meta data shares many characteristics with those light images: we can review it to determine the dimensions of data within the organization. How, then, can we ensure the accuracy of the view we see of our organization's technical assets? The key is simple: distance remains the critical theme in ensuring truth in reality. In computing, distance usually only makes sense in the communications arena, but in meta data we can measure the distance that exists by comparing the source of the asset to its meta data representation. An asset whose meta data is fully contained within it is said to be in close proximity to its source. A great example of this is a physical database, such as an Oracle or SQL Server table definition. Not only is the meta data held within the asset's own environment, but any modification to that meta data has an immediate impact on the object's behavior. When people discuss active meta data, they are trying to build an environment where the meta data is as tightly coupled with the asset as possible.

Before we go off and celebrate nirvana, a few issues should be touched on. First, if the vendor or environment doesn't build this in, your ability to integrate can be limited and very expensive. In addition, making changes without some level of control can be dangerous, as many DBAs can attest. Finally, you are constraining your ability to react to the future of technology.
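
To make the close-proximity case concrete, here is a minimal sketch of the idea, using Python's built-in sqlite3 module as a stand-in for Oracle or SQL Server. The table and column names are invented for illustration; the point is that the table definition (the meta data) lives inside the database itself, so changing it alters the object's behavior immediately.

```python
# The meta data (the table definition) lives inside the asset itself:
# change it, and the object's behavior changes immediately.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

# Read the meta data straight from the database's own catalog.
ddl = db.execute(
    "SELECT sql FROM sqlite_master WHERE name = 'customer'"
).fetchone()[0]
print(ddl)  # CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)

# Modify the meta data; there is no separate copy to fall out of date.
db.execute("ALTER TABLE customer ADD COLUMN region TEXT")
columns = [row[1] for row in db.execute("PRAGMA table_info(customer)")]
print(columns)  # ['id', 'name', 'region']
```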

Now, if we add a little distance between the asset and its representation, we can see how logical and physical models come into play. Our understanding of the asset increases as the distance increases. Logical and physical models are not directly connected to the physical database, and they thus introduce the concept of distance and, with it, the possibility of inaccuracy. Yet while the possibility of inaccuracy increases slightly, the understandability of the meta data increases exponentially. Walk up to your television set, stand about two inches from the screen, and what do you see? Pixels! When you back up and introduce distance, your mind blends, or averages, those pixels into images that can be understood. Modeling does the same thing: introduce some distance and you increase the understanding.
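
As a hypothetical illustration of what that distance costs, the following sketch compares a model's view of a table (meta data maintained outside the database) against the live catalog. The model and table here are invented; the point is that once the representation lives at a distance from the asset, drift becomes possible and has to be checked for.

```python
# A hypothetical drift check: the model's column list is maintained at a
# distance from the physical table, so the two can disagree.
import sqlite3

model = {"customer": ["id", "name", "region"]}  # the modeled meta data

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")  # 'region' never deployed

for table, modeled_cols in model.items():
    actual_cols = [row[1] for row in db.execute(f"PRAGMA table_info({table})")]
    drift = set(modeled_cols) ^ set(actual_cols)
    if drift:
        print(f"{table}: model and database disagree on {sorted(drift)}")
```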

Moving toward an enterprise meta data implementation introduces additional distance but expands the coverage of assets well beyond data. As with the modeling example, does the increase in distance (and, hence, the increased probability of error) outweigh the increased utility? This is a difficult question that requires you to really think about your meta data strategy. Our organization works very hard to ensure the accuracy and quality of the meta data housed within the repository collection. Is 93 percent accuracy good enough for your organization? Random audits of our collection produce error rates of around one to three percent, but the effort required to get that close is hard to justify. Difficult to justify or not, I have no problem expanding the scope of assets, increasing the level of integration and extending the core meta-model, as long as the accuracy rate stays in the mid-90s. How about your implementation? If senior management is demanding 100 percent accuracy, don't plan on moving beyond database system meta data.
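
For what it's worth, a random audit of the kind described above could be sketched roughly as follows. The repository structure and the source lookup are assumptions for illustration, not a description of our actual tooling.

```python
# A sketch of a random meta data audit: sample repository entries,
# compare each to its source of record, and estimate the accuracy rate.
import random

def audit(repository, fetch_from_source, sample_size=100):
    """repository: dict of asset name -> recorded meta data.
    fetch_from_source: callable returning the live meta data for an asset."""
    sample = random.sample(list(repository), min(sample_size, len(repository)))
    errors = sum(1 for asset in sample if repository[asset] != fetch_from_source(asset))
    return 1 - errors / len(sample)

# accuracy = audit(repo, fetch_live_definition)  # aim to stay in the mid-90s
```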


Figure 1: Distance's Impact on Our Perceptions

Figure 1 provides a view of our discussion. Most of us agree that increasing the level of abstraction and moving toward a model-driven architecture is a great value-add activity. Remember, adding distance doesn't break the connection between asset and representation; it only stretches it. Why, then, are so many people telling us that true value from meta data can only be obtained when the distance is very short? This perception results from the inability to map business value to an implementation with any distance. That is unfortunate, since most organizations will opt out of an enterprise implementation in favor of a smaller-distance one. Yet time and time again we review the ROI of these efforts and come away feeling shortchanged. The future ROI opportunities will not come from meta data hidden deep within a database or Web page. Take a look over the horizon at what's coming:

  • Sarbanes-Oxley
  • IT Infrastructure Library (ITIL) and asset management
  • Domain and business modeling
  • Implementing functional layer mapping
  • Service-oriented architecture
  • Model-driven architecture
  • Data architecture (Oh yeah, information architecture as well)
  • Enterprise architecture
  • IT portfolio management

Are you ready? Do you think you will be able to add value with a physical-database-oriented implementation or a small-distance application? Where will your organization be when accounting changes the definition of an asset and allows IT technical assets to be logged on the corporate books? The one thing we know for certain is that the world of IT is changing and being reviewed with a different eye than in years past (hence, "Does IT Matter?" by Nicholas Carr). When our organization needed a data architecture, we were ready. When our organization needed an XML architecture, we were ready. When the CIO asked when we would have a Web service catalog, we were ready. When the organization implemented an enterprise reuse program, we were ready. Will you be ready when your organization calls?
