Enterprise data is a corporate asset that can give financial services firms an edge in customer service, brand reputation and regulatory compliance. Master data is spread across the organization. Treasury, settlement, portfolio accounting, compliance, performance measurement, risk management, money markets, fixed income, equities and derivatives departments are some of the business units that store and use master data extensively. When each unit maintains its own silo of master data, the data becomes redundant, inconsistent, expensive and manually intensive to maintain.


Clever management of master data would relieve much of the pain associated with redundant, inconsistent, expensive and manually intensive data. Master data management (MDM) is one such discipline, and it aims to handle data intelligently.


MDM promises to establish a standard body of consistent, company-wide information. This information is a single version of the truth that can be used to meet client requirements and greatly reduce duplication of effort. By replacing myriad decentralized data management solutions with a centralized or singular golden copy, companies can add value, decrease risk, reduce costs and comply with regulation.


However, golden copy initiatives are hindered by the sheer size of the project, conflicting and often confusing requirements, lengthy time frames, escalating costs and a lack of visible benefits. Firms have begun to realize that building and managing a single golden copy is no walk in the park.


A Different Approach to MDM Measurement


Organizations build data silos based on the business needs of various business units without giving much thought to "one architecture" and "one data" concepts. The same data is acquired from different sources, manipulated by the same rules under different names, stored in various silos and published to multiple clients. The number of sources, duplicate or near-duplicate rules, master data silos and publishing feeds are all indicators of data fragmentation and data redundancy.


MDM can be viewed as an initiative to improve the existing data infrastructure of the firm rather than a from-scratch effort to build a standardized data platform. The basic premise of MDM is to reduce redundancy in data feeds, data manipulation rules, data storage and data publishing. So reducing the number of sources, rules, data silos and publishing feeds can be viewed as steps toward successful MDM.


After establishing the yardstick for measuring MDM success, we need to look at the tools available to measure it. We view the Herfindahl-Hirschman Index (HHI) as an apt method for measuring MDM success. According to the U.S. Department of Justice, the HHI:

Is a commonly accepted measure of market concentration. It is calculated by squaring the market share of each firm competing in the market and then summing the resulting numbers. For example, for a market consisting of four firms with shares of 30, 30, 20 and 20 percent, the HHI is 2600 (30² + 30² + 20² + 20² = 2600) ... Markets in which the HHI is below 1000 are considered to be fragmented, those between 1000 and 1800 points are considered to be moderately concentrated and those in which the HHI is in excess of 1800 points are considered to be concentrated.
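
The DOJ calculation above can be sketched in a few lines of Python (the function name `hhi` is ours, not part of any standard library):

```python
def hhi(shares):
    """Herfindahl-Hirschman Index: the sum of squared percentage shares."""
    return sum(s ** 2 for s in shares)

# DOJ example: four firms with shares of 30, 30, 20 and 20 percent
print(hhi([30, 30, 20, 20]))  # 2600
```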

Now we need to interpret the HHI in the context of MDM. Let's say the number of data silos for data storage is 10, with shares of 10, 12, 15, 8, 10, 13, 7, 5, 11 and 9 percent of master data. The HHI for data storage concentration is then 1078. An HHI of 1078 indicates a near-fragmented data storage setup in the firm, just above the 1000-point threshold, which is an indicator of redundancy.

A data consolidation initiative might streamline the data bins, thereby reducing the number of data stores. Assuming the number of silos after this initiative is six, with shares of 15, 20, 16, 15, 20 and 14 percent, the HHI for data storage concentration is 1702. An HHI of 1702 indicates a moderately concentrated (on the higher side) data storage setup, which shows a reduction in data redundancy. Reduced data redundancy leads to consistent, quality data because more attention is paid to fewer data bins.

The HHI can be applied to data feeds, data manipulation rules and data publishing as well, to determine the respective concentrations before and after any new data-related activity. Thus, one can relate the contribution of any data-related project to the MDM initiative as a whole.
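
The before-and-after comparison above can be reproduced with a short script. This is an illustrative sketch: the share figures are the hypothetical ones from the example, and the `classify` thresholds are the DOJ bands quoted earlier.

```python
def hhi(shares):
    """Herfindahl-Hirschman Index: the sum of squared percentage shares."""
    return sum(s ** 2 for s in shares)

def classify(index):
    """Map an HHI value to the DOJ concentration bands."""
    if index < 1000:
        return "fragmented"
    if index <= 1800:
        return "moderately concentrated"
    return "concentrated"

# Ten silos before consolidation, six after (shares in percent)
before = [10, 12, 15, 8, 10, 13, 7, 5, 11, 9]
after = [15, 20, 16, 15, 20, 14]

print(hhi(before), classify(hhi(before)))  # 1078 moderately concentrated
print(hhi(after), classify(hhi(after)))    # 1702 moderately concentrated
```

The rise from 1078 to 1702 quantifies the consolidation: fewer silos, each holding a larger share, means higher concentration and less redundancy.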


Data infrastructure in corporations existed well before the MDM concept came into the picture. So for all its benefits, MDM cannot be thought of as a standalone initiative in which a standard data platform is built from scratch. It is more pragmatic to take the existing data infrastructure and work on it to achieve the goals of MDM. Viewed that way, MDM is a philosophy rather than a data management project.
