Master data about customers, products, employees, suppliers and other information assets is pervasive and fundamental to business success. Historically, master data moved into the foreground through the needs of back-end systems such as data warehouses. Today, however, transactions across every tier of the information architecture - front end, middle and back end - invoke and make use of master data, whether operational or analytic.

Thus, the reasons that master data management (MDM) has percolated up as a corporate priority and an increasingly important trend include:

  • The business gets it. The requirement to rationalize master data resonates with the business. Businesspeople know what customers, prospects, products, and suppliers are. That makes it relatively easy to build a business case for getting a handle on, and reversing, the proliferation of islands of information containing these key business entities.
  • Lack of master data discipline is causing system breakdowns. The fragmentation of master files through the proliferation of enterprise resource planning (ERP) systems, data marts, operational customer relationship management (CRM) and business intelligence (BI) silos has reached critical mass. The resulting lack of consistency is causing a breakdown in functionality - hence the requirement for remedial action.
  • Innovation happens - in this case, around master data. Improvements in processing power (Moore's Law), software design (service-oriented architecture, or SOA) and metadata management make it possible to engage complex problems at data volumes, and levels of heterogeneity, that could not be handled only a few years ago. Meanwhile, new opportunities to drive top-line growth - cross-selling and up-selling to customers, forecasting and demand planning for products - require the design and implementation of a consistent, unified view of customers, products and related master data.

MDM is an area where connecting the dots between the business role and the underlying technology is relatively straightforward. Obviously, customers and products are perfectly understandable to those trying to operate the business. Procurement officers understand the need to consolidate total business by dollar amount with vendors and suppliers to qualify for volume discounts. The sales function understands the usefulness of managing customer relations as a totality of interactions that aggregate across particular point transactions and result in a customer lifetime value (LTV). IT staff understand that the payoff from implementing systems that support master data management is easier to quantify as a business case than relatively technical initiatives such as service-oriented architecture. Savvy IT functions actually use MDM to drag nerdy, albeit critical, initiatives such as SOA through the justification cycle, because SOA can be a critical enabler for MDM.
MDM is on the critical path to initiatives such as ERP and data mart consolidation. If two ERP systems use different customer master files, that data must be rationalized, conformed and integrated before the consolidation can be completed. If two data marts use different product masters, a similar scenario applies: they must be made consistent before the data marts can be functionally merged. In both cases, master data rationalization is on the critical path to application integration and optimization.
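The rationalization step described above can be sketched in a few lines. This is a minimal illustration, not a prescribed method: the field names (`name`, `postal_code`) and the match key are assumptions standing in for whatever conforming rules an actual stewardship process would define.

```python
# Sketch: rationalizing two customer master files before an ERP
# consolidation. Records that conform to the same key are flagged as
# duplicates for review rather than silently overwritten.

def normalize(record):
    """Conform a raw customer record to a shared match key (illustrative)."""
    return (record["name"].strip().lower(), record["postal_code"].strip())

def rationalize(erp_a, erp_b):
    """Merge two customer masters, collecting cross-system duplicates."""
    merged, duplicates = {}, []
    for source, records in (("A", erp_a), ("B", erp_b)):
        for rec in records:
            key = normalize(rec)
            if key in merged:
                # Same customer known to both systems - needs reconciliation.
                duplicates.append((key, source, rec))
            else:
                merged[key] = {**rec, "source": source}
    return merged, duplicates
```

In practice the match key would involve fuzzier logic (address standardization, probabilistic matching), but the critical-path point stands: until this reconciliation is done, the consolidation cannot complete.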

Address master data strategically, not reactively as an afterthought in the data warehouse or an item buried beneath a data mart consolidation project. Master data management enables (indeed, drives) data mart consolidation, not vice versa. In every case, separating master data management from the application logic enables the system architecture to advance from many-to-many point interfaces (see Figure 1) to an optimally economical one-to-many master data hub (see Figure 2).

Figure 1: Many-to-Many Architecture

Figure 2: One-to-Many Master Data Hub Architecture
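The economics behind Figures 1 and 2 are simple to make concrete: point-to-point interfaces grow quadratically with the number of applications, while a hub needs only one interface per application. The counts below assume undirected interfaces (one interface per pair of connected systems).

```python
# Interface counts for the two architectures: every-pair point-to-point
# connections (Figure 1) versus a single hub connection per application
# (Figure 2).

def point_to_point_interfaces(n_apps):
    """Figure 1: each pair of applications maintains its own interface."""
    return n_apps * (n_apps - 1) // 2

def hub_interfaces(n_apps):
    """Figure 2: each application connects once, to the master data hub."""
    return n_apps

for n in (4, 8, 16):
    print(n, point_to_point_interfaces(n), hub_interfaces(n))
```

At 16 applications, that is 120 point interfaces to build and maintain versus 16 - which is why the hub represents the optimal (lowest) number of connections for a given portfolio.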

Success with master data management results from following several basic design principles. These include:

  • Separate ("decouple") basic master data from the application logic. This is justified by the basic design principle of high cohesion, loose coupling. Different applications - order entry, inventory control, sales automation, marketing - all make use of the same processes to access, update and maintain basic master data. The risk of inconsistent reporting on the same information is eliminated because the same methods are used wherever the information is served up. Many-to-many interfaces are replaced with a one-to-many interface that represents the optimal (lowest) number of connections for a given portfolio of applications. A single version of the (master data) truth enables a variety of breakthrough business solutions, including data mart consolidation, ERP consolidation, top-line revenue growth through cross-selling and up-selling, and cost reductions through supply chain optimization.
  • Reinventing the wheel is hard work, and many enterprises have wisely purchased off-the-shelf applications rather than building from scratch. Yet the approach has a dark side that is increasingly visible with the benefit of 20/20 hindsight. While buying an application - whether order entry, operational CRM or sales automation - makes sense when the package furnishes a majority of the required functionality, it also results in a proliferation of master files. Rarely are these files conformed or semantically consistent. The coordination costs of reconciling these instances can be reduced or eliminated by centralizing master data management.
  • Decide which instance is the driver ("master") and which is the driven ("slave"), and stick to it. Avoid peer-to-peer synchronization of multiple instances of master data: it brings complexity, the risk of logical anomalies, and high maintenance and debugging costs. When the flow of master data updates through the information supply chain must be bidirectional - for example, because a department has to be able to upgrade a prospect to a customer in order to complete a sale - define a secure, auditable method for the department ("slave") to request an update from the master. The approval direction must remain one way, even if the data flow is not. This is where a technology such as a message broker can work wonders in squeezing latency out of the workflow process.
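The master/slave request pattern in the last principle can be sketched as follows. The class and method names are illustrative assumptions: the point is only that the slave submits an auditable change request and never writes master data directly, so approval flows one way even though data flows both ways.

```python
# Sketch: one-way approval for master data updates. A department ("slave")
# submits a change request; only the master hub applies approved changes,
# and every step lands in an audit log.

from dataclasses import dataclass, field

@dataclass
class ChangeRequest:
    requester: str   # e.g., the sales department ("slave")
    key: str         # master data key, e.g., a customer id
    updates: dict    # proposed field changes

@dataclass
class MasterHub:
    records: dict = field(default_factory=dict)
    audit_log: list = field(default_factory=list)

    def submit(self, request: ChangeRequest):
        """Slaves may only request changes - they cannot apply them."""
        self.audit_log.append(("submitted", request))
        return request

    def approve(self, request: ChangeRequest):
        """Only the master applies an approved change: approval is one-way."""
        record = self.records.setdefault(request.key, {})
        record.update(request.updates)
        self.audit_log.append(("approved", request))
```

In a production setting the submit/approve hop would typically ride on a message broker, as the article suggests, rather than in-process calls - but the contract is the same: request in one direction, approval in the other.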
