Breaking the Bad Data Bottlenecks
Across industries, organizations face increasing pressure to do more with less while coping with fast-paced, constrained and demanding business environments. This casts a spotlight squarely on leveraging existing information assets and investments as building blocks for overall business and IT agility. Greater collaboration demands among business partners, the constrained economic environment, today's technology convergence and business agility requirements all raise expectations that businesses will manage and use integrated data more effectively and efficiently across the enterprise. Leading organizations have begun to address many of these data issues in response to strategic or tactical challenges, and much has already been written and discussed about common approaches to data management and architecture. However, many organizations struggle to realize the promised value of delivery and integration because of issues related to business alignment, broader business relevance, technology proliferation, evolving technology maturity and sustaining momentum. Common data-centric or technology-centric approaches have yielded less than desired outcomes in most cases, and many companies are still searching for the right answers to problems that span the entire enterprise. The complexity surrounding unstructured data and the gradual shift of application infrastructure to the cloud are the next big frontier awaiting IT managers and CIOs. Many early adopters are quickly approaching a point where a data strategy that looked solid a few years ago requires a revamp, or at least a significant refresh.
Many organizations have discovered that their key business initiatives are hitting a common data management bottleneck in the form of poor data integration and poor quality information. There are several root causes of bad data, many of which stem from the growth and complexity that characterize leading enterprises today:
• Multiple and overlapping applications (some of which hold the same data);
• Lack of business data ownership for in-process data quality maintenance;
• Growing size and complexity across the applications landscape;
• Exchanging data with external vendors, business partners and customers;
• Mergers and acquisitions among both customers and IT vendors;
• Business unit-centric versus process-centric views that lead to fragmentation;
• No clear “master” or authoritative source of core data; and
• Lack of enterprise data services and a mechanism to enable operational data integration.
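The fragmentation these causes produce can be made concrete with a toy example. The systems, fields and values below are hypothetical; the point is simply that when overlapping applications each capture the same entity with no authoritative source, every consumer is left guessing which version to trust:

```python
# Toy illustration (hypothetical systems and fields): the same customer as
# captured by two overlapping applications, with no "master" source to
# arbitrate between them.
crm_record = {"name": "Acme Corp.", "address": "12 Main St", "phone": "555-0100"}
billing_record = {"name": "ACME Corporation", "address": "12 Main Street", "phone": None}

def conflicting_fields(a, b):
    """Return the fields on which two records of the same entity disagree."""
    return sorted(k for k in a.keys() & b.keys() if a[k] != b[k])

print(conflicting_fields(crm_record, billing_record))
# Every shared field disagrees -- which value should sales, service or
# finance rely on?
```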
While the root causes of fragmentation and deteriorating data quality are easy to understand, their impact on a company’s ability to sell, service and market effectively and to manage supply chain operations is severe. The consequences include:
• Wasted effort and errors due to manual, repetitive data entry;
• Lower customer satisfaction because call center and service personnel do not have the information they need;
• Business partner frustration and competitive disadvantage due to the inability to manage customized offerings in distribution channels;
• Unrealized cost reduction opportunities because the lack of accurate global spend analysis leads to uninformed vendor negotiations;
• Unrealized revenue opportunities as a result of marketers’ inability to segment, analyze and target customers accurately; and
• Increased compliance and legal risk.
With bad data and inaccurate information affecting so many areas of a business in so many ways, it stands to reason that companies must address bad data bottlenecks if they are to achieve a breakthrough in profitability or operational efficiency. Many companies have started to implement data management solutions that shift reliance from offline data warehouse systems to master data management (MDM) and data services that support operational as well as analytical integration. Such solutions have typically taken the form of single-domain master data hubs and integration layers tasked with acquiring and syndicating specific information to different applications. Together, these solution components are supposed to manage data quality and control master data distribution across the enterprise. For years, companies have invested heavily in MDM, data quality and enterprise application integration implementations and data warehouses in an attempt to attain reliable visibility, process integration and a single version of truth. But most have still not fully solved their data issues, and data proliferation remains rampant.
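To ground what a single-domain master data hub actually does, here is a minimal sketch of its core match-and-merge (survivorship) step. The match key, source precedence and field names are illustrative assumptions for this article, not any real product's logic:

```python
# Minimal sketch of an MDM hub's match-and-merge ("survivorship") step.
# The matching rule and source-precedence policy below are illustrative
# assumptions, not a real product's implementation.
def match_key(name):
    """Crude match key: lowercase, strip punctuation and legal suffixes."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch == " ")
    return " ".join(w for w in cleaned.split() if w not in {"corp", "corporation", "inc"})

def merge(records, source_precedence):
    """Survivorship: for each field, keep the first non-empty value from
    the most trusted source."""
    golden = {}
    # Visit records from most to least trusted source.
    for rec in sorted(records, key=lambda r: source_precedence.index(r["source"])):
        for field, value in rec.items():
            if field != "source" and value and field not in golden:
                golden[field] = value
    return golden

records = [
    {"source": "billing", "name": "ACME Corporation", "phone": None},
    {"source": "crm", "name": "Acme Corp.", "phone": "555-0100", "address": "12 Main St"},
]
# Both records share the match key, so they collapse into one golden record:
# billing wins on name; crm supplies the fields billing lacks.
assert match_key(records[0]["name"]) == match_key(records[1]["name"]) == "acme"
print(merge(records, source_precedence=["billing", "crm"]))
```

The interesting design decision is entirely in the policies: how aggressively to match, and which source to trust per field. Those choices, not the plumbing, are what the governance discussions later in this article are about.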
An Integrated Single Version of Truth Remains Elusive
Many companies today face a collection of problems that lead to unrealized ROI or stalled data programs:
• Divisional or BU-specific data management platforms that are unable to scale enterprise-wide;
• Inability to mobilize active data governance beyond pilot attempts;
• Difficulty in achieving enterprise data integration;
• Difficulty making decisions amid a quickly changing vendor technology landscape and tools that are still gradually maturing;
• Architectural approaches and options that have not yet fully matured;
• A single-domain to multidomain journey that isn’t clear; and
• Business adoption and traction that fizzles out after an initial victory.
As a result, early adopters are revisiting their data strategies and rethinking some of the common mantras of data program undertakings. The goal remains the same as depicted in the diagram above: to support global end-to-end process integration and analytics in a way that keeps all key data and content objects complete, accurate, timely and available. This includes every touch point where data supports processes or insights, whatever application it resides in and however it is used to drive business transactions or key decisions. By providing on-demand, loosely coupled, semantically integrated and traceable access to cleansed, current and comprehensive data, an overall information architecture can remove the constraints on your key initiatives without forcing your entire company onto a single application and set of business logic. The benefits of this end state are profound from a business and IT transformation standpoint, but the journey remains unique for each company.
Having collaborated with many leading vendors and worked with executives of many large companies, I have distilled some hard-learned execution themes into key considerations you may find helpful:
Think process first, technology later. Take a big-picture view of the solution itself. Start with the business processes, data lifecycle mappings and supporting data models and focus on making them work end-to-end with cross-functional participation. Only then should you move on to the technical components and their design. If you don’t seamlessly integrate data strategy into overall business process engineering and tie benefits to process key performance indicator improvements, the business will not adopt the solution, regardless of executive sponsorship, business case or architectural validity.
Consider the politics of data. Don’t ignore the political reality of information and data. A lot of people within your organization “own” data as they progress through their capture and consumption of information. Everyone would like to own the definitions, but not the data quality. Regardless of top-down sponsorship, do not expect a data governance program to click if it cannot enforce accountability at or near capture with controls that can tangibly improve the efficiency or effectiveness of a process area or function. There is no cookie-cutter approach to address specific cultural and organizational dynamics – choose your battles where there is immediate bang for the buck with the right change agents willing to evangelize tradeoffs for the common enterprise good.
Follow a transformation agenda for success. Focus on business enablement and visible value delivery, not just data or technology remediation. Data programs closely attached to significant business or IT transformation programs have much greater chances of success than those without such leverage. Replacing or overhauling the underpinnings of your enterprise applications requires more than a sound architectural approach or an appreciation of data-related problems: on its own, the broader disruption of the operational landscape is simply too difficult and too painful to drive the required change in process, architecture and applications across the board. Tied directly and tangibly to the value statement of your key strategic enterprise initiatives, however, data programs can gather change momentum and an opportunity to effect the required overhaul.
Understand the different approaches to data architecture. It is important to recognize that an MDM or data management architecture is not just an implementation of some central data hub. Rather, it is a new, disciplined approach to data management that puts business process optimization and in-process governance first while taking advantage of a new class of architectures and data management services. New concepts and buzzwords for different approaches abound, but it simply boils down to what’s relevant for your industry, the specific set of challenges you intend to address and how effectively you can leverage your enterprise standards and platforms to enable the required services.
Finally, don’t overlook quickly evolving next-generation needs. Just when you thought you had an answer for tackling structured data issues, many of you are probably now staring at the next frontiers of unstructured and ill-managed content such as emails, wikis, social computing, blogs, chats and proliferating Web content, as well as game-changing ideas in cloud computing, Web 2.0 and mobile solutions. Technology consolidation, changing infrastructure models and information dynamics make it imperative for companies to revisit and refresh their information strategies to stay current and relevant. Many of the subplots, themes and considerations touched on here highlight the new dynamics of data management; I will explore some of them in more detail throughout the rest of this series.