Every bank, asset manager, insurance company and other financial services institution has been compelled by a variety of recent regulations to change the way it operates. These organizations have also witnessed exponential growth in the volume of data they collect from a wide range of sources.

The regulatory landscape surrounding banks and financial institutions has created a dichotomy that these organizations need to reconcile. On the one hand, these institutions must manage their data practices in compliance with regulatory stipulations. On the other hand, the same institutions must leverage data to glean insights that help them deliver new products and services to the market, understand customers better, reduce operational costs, improve risk management and achieve top-line growth.

In short, the tradeoffs these institutions are compelled to make between data insights and data governance affect major business areas.

Financial institutions and regulators have long realized that data quality is central to achieving full transparency into markets and banking operations. The Dodd-Frank Act established rules to closely monitor and improve the quality of data on derivative asset classes. Likewise, the Basel Committee on Banking Supervision articulated the importance of data quality and sound data governance in its BCBS 239 standard.

Data quality – the driver of organizational success

Within the scope of the current regulations, several banks are implementing data governance to manage data with the ultimate objective of producing regulatory reports in a compliant and timely manner. The next challenge facing these institutions and regulators is to leverage data as a strategic asset by deriving insights through analytics at the macro level (understanding markets and managing groups of banks and their subsidiaries) and the micro level (understanding a specific product).

At a high level, strong data governance processes with sound validations at the time of ingestion, a well-designed enterprise data model and metadata management are important to produce good quality data. Historically, data transparency and quality have been instrumental in organizational success. Data quality not only directly impacts the success of major digital transformation initiatives launched by financial institutions, but it also plays a crucial role in their business agility and productivity.
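As a rough illustration of validation at the point of ingestion, the sketch below applies a handful of declarative rules to each incoming record; the field names, allowed values and thresholds are hypothetical and would normally be derived from the enterprise data model.

```python
# A minimal sketch of ingestion-time validation rules; field names, allowed values
# and thresholds are hypothetical.
from datetime import date

VALIDATION_RULES = {
    "trade_id":   lambda v: isinstance(v, str) and len(v) > 0,          # identifier must be present
    "notional":   lambda v: isinstance(v, (int, float)) and v > 0,      # amount must be positive
    "currency":   lambda v: v in {"EUR", "USD", "GBP", "CHF"},          # restricted to known codes
    "trade_date": lambda v: isinstance(v, date) and v <= date.today(),  # no future-dated trades
}

def validate_record(record: dict) -> list:
    """Return the fields of a record that fail their validation rule."""
    return [field for field, rule in VALIDATION_RULES.items() if not rule(record.get(field))]

record = {"trade_id": "T-1001", "notional": 5_000_000, "currency": "EUR", "trade_date": date(2020, 3, 2)}
print(validate_record(record))  # [] -> the record passes every ingestion check
```

Records that fail such checks can be quarantined and routed back to the owning business process rather than silently propagating downstream.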

Furthermore, poor data quality attracts penalties and fines and erodes reputation and trust. The basic requirement for any of these initiatives to succeed is high-quality, trustworthy data, so that market trends and risks can be quickly identified and properly managed, opportunities exploited and risks mitigated or circumvented.

Maintaining good data quality is an ongoing activity that requires diligent application of processes and best practices by both business and IT teams. To maximize the investments made in data quality management, the first step is to develop a thorough understanding of the consequences of poor data quality and how data quality affects the achievement of business objectives.

Gaps in data quality should be identified and resolved to harvest the intended benefits. Data quality management approaches should also attempt to quantify, in monetary terms, the direct and indirect costs of poor data quality. These measures help determine how to resolve data inconsistencies and secure executive support for data quality initiatives.

Start with business processes

Before embarking on large-scale data quality programs, an assessment should be performed to identify the business areas most impacted by data quality. Following this evaluation, a high-level plan should be developed and the effectiveness of a data quality remediation program assessed.

Data quality does not operate in a vacuum; for a lasting impact, a ‘Quality Mindset’ is mandatory.

The speed of cultural change in achieving this mindset depends on executive support to a large extent. Furthermore, documented processes to identify and remedy quality issues at source and a scalable governance framework that clearly articulates the necessary roles and responsibilities are indispensable.

Misaligned or poorly designed processes are usually the sources of data quality issues, and the problem is compounded when such processes are automated. These processes not only rely on the underlying data for smooth execution, but also produce data of their own.

An understanding of the business processes and the outcomes these activities influence is required. Hence, it is important to baseline, measure and prioritize urgent (short-term), emergent (mid-term) and evolutionary (long-term) data issues in the context of business processes.

Data quality assessment process

A data quality assessment should be initiated before the start of any quality improvement process. Such assessments should be conducted periodically, rather than as a one-off effort, to prevent data quality issues from creeping in or diffusing through business processes.

Depending on the complexity involved, such assessments can be manual or automated; automation helps achieve scale and cost economies and provides a holistic snapshot of data quality across the organization. Preliminary assessments establish the baseline across quality dimensions, while subsequent assessments should be aligned with specific business processes to enhance relevance and support ongoing issue detection and improvement.

Each data dimension should be assessed using qualitative and quantitative methods.

Qualitative methods are based on subjective evaluation criteria derived from business needs; these assessments should be performed by subject matter experts or experienced professionals.

Quantitative methods are largely based on numeric data; these evaluation results are more objective and systematic. These assessments should also quantify the associated costs (direct and indirect), impacts (subjective and objective), risks (qualitative and quantitative), benefits to business operations and other relevant parameters to determine what proportion of the investments in data management programs should be directed towards data quality management.
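By way of illustration only, the sketch below computes simple 0-1 scores for a few common quality dimensions (completeness, uniqueness and validity) on a small tabular data set; the column names and the validity rule are assumptions made for the example.

```python
# Sketch of quantitative scoring for a few common quality dimensions (illustrative only).
import pandas as pd

def quality_scores(df: pd.DataFrame, key_column: str) -> dict:
    """Return simple 0-1 scores for completeness, uniqueness and validity."""
    completeness = 1 - df.isna().mean().mean()        # share of non-null cells
    uniqueness = df[key_column].nunique() / len(df)   # duplicate keys lower the score
    validity = (df["notional"] > 0).mean()            # example rule: notionals must be positive
    return {"completeness": round(completeness, 3),
            "uniqueness": round(uniqueness, 3),
            "validity": round(validity, 3)}

trades = pd.DataFrame({
    "trade_id": ["T1", "T2", "T2", "T4"],
    "notional": [1_000_000, None, 250_000, -5_000],
})
print(quality_scores(trades, key_column="trade_id"))
# e.g. {'completeness': 0.875, 'uniqueness': 0.75, 'validity': 0.5}
```

Scores of this kind can then be weighted by business impact to decide how much of the data management budget data quality work should receive.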

Broadly speaking, a data quality assessment should address the following areas:

1. Impact analysis

2. Quality profiling

3. Anomaly reviews

4. Assessment reports

Impact analysis (BIA)

The objective of impact analysis is to uncover any negative business impacts that are directly caused by unacceptable data quality. In situations where business outcomes are highly reliant on data quality, a quantitative assessment using a few diligently chosen quality metrics will highlight the areas requiring further attention.

For critical business processes that are negatively impacted by data, a root cause analysis (RCA) should be performed to identify the problem hotspots. Diligence and expert judgement are required when adopting a quantitative approach to data quality, especially to ensure that correlation is not mistaken for causation.

Process modeling and simulations can help pinpoint, in a controlled environment, the sources of quality issues that directly increase costs, reduce revenues, erode profit margins or introduce inefficiencies. An inventory of critical data elements (CDEs) can help scope the assessment activities and narrow down the list of data assets that warrant further examination.
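A CDE inventory can be as simple as a list that ties each element to the business process it feeds, together with an estimate of how often it is defective and how severe the business impact is; the elements and scores below are entirely illustrative and would in practice come out of the impact analysis workshops.

```python
# Hypothetical critical-data-element (CDE) inventory used to scope an impact analysis.
cde_inventory = [
    {"element": "counterparty_lei", "process": "regulatory reporting", "defect_rate": 0.04, "impact": 9},
    {"element": "trade_notional",   "process": "risk aggregation",     "defect_rate": 0.01, "impact": 8},
    {"element": "customer_email",   "process": "marketing",            "defect_rate": 0.10, "impact": 3},
]

# Rank hotspots by a simple priority score: likelihood of bad data times business impact.
for cde in sorted(cde_inventory, key=lambda c: c["defect_rate"] * c["impact"], reverse=True):
    print(f"{cde['element']:18} {cde['process']:22} priority={cde['defect_rate'] * cde['impact']:.2f}")
```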

Quality profiling

Data quality profiling techniques help discover and characterize important features and attributes of data assets. Quality profiling provides a snapshot of data architecture, data structures, content, rules and relationships by applying statistical methodologies.

The profiling activity entails performing a bottom-up review of the data sets to identify anomalies that may represent real data flaws. Any obvious anomalies or outliers should be subject to further scrutiny in collaboration with business users.
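A minimal column-profiling sketch along these lines is shown below, assuming the data sits in a pandas DataFrame; the summary statistics (null rate, distinct count, value range) are typical profiling outputs, and the sample data is invented.

```python
# Minimal column-profiling sketch: per-column statistics used to spot obvious anomalies.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Summarise each column: type, null rate, distinct values and value range."""
    rows = []
    for col in df.columns:
        s = df[col]
        rows.append({
            "column": col,
            "dtype": str(s.dtype),
            "null_rate": round(s.isna().mean(), 3),
            "distinct": s.nunique(),
            "min": s.min() if pd.api.types.is_numeric_dtype(s) else None,
            "max": s.max() if pd.api.types.is_numeric_dtype(s) else None,
        })
    return pd.DataFrame(rows)

trades = pd.DataFrame({"notional": [1e6, 2.5e5, -5e3, None], "currency": ["EUR", "EUR", "XXX", "USD"]})
print(profile(trades))  # the negative notional and the 'XXX' currency stand out for review
```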

Anomaly reviews

Not all data anomalies are data defects, though data anomalies might signal data defects. Anomalies usually result from good data being used in poorly designed processes or in a context completely different from the intended use.

During the anomaly review process, analysts review the anomalies to understand the relationships between the data errors and the business processes that triggered them. By segregating the anomalies that have a material impact, a backlog of prioritized issues should be created and mitigating strategies developed and implemented.

Statistical approaches, such as regression analysis, and non-parametric methods, such as kernel functions and histograms, can also be used to detect anomalies. A scorecard that presents the metric scores of the various data dimensions over a period of time will also enable trend analysis and highlight ongoing data quality improvement.
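As one rough example of a non-parametric check in this spirit, the sketch below flags values that fall into sparsely populated histogram bins; the data is synthetic and the sparsity threshold is an arbitrary choice for illustration.

```python
# Illustrative non-parametric anomaly check: flag values falling into sparsely populated
# histogram bins (a rough stand-in for the kernel/histogram methods mentioned above).
import numpy as np

rng = np.random.default_rng(42)
notionals = np.concatenate([rng.normal(1_000_000, 50_000, 500), [4_000_000, -20_000]])  # two injected outliers

counts, edges = np.histogram(notionals, bins=30)
bin_index = np.clip(np.digitize(notionals, edges) - 1, 0, len(counts) - 1)
sparse = counts[bin_index] <= 2   # bins holding at most two observations are treated as suspicious
print(notionals[sparse])          # surfaces the injected outliers (and any other thin region) for review
```

Flagged values are candidates for the anomaly review described above, not automatic defects.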

Assessment reports

Quality assessment reporting is the process of preparing and disseminating reports that convey information about data quality to the relevant stakeholders. A quality report describes the main quality characteristics, so that its readers can easily understand data quality and judge the data's fitness for purpose.
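Such a report is often backed by a simple scorecard of dimension-level scores per period, along the lines of the sketch below; the figures and the unweighted composite score are made up for illustration.

```python
# Sketch of a period-over-period quality scorecard feeding the assessment report (figures invented).
import pandas as pd

scorecard = pd.DataFrame(
    {"completeness": [0.91, 0.93, 0.95],
     "validity":     [0.88, 0.90, 0.94],
     "uniqueness":   [0.97, 0.97, 0.98]},
    index=pd.period_range("2018Q1", periods=3, freq="Q"),
)
scorecard["overall"] = scorecard.mean(axis=1)   # simple unweighted composite score
print(scorecard.diff().iloc[-1])                # quarter-on-quarter trend for the report
```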

Conclusion

Business goals should trigger data collection and quality improvement initiatives. Data is useless without the context in which it operates; business processes provide the context that’s vital to understanding how data is leveraged. If data quality activities, such as data cleansing, validation and enrichment are undertaken without understanding the business implication or processes they impact, these activities are doomed to be ineffective.

If upstream data-driven business processes are not accounted for, the data quality improvement activities undertaken downstream are certain to be degraded by the same processes that triggered the quality issues in the first place. Therefore, it is important to understand the data's origin, that is, how it was created, sourced, stored and updated, along with the underlying business processes.

Just as data governance is an important activity, business processes should, likewise, be governed.

Mithun Sridharan

Mithun Sridharan is a manager at Sapient Consulting based in Germany, where he leads data management and analytics programs with major financial services institutions across Europe.