BCBS 239 compliance D-Day – Data Quality Risk Checklist

It’s that time of year again, when Santa Claus, the original Data Quality Risk Manager, makes his list and checks it twice.

Risk Signpost

BCBS 239 requires Data Quality Risk to be included in a bank’s overall Risk Framework.

For the largest 30 banks in the world (known as G-SIBs), teams of experts are making final preparations ahead of the BCBS 239 compliance D-Day, which is 1st Jan 2016.

Based on the BCBS 239 document, I’ve put together a “Data Quality Risk Checklist” that the bank’s board and senior management should sign off, after checking twice:

  1. We have updated our Risk Management Framework to include the identification, assessment and management of data quality risks
  2. We consider risk data accuracy requirements analogous to accounting materiality
  3. Our data quality risk controls surrounding risk data are as robust as those we apply to accounting data
  4. We reconcile our risk data with our sources, including our accounting data sources where appropriate, to ensure that our risk data is accurate (a minimal illustration of such a reconciliation follows this list)
  5. We’ve established data taxonomies across the banking group, which includes information on the characteristics of the data (metadata), as well as use of single identifiers and/or unified naming conventions for data including legal entities, counterparties, customers and accounts
  6. We have defined our data consistently across our organisation and we hold the concepts we use and our data definitions in a “dictionary”
  7. We’ve established roles and responsibilities as they relate to the ownership and quality of risk data and information
  8. Our business owners ensure that data is correctly entered by the relevant front office unit (at source), kept current and aligned with the data definitions
  9. We measure and monitor the completeness, accuracy, timeliness and integrity of all material risk data and we have appropriate escalation channels and action plans in place to rectify poor data quality
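
To make items 4 and 9 a little more concrete, the snippet below is a minimal, purely illustrative sketch (written in Python with pandas, which I am assuming only for illustration; the table names, column names and the 1% tolerance are my own hypothetical examples, not anything BCBS 239 prescribes) of reconciling risk exposure totals against an accounting source within a materiality-style tolerance.

```python
# Illustrative only: reconcile risk data totals against an accounting source.
# All table/column names and the tolerance are hypothetical examples.
import pandas as pd

# Hypothetical extracts: exposures as held in the risk system and in accounting.
risk_data = pd.DataFrame({
    "legal_entity": ["LE1", "LE1", "LE2"],
    "exposure":     [100.0, 250.0, 75.0],
})
accounting_data = pd.DataFrame({
    "legal_entity": ["LE1", "LE2"],
    "balance":      [351.0, 80.0],
})

MATERIALITY_THRESHOLD = 0.01  # 1% relative difference, analogous to accounting materiality

risk_totals = risk_data.groupby("legal_entity")["exposure"].sum()
acct_totals = accounting_data.set_index("legal_entity")["balance"]

recon = pd.DataFrame({"risk_total": risk_totals, "accounting_total": acct_totals})
recon["relative_diff"] = (recon["risk_total"] - recon["accounting_total"]).abs() / recon["accounting_total"]
recon["within_tolerance"] = recon["relative_diff"] <= MATERIALITY_THRESHOLD

# Breaks outside tolerance would be escalated via the agreed channels (checklist item 9).
breaks = recon[~recon["within_tolerance"]]
print(recon)
print(f"{len(breaks)} reconciliation break(s) to escalate")
```

In practice the reconciliation would run against the bank’s actual risk and accounting stores, with tolerances agreed by the business, but the principle is the same: measure the difference, compare it to an agreed threshold, and escalate anything outside it.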

BCBS 239 is a paradigm shift in Data Quality Risk management thinking.

Note: Major banks outside the top 30 in the world (known as D-SIBs) have a little more breathing space. They will be required to comply with BCBS 239 within three years of being designated as a D-SIB by their national supervisor. They have the opportunity to learn from the experience of the first wave.

Data is the new oil – what grade is yours?

Bill Bryson’s book “One Summer: America 1927” provides a fascinating insight into the world of aviation in the Roaring Twenties. Aviators were vying to be the first to cross the Atlantic from New York to Paris, a challenge that took many lives, most of them European.

Bryson tells us: “The American flyers also had an advantage over their European counterparts that nobody yet understood. They all used aviation fuel from California, which burned more cleanly and gave better mileage. No one knew what made it superior because no one yet understood octane ratings – that would not come until the 1930s – but it was what got most American planes across the ocean while others were lost at sea.”

Once octane ratings were understood, fuel quality was measured and lives were saved.

We’ve all heard that data is the new oil. To benefit from this “new oil”, you must ensure you use “top grade” only. It can make the difference between business success and failure. It is also a prerequisite for regulatory compliance (GDPR, Solvency II, FATCA, Dodd Frank, Basel III, BCBS 239, etc.). Thankfully, like octane ratings, we know how to measure data quality using six primary dimensions: completeness, validity, accuracy, uniqueness, timeliness and consistency. For more details see my post: Major step forward in Data Quality Measurement.
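
As a rough illustration of how such measurement might be automated, the sketch below (Python with pandas; the customer columns, rules and reporting date are my own assumptions, not a standard) scores a small dataset against four of the six dimensions. Accuracy and consistency usually need a trusted reference source to compare against, so they are only noted in a comment.

```python
# Illustrative sketch: scoring a hypothetical customer extract against some of the
# six primary data quality dimensions. Column names and rules are invented examples.
import re
import pandas as pd

customers = pd.DataFrame({
    "customer_id":  [1, 2, 2, 4],
    "email":        ["a@example.com", None, "not-an-email", "d@example.com"],
    "last_updated": pd.to_datetime(["2015-11-01", "2015-06-30", "2015-12-01", "2014-01-15"]),
})

AS_OF = pd.Timestamp("2015-12-31")
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

metrics = {
    # Completeness: proportion of email values that are populated.
    "completeness (email)": customers["email"].notna().mean(),
    # Validity: proportion of populated emails matching a simple format rule.
    "validity (email)": customers["email"].dropna().apply(lambda e: bool(EMAIL_PATTERN.match(e))).mean(),
    # Uniqueness: proportion of customer_id values that are not duplicates.
    "uniqueness (customer_id)": 1 - customers["customer_id"].duplicated().mean(),
    # Timeliness: proportion of records updated within the last 12 months.
    "timeliness (last_updated)": (AS_OF - customers["last_updated"] <= pd.Timedelta(days=365)).mean(),
    # Accuracy and consistency would be measured against a trusted reference source
    # (e.g. by reconciliation with accounting data), which is not shown here.
}

for dimension, score in metrics.items():
    print(f"{dimension}: {score:.0%}")
```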

I also explore this topic in my post Russian Gas Pipe and Data Governance.

What happens in your organisation? Do you measure the quality of your most critical data, or do you fly on a wing and a prayer? Please add your comments below.

Do you know what’s in the data you’re consuming?

Standard facts are provided about the food we buy

These days, food packaging includes ingredients and a standard set of nutrition facts.  This is required by law in many countries.

Food consumers have grown accustomed to seeing this information, and now expect it. It enables them to make informed decisions about the food they buy, based on a standard set of facts.

Remarkable as it may seem, data consumers are seldom provided with facts about the data feeding their critical business processes.

Most data consumers assume the data input to their business processes is “right”, or “OK”.  They often assume it is the job of the IT function to ensure the data is “right”.  But only the data consumer knows the intended purpose for which they require the data.  Only the data consumer can decide whether the data available satisfies their specific needs and their specific acceptance criteria. To make an informed choice, data consumers need to be provided with facts about the data content available.

Data Consumers have the right to make informed decisions based on standard data content facts

The IT function, or a data quality function, can and should provide standard “data content facts” about all critical data, such as the facts shown in the example.

In the sample shown, a Marketing Manager wishing to mailshot customers in the 40-59 age range might find that the data content facts satisfy his/her data quality acceptance criteria.
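
To make the idea tangible, here is a rough sketch of generating such a label (Python with pandas; the customer table and the particular facts chosen are my own illustrative assumptions, there is no standard label format) that a Marketing Manager, or indeed an AML manager, could check against their own acceptance criteria.

```python
# Illustrative sketch: a simple "data content facts" label for a hypothetical
# customer table. The columns and the facts chosen are examples only.
import pandas as pd

customers = pd.DataFrame({
    "customer_id":   [101, 102, 103, 104, 105],
    "date_of_birth": pd.to_datetime(["1970-05-02", None, "1958-09-17", "1990-01-23", None]),
    "postcode":      ["D02 X285", None, "D04 T283", "D06 W102", "D08 A123"],
})

AS_OF = pd.Timestamp("2015-12-31")
age_years = (AS_OF - customers["date_of_birth"]).dt.days / 365.25

facts = {
    "records": len(customers),
    "date_of_birth populated": f"{customers['date_of_birth'].notna().mean():.0%}",
    "postcode populated": f"{customers['postcode'].notna().mean():.0%}",
    "aged 40-59 (where date of birth is known)": f"{age_years.dropna().between(40, 59).mean():.0%}",
}

print("DATA CONTENT FACTS")
for fact, value in facts.items():
    print(f"  {fact}: {value}")
```

A marketing mailshot might tolerate a date of birth that is only 60% populated; an AML system load almost certainly would not. The facts don’t make the decision, they let the data consumer make it.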

The same data might not satisfy the acceptance criteria for a manager in the Anti Money Laundering (AML) area requesting an ETL process to populate a new AML system.

Increasing regulation means that organisations must be able to demonstrate the quality and trace the origin of the data they use in critical business processes.

In Europe, Solvency II requires insurance and re-insurance undertakings to demonstrate that the data they use for solvency calculations is as complete, appropriate and accurate as required for the intended purpose. Other regulatory requirements such as Dodd Frank in the USA, Basel III and BCBS 239 are also seeking increasing transparency regarding the quality of data underpinning our financial system.

While regulation may be a strong driving force for providing standard data content facts, an even stronger one is the business benefit to be gained from being informed. Some time ago, Gartner research showed that approximately 70% of CRM projects failed. I wonder whether the business owners of the proposed CRM systems were shown data content facts about the data available to populate them?

In years to come, we will look back on those crazy days when data consumers were not shown data content facts about the data they were consuming.

Process for assessing status of common Enterprise-Wide Data Governance Issues

If you work with data in large enterprises, you will be aware that the data, and the ability of the business to access that data, are seldom as “good” as they should be. But just how “good” or “bad” is it?

This post outlines a process for assessing the status of common Enterprise-Wide Data Governance  issues within your enterprise, or that of a client.  I use it as the basis for my “Data Governance Health Check”.

These issues can impact your ability to deliver the underlying data required for meaningful CRM, Business Intelligence, etc. More seriously, they can impact your ability to satisfy regulatory compliance demands (e.g. GDPR, BCBS 239, Solvency II, Anti Money Laundering, Basel II, etc.) in a timely, cost-effective manner.

Do issues like these affect your enterprise?  If not, how have you resolved or prevented them?  Please share your experience by posting a comment.

Common Enterprise-wide data governance issues:

1. Quality of informational data is not as high as desired
2. Quality of data entered by front-end staff is not as high as desired
3. No culture of Data as an ‘asset’ or ‘resource’
4. No clear ownership of data
5. Business Management don’t understand what “Data Quality” means
6. No Enterprise Wide Data Quality Measurement of Data Content
7. No SLAs defined for the required quality level of critical data
8. Accessibility of data is poor
9. Data Migration and ETL projects are not Metadata driven
10. No Master repository of Business Rules
11. No ownership of Cross Business Unit Business Rules
12. No Enterprise Wide Data Dictionary
13. Islands of Data
14. No Enterprise Wide Data Model

Explanation of the scale and the process for using it:

There are 6 levels on the scale, starting at level 1 and increasing to level 6. The higher the score, the better prepared the organisation is to deal with the issue. The worst case scenario is actually a score of ZERO, which means that management in the enterprise is not even aware that the issue exists. To assess the actual status of an issue, ask for documentary evidence to illustrate that the enterprise has actually reached that level:

Figure 1: Status of a (data governance) issue.

1. Aware – Senior Management is aware that the issue exists. e.g. Data Quality is not measured, or is measured in an ad-hoc manner. Evidence: Captured in an Issues Log or Requirements document.
2. Understands – Senior Management fully understands the issue, the impact of not addressing it, and the options available to address it, complete with the pros and cons of each option. e.g. An issue paper explains the impact of having no Data Quality metrics on downstream data-dependent projects. Evidence: Issue Paper, Rationale paper or Point of View paper(s).
3. Policy defined – Senior Management has a clearly stated policy/strategy identifying the selected option. e.g. Data Quality Measurement must be performed by each Business Unit, using a standard Enterprise Wide Data Quality Measurement process…. Evidence: Policy document / Design Principles / Communications / education material.
4. Process defined – The organisation has a clearly defined process detailing exactly how the policy/strategy will be implemented, which common services/utilities must be used, and exactly how to use them. e.g. The standard Enterprise Wide Data Quality Measurement process will use ‘off the shelf tool X’ to produce a standard set of Data Quality metrics…. Each BU must train N staff in the use of the tool. Training will take place…… Evidence: End To End Process documentation / Education and Training material.
5. Infrastructure in place – Infrastructure (systems / common services / utilities) needed to implement the process is in place. e.g. ‘Off the shelf tool X’ has been licenced and installed Enterprise Wide. Staff have been trained… Pilots have been run… Evidence: Programme Infrastructure document / Utility user manuals.
6. Governance in place – Governance is in place to ensure that the defined policy is implemented in accordance with the defined process. e.g. The stakeholders are… The Data Steering Group includes the CIO and…. The reporting process is….. The following controls are in place…. Evidence: Programme Governance document / Education / completed sign-offs.
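
The assessment itself can be captured very simply. The sketch below (Python; the issues come from the list above, while the scores and evidence notes are invented purely for illustration) records a 0–6 score per issue and flags the ones needing immediate attention.

```python
# Illustrative sketch of recording a Data Governance Health Check.
# Scores follow the 0-6 scale described above (0 = not even aware of the issue,
# 6 = governance in place). Example scores and evidence notes are invented.

SCALE = {
    0: "Not aware",
    1: "Aware",
    2: "Understands",
    3: "Policy defined",
    4: "Process defined",
    5: "Infrastructure in place",
    6: "Governance in place",
}

# (issue, score, documentary evidence seen)
assessment = [
    ("Quality of informational data is not as high as desired", 2, "Issue paper"),
    ("No clear ownership of data", 1, "Raised in issues log only"),
    ("No Enterprise Wide Data Dictionary", 0, "No evidence found"),
    ("No SLAs defined for the required quality level of critical data", 3, "DQ policy v1.2"),
]

print("Data Governance Health Check")
for issue, score, evidence in sorted(assessment, key=lambda item: item[1]):
    print(f"  [{score} - {SCALE[score]}] {issue} (evidence: {evidence})")

urgent = [issue for issue, score, _ in assessment if score <= 1]
print(f"\n{len(urgent)} issue(s) at level 1 or below need immediate management attention")
```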

Your experience:
How do you assess Data Governance within your organisation, or that of a client? Please share your experience by posting a comment – Thank you – Ken.