Do you have an undertaking-wide common understanding of data quality? If not – I suggest you read on…
When a serious “data” problem arises in your organisation, how is it discussed? (By “serious”, I mean a data problem that has cost, or could cost, so much money that it has come to the attention of the board.)
What Data Quality KPIs does your board request, or receive, to enable board members to understand the problem with the quality of the data? What data quality controls does your board expect to be in place to ensure that critical data is complete, appropriate and accurate?
If your board has delegated authority to a data governance committee, what is the data governance committee’s understanding of “Data Quality”? Is it shared across your organisation? Do you all speak the same language, and use the same terminology when discussing “Data Quality”? In brief – are you all singing from the same “Data Quality Hymn Sheet”?
Why do I ask?
For the first time, a regulator has stated that organisations must have an “undertaking-wide common understanding of data quality”.
Solvency II requires insurance organisations to demonstrate that the data underpinning their solvency calculations are as complete, appropriate and accurate as possible. The guidance from the regulator goes further than that.
CP 56, paragraph 5.178 states: “Based on the criteria of “accuracy”, “completeness” and “appropriateness”… the undertaking shall further specify its own concept of data quality. Provided that undertaking-wide there is a common understanding of data quality, the undertaking shall also define the abstract concept of data quality in relation to the various types of data in use… The undertaking shall eventually assign to the different data sets specific qualitative and/or quantitative criteria which, if satisfied, qualify them for use in the internal model.”
Business requirements should be clear, measurable and testable. Unfortunately, the SII regulator uses complex language that makes the SII Data Quality Management and Governance requirements woolly, ambiguous and open to interpretation. My interpretation of the guidance is that the regulator will expect you to demonstrate your “undertaking-wide common understanding of data quality”.
What might a common understanding of data quality look like?
Within the Data Quality industry, commonly used dimensions of data quality include:
Completeness – Is the data populated?
Validity – Is the data within the permitted range of values?
Accuracy – Does the data represent reality or a verifiable source?
Consistency – Is the same data consistent across different files/tables?
Timeliness – Is the data available when needed?
Accessibility – Is the data easily accessible, understandable and usable?
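To make the point that these dimensions can be clear, measurable and testable, here is a minimal sketch in Python. The record layout, field names and thresholds are entirely hypothetical, invented for illustration; real Solvency II data sets would of course be larger and the criteria agreed per data set.

```python
# Hypothetical policy records; "premium" may be missing or out of range.
records = [
    {"policy_id": "P001", "premium": 1200.0, "country": "UK"},
    {"policy_id": "P002", "premium": None,   "country": "UK"},
    {"policy_id": "P003", "premium": -50.0,  "country": "FR"},
]

def completeness(records, field):
    """Share of records in which the field is populated at all."""
    populated = sum(1 for r in records if r.get(field) is not None)
    return populated / len(records)

def validity(records, field, is_valid):
    """Share of populated values that fall within the permitted range."""
    values = [r[field] for r in records if r.get(field) is not None]
    return sum(1 for v in values if is_valid(v)) / len(values)

# Example quantitative criteria, as the guidance suggests assigning per data set:
print(completeness(records, "premium"))                 # 2 of 3 records populated
print(validity(records, "premium", lambda v: v >= 0))   # 1 of 2 populated values valid
```

A shared, organisation-wide definition would pin down exactly such measures, and the pass/fail thresholds for each data set, so that every team reports data quality the same way.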
Little did I know at the time I wrote the above blog post that a regulator would soon require organisations to demonstrate their understanding of data quality, and demonstrate that it is shared “undertaking wide”.
How might you demonstrate that your understanding of data quality is “undertaking-wide” and “common”?
You could demonstrate that multiple “data-dependent” processes have a shared understanding of data quality (processes such as CRM, Anti-Money Laundering, Anti-Fraud, Single View of Customer, etc.).
In the UK, the Pensions Regulator (tPR) has issued record-keeping requirements that require pension companies to measure and manage the quality of their schemes’ data. I believe the Solvency II “independent third party” will at least expect to see a common understanding of data quality shared between Solvency II and tPR programmes.
What do you think? Please share…