Major step forward in Data Quality Measurement

How tall are you?
What is the distance between Paris and Madrid?
How long should one cook a 4.5 kg turkey for – and at what temperature?

Quality data is key to a successful business. To manage data quality, you must measure it – Image courtesy of Pixabay.com

We can answer the above questions thanks to “standard dimensions”:

 

Height: Metres / Feet
Distance: Kilometres / Miles
Time: Hours & Minutes
Temperature: Degrees Celsius / Fahrenheit

Life would be impossible without the standard dimensions above, even though the presence of “alternative” standards such as metric versus imperial can cause complexity.

We measure things for a reason. Based on the measurements, we can make decisions and take action. Knowing our neck size enables us to decide which shirt size to choose. Knowing our weight and our waist size may encourage us to exercise more and perhaps eat less.

We measure data quality because poor data quality has a negative business impact that affects the bottom line.  Rectifying data quality issues requires more specific measurement than anecdotal evidence that data quality is “less than satisfactory”.

The great news is that 2013 marked a major step forward in the agreement of standard dimensions for data quality measurement.

In October 2013, following an 18-month consultative process, DAMA UK published a white paper entitled DAMA UK DQ Dimensions White Paper R3 7.

The white paper lists 6 standard data quality dimensions and provides worked examples (a small measurement sketch follows the list below). The 6 are:

1. Completeness
2. Uniqueness
3. Timeliness
4. Validity
5. Accuracy
6. Consistency
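To make the dimensions concrete, here is a minimal measurement sketch of my own (it is not taken from the white paper) that computes three of them – completeness, uniqueness and validity – for a small, hypothetical customer table using Python and pandas. The column names and the email validity rule are assumptions chosen purely for illustration.

```python
import pandas as pd

# Hypothetical customer records; the column names are assumptions for illustration
customers = pd.DataFrame({
    "customer_id": [101, 102, 103, 103, 105],
    "email": ["a@example.com", None, "not-an-email", "c@example.com", "d@example.com"],
})

# Completeness: what percentage of email values are populated?
completeness = customers["email"].notna().mean() * 100

# Uniqueness: what percentage of customer_id values appear only once?
uniqueness = (~customers["customer_id"].duplicated(keep=False)).mean() * 100

# Validity: what percentage of populated emails match a simple pattern?
populated_emails = customers["email"].dropna()
validity = populated_emails.str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$").mean() * 100

print(f"Completeness: {completeness:.0f}%")  # 80% - one email is missing
print(f"Uniqueness:   {uniqueness:.0f}%")    # 60% - customer_id 103 appears twice
print(f"Validity:     {validity:.0f}%")      # 75% - one populated email is malformed
```

Accuracy, consistency and timeliness generally need something external to compare against (a reference source, a second system or a timestamp), which is why the worked examples in the white paper are worth reading in full.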

The dimensions are not new. I referred to 5 of them in a 2009 blog post, There is little understanding among senior management of what “Data Quality” means.
The good news is that this white paper pulls together the thinking of many DQ professionals and provides a full explanation of the dimensions. More importantly, it emphasises the criticality of assessing the organisational impact of poor data quality. I include a quote below:

“Examples of organisational impacts could include:
• incorrect or missing email addresses would have a significant impact on any marketing campaigns
• inaccurate personal details may lead to missed sales opportunities or a rise in customer complaints
• goods can get shipped to the wrong locations
• incorrect product measurements can lead to significant transportation issues i.e. the product will not fit into a lorry, alternatively too many lorries may have been ordered for the size of the actual load
Data generally only has value when it supports a business process or organisational decision making.”

I would like to thank DAMA UK for publishing this white paper. I expect to refer to it regularly in my day-to-day work. It will help me build upon my thoughts in my blog post Do you know what’s in the data you’re consuming?

Hopefully regulators worldwide will refer to this paper when considering data quality management requirements.

Some excellent articles, blog posts and videos referring to this white paper include:

Nicola Askham – Data Quality Dimensions

3-2-1 Start Measuring Data Quality

Great Data Debate (2) Danger in Dimensions, Kenneth MacKinnon

How do you expect this paper will affect your work? Please share your thoughts. 

FSA imposes £2.4 million fine for inadequate risk reporting systems

London, 18th March 2013 – FSA imposes £2.4 million fine for inadequate risk reporting systems, which led to a failure to keep investors informed ahead of a profit warning that wiped 57% off the company’s share price. (See London Evening Standard: “Watchdog gets tougher as oil-rig firm Lamprell is fined £2.4 million over stock market breach”).

Oil services group Lamprell is not a bank. However, Lamprell could have avoided this fine if it had implemented the new BCBS principles for effective risk data aggregation and risk reporting practices (BCBS 239), published in January 2013. I describe these principles in a previous post, Data aggregation and reporting principles – applied common sense.

I include below some quotes from the article, and in parentheses, the relevant text from the BCBS 239 principles:

  • “The FSA said that monthly reports to the board had been totally inadequate for a company of its size and that such reports were delivered late.”
    (Principle 5: Timeliness. Paragraph 44 “A bank’s risk data aggregation capabilities should ensure that it is able to produce aggregate risk information on a timely basis to meet all risk management reporting requirements.”)
  • “It also said the takeover of a rival in 2011, which doubled Lamprell’s size, had left the company using too many different reporting systems.”
    (Principle 1: Governance. Paragraph 29. “A bank’s risk data aggregation capabilities and risk reporting practices should be… considered as part of any new initiatives, including acquisitions and/or divestitures… When considering a material acquisition, a bank’s due diligence process should assess the risk data aggregation capabilities and risk reporting practices of the acquired entity, as well as the impact on its own risk data aggregation capabilities and risk reporting practices. The impact on risk data aggregation should be considered explicitly by the board and inform the decision to proceed. The bank should establish a timeframe to integrate and align the acquired risk data aggregation capabilities and risk reporting practices within its own framework.”)

Tracey McDermott, FSA director of enforcement and financial crime, said: “Lamprell’s systems and controls may have been adequate at an earlier stage, but failed to keep pace with its growth. As a result they were seriously deficient for a listed company of its size and complexity, meaning it was unable to update the market on crucial financial information in a timely manner.”

The moral of the story… ensure your organisation, regardless of industry, applies the common sense set out in “Data aggregation and reporting principles (BCBS 239) – applied common sense“.

The growing demand for food and data provenance

In November 2012, I presented at the Data Management and Information Quality Europe 2012 conference, in London. My presentation was called Do you know what’s in the data you’re consuming?

In the presentation, I compare the data supply chain with the food supply chain.

I believe that data consumers have the right to be provided with facts about the content of the data they are consuming, just as food consumers are provided with facts about the food they are buying. The presentation provides guidelines on how you can improve your data supply chain.

Little did I realise that within 3 months the term “provenance” would be hitting the headlines due to the European horsemeat scandal.

There’s a silver lining in this food scandal for data quality management professionals. As financial regulators increasingly demand evidence of the provenance of the data provided to them, it is now easier for data quality management professionals to explain to their business colleagues and senior management what “data provenance” means, and what it requires. Retailers such as Tesco must have controls in their supply chain to ensure that the food they sell to consumers contains only “what it says on the tin”. Similarly, financial services organisations providing data to financial regulators must have controls in their data supply chain to ensure the quality of the data they provide can be trusted. Regulators are now asking these organisations to demonstrate exactly that: evidence of data provenance, applied to their critical or material data.

But what exactly is “data provenance”? The best definition I have seen comes from Michael Brackett in his excellent book “Data Resource Simplexity“.

“Data Provenance is provenance applied to the organisation’s data resource. The data provenance principle states that the source of data, how the data were captured, the meaning of the data when they were first captured, where the data were stored, the path of those data to the current location, how the data were moved along that path, and how those data were altered along that path must be documented to ensure the authenticity of those data and their appropriateness for supporting the business”.
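To show what acting on that definition might look like in practice, here is a rough sketch of my own (it is not from Brackett’s book, nor from any regulator) in which each element of the definition becomes a field in a provenance record attached to a critical data set. All names and values below are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ProvenanceRecord:
    """Documents a data set's journey, loosely following Brackett's definition."""
    source: str              # where the data originated
    captured_how: str        # how the data were captured
    original_meaning: str    # what the data meant when first captured
    storage_locations: list  # where the data have been stored
    movement_path: list      # how the data moved to their current location
    alterations: list        # how the data were altered along that path
    documented_at: datetime = field(default_factory=datetime.now)

# Hypothetical example for a counterparty exposure feed (all names are made up)
exposure_provenance = ProvenanceRecord(
    source="Trading system X",
    captured_how="Nightly batch extract of settled trades",
    original_meaning="Gross exposure per counterparty, in EUR",
    storage_locations=["staging database", "risk data warehouse"],
    movement_path=["ETL job risk_load_001"],
    alterations=["currency conversion to EUR", "aggregation to counterparty level"],
)
print(exposure_provenance)
```

However you choose to store it, the point is that each element of the definition is documented and can be produced on demand.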

Enjoy your “beef” burger!

Basel Committee issues “Principles for effective risk data aggregation and risk reporting (BCBS 239) – final document”

Today, 9th Jan 2013, the Basel Committee on Banking Supervision issued a press release announcing the final document “Principles for effective risk data aggregation and risk reporting (BCBS 239)”.

I wrote two posts on the consultation paper, when it was issued in June 2012.

1. Data aggregation and reporting principles (BCBS 239) – applied common sense

2. Risk data aggregation and risk reporting (BCBS 239) – Board and senior management responsibilities

I’m pleased to see that the final document retains the applied common sense approach set out in the June 2012 version. Working together with other independent data professionals, I provided feedback to the Basel Committee on the June consultation paper. Our feedback, together with that from banking organisations worldwide, was also published today – see feedback and comments. The final document has taken on board the feedback from all of the contributors and provided additional clarification where requested. For example, Annex 1 has been expanded considerably to explain all of the “Terms used in the document”.

Banks need to start working on compliance plans immediately. To quote paragraph 87 of the document: “National banking supervisors will start discussing implementation of the Principles with G-SIB’s senior management in early 2013. This will ensure that banks develop a strategy to meet the Principles by 2016.”

While compliance with the principles is mandatory for G-SIBs (Global Systemically Important Banks), the document makes it clear that all banks should implement the principles. Paragraph 15 states: “It is strongly suggested that national supervisors also apply these Principles to banks identified as D-SIBs (Domestic systemically important banks) by their national supervisors three years after their designation as D-SIBs.”

As I point out in Data aggregation and reporting principles (BCBS 239) – applied common sense, all organisations in all industries would benefit by applying these principles. Simply remove the word “Risk”, and one has “Principles for effective data aggregation and reporting”.

What do you think? Please post your views below. Join the debate.

The dog and the frisbee and data quality management

The Wall Street Journal reported it as the “Speech of the year“.

In a speech with the intriguing title “The dog and the frisbee“, Andrew Haldane, the Bank of England Director of Financial Stability, has questioned whether the Emperor (in the form of ever increasing, ever more complex regulations such as Solvency II, Basel III and Dodd-Frank) is naked. He points out that the Basel regulations, which have grown from 30 pages to over 600, completely failed to identify banks that were at risk of collapse, while a simple measure of a bank’s leverage ratio did identify them.

He also points out that “Dodd-Frank makes Glass-Steagall look like throat-clearing.” The Glass-Steagall Act of 1933, which separated commercial and investment banking, ran to a mere 37 pages; the Dodd-Frank Act of 2010 ran to 848 pages, and may spawn a further 30,000 pages of detailed rule-making by various agencies.

I recommend you read the speech yourself – his arguments, together with his wit, are superb. I include a brief extract below:

‘In the UK, regulatory reporting was introduced in 1974. Returns could have around 150 entries. In the Bank of England archives is a memo to George Blunden, who was to become Deputy Governor, on these proposed regulatory returns. Blunden’s handwritten comment reads: “I confess that I fear we are in danger of becoming excessively complicated and that if so we may miss the wood from the trees”.

Today, UK banks are required to fill in more than 7,500 separate cells of data – a fifty-fold rise. Forthcoming European legislation will cause a further multiplication. Banks across Europe could in future be required to fill in 30–50,000 data cells spread across 60 different regulatory forms. There will be less risk of regulators missing the wood from the trees, but only because most will have needed to be chopped down.’

Brilliant!

Andrew Haldane is calling for simpler, more basic rules. I agree with him.

I have worked in data management for over 30 years. The challenges I see today are the same challenges that arise time and time again. They are not specific to Solvency II, Basel or Dodd-Frank. They are universal. They apply to all critical data within all businesses.

The fundamental truth is: “The data is unique, but the data management principles are universal”.

It is time to stop writing specific data management and data quality management requirements into specific legislation.  Regulators should co-operate with the data management profession, via independent organisations such as DAMA International, to develop a common sense universal standard, and put the effort into improving such a standard.

What do you think? I welcome your comments.

Data aggregation and reporting principles – applied common sense

Principles for effective risk data aggregation and risk reporting

Basel Consultative Document
Data aggregation and reporting principles (BCBS 239)

Those of you familiar with my blog will know that I am a fan of common sense.

I believe that data quality management requires you to apply common sense principles and processes to your data. I believe that the same common sense principles apply regardless of the industry you are in.

Your data will be unique, but the common sense questions you must ask yourself will be the same.  They include:

  • What MI reports do we need to run our business?
  • What critical data do we need in our MI reports?
  • Who owns and is responsible for gathering the critical data we need in our MI reports?
  • What should our critical data contain?
  • What metrics do we have to verify our critical data contains what it should? (see the sketch after this list)
  • etc…
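One lightweight way to answer the last two questions is to write down, for each critical data element, what it should contain and the pass rate you expect, and then measure against that. The sketch below is purely illustrative; the element names, rules and thresholds are assumptions, not a standard.

```python
# Illustrative rules for critical data elements (names and thresholds are assumptions)
critical_data_rules = {
    "customer_id":  {"rule": "must be present and unique", "minimum_pass_rate": 100.0},
    "country_code": {"rule": "must be a valid ISO 3166-1 alpha-2 code", "minimum_pass_rate": 99.0},
    "exposure_eur": {"rule": "must be a non-negative number", "minimum_pass_rate": 100.0},
}

def report_metric(element: str, measured_pass_rate: float) -> str:
    """Compare a measured pass rate against the agreed threshold for a critical data element."""
    threshold = critical_data_rules[element]["minimum_pass_rate"]
    status = "OK" if measured_pass_rate >= threshold else "INVESTIGATE"
    return f"{element}: {measured_pass_rate:.1f}% pass (threshold {threshold}%) -> {status}"

# Made-up measured pass rates, for illustration only
print(report_metric("customer_id", 100.0))
print(report_metric("country_code", 97.2))
print(report_metric("exposure_eur", 100.0))
```

The value is not in the code itself but in forcing the business to agree, element by element, what “good” looks like and who is accountable when a metric falls below its threshold.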

Click on the image to see a document that lists what I regard as “common sense” data aggregation and reporting principles. They were published as a consultative document on 26th June 2012 by the Basel Committee on Banking Supervision (BCBS), and are commonly known as BCBS 239. The Committee invited comments from interested parties, which are available at http://www.bis.org/publ/bcbs222/comments.htm. I co-operated with a group of fellow independent data professionals to comment, and you may see our comments at http://www.bis.org/publ/bcbs222/idpg.pdf. You may see the final version at http://www.bis.org/publ/bcbs239.pdf. The largest banks in the world (known as Global Systemically Important Banks, or G-SIBs) must comply by January 2016. Other banks, designated Domestic Systemically Important Banks (D-SIBs), must reach compliance three years after the date on which they were so designated, which varies by bank; many received their designation during 2014.

While the document is targeted at risk management within the banking industry, the principles apply to all industries. The document explicitly refers to “Risk data aggregation and risk reporting” – I suggest you ignore the word risk and read it as “data aggregation and reporting principles”.

Over the next while I plan to explore some of the principles proposed in the document, and the practical challenges that arise when one seeks to implement common sense data quality management principles. I welcome your input. If you have a specific question, let me know – I will do my best to answer it.

Risk data aggregation and risk reporting – Board and senior management responsibilities

The Queen’s Speech and Data Governance

The Queen of England made an historic and welcome visit to Ireland in 2011.  She delivered a memorable speech at the Irish State banquet, in which she said “With the benefit of historical hindsight, we can all see things which we wish had been done differently, or not at all”.

In real life, we cannot change the past.  The same does not apply to data created in the past.  Regulators now expect financial institutions to:

  • Identify data quality mistakes made in the past
  • Correct material mistakes (see the sketch after this list)
  • Implement data governance controls to prevent recurrences
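For the second and third bullets, it helps to make every retrospective correction auditable, so you can show a regulator not just the corrected value, but what changed, why and when. A minimal sketch of my own follows; it is not a regulatory template, and all values are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataCorrection:
    """An auditable record of a correction applied to historical data (illustrative only)."""
    record_id: str
    field_name: str
    original_value: str
    corrected_value: str
    reason: str
    corrected_on: date

# Hypothetical example: an incomplete postcode found during an SCV data quality review
correction = DataCorrection(
    record_id="CUST-0042",
    field_name="postcode",
    original_value="SW1A",
    corrected_value="SW1A 1AA",
    reason="Incomplete postcode identified during SCV data quality review",
    corrected_on=date(2013, 3, 1),
)
print(correction)
```

A log of such corrections also supports the third bullet: it shows where controls were missing and where they now need to prevent recurrence.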

I quote from the UK Financial Regulator’s requirement that all deposit-holding financial institutions deliver a single customer view (SCV) of deposit holders: “There may be a number of reasons why SCV data is not 100% accurate. This might be due to defects in the systems used to compile the SCV, but we would expect such defects to be picked up and rectified during the course of the systems’ development.”

Dodd-Frank, Solvency II, FATCA, Basel III and many more regulations have similar requirements. Use this checklist to check whether your organisation suffers from any common Enterprise-Wide Data Governance Issues.

What data quality mistakes have you uncovered from the past, and how have you corrected them? I’d love to hear about them.