If Carlsberg did Data Governance

In this part of the world, we’re treated to wonderful TV ads from Carlsberg, with the theme “If Carlsberg did…, it would probably be the best… in the world”. One of my favourites is “If Carlsberg did Haircuts”.

This led me to think, what if Carlsberg did Data Governance?
Picture the scene… Your CEO is looking for a new report, and she has tasked you with providing it by close of business tomorrow! Where do you start?

In steps the “Data Waiter”, who presents you with a complete menu of the data in your organisation, suggests the data required for the new report and then prompts you with friendly questions like:

  • How would you like your data sliced?
    Would you like it by Geography, or Business line? Perhaps by Product, or by Customer Type?
  • What time period would you like your data to cover?
    For a light snack, I can recommend a “Point in Time” snapshot. Or perhaps you would like to include the past month? I can recommend the house special, which is a “Trend over time” for the past year.
  • How up to date would you like your data to be?
    The early-bird menu has some lovely data we captured 2 years ago. For a $10 supplement you can have data from 1 year ago. On the a la carte menu, you can choose from a wide range, from 6 months old to near-real-time.
  • How often would you like your data?
    Would you prefer a once-off or perhaps a weekly / monthly data extract? We do a lovely daily extract, or perhaps you would like real-time data-streaming?
  • What level of trust does your CEO need in the report you’re preparing?
    The early-bird menu has a fresh slice of big data. It’s a beautiful visualisation that looks really pretty – your CEO will love it. I’ve been assured that the data was so big that there’s no need to worry about the quality of it. (Editor note: Beware of Big Data Bullshit – look up “Veracity”, which is the critical, but often overlooked, 4th “V” of Big Data).
    If your CEO needs a higher level of trust in your report, we have a complete selection of data that we’ve traced from data entry to our own reporting data warehouse, complete with data quality metrics along the data supply chain.

Having selected the data you need, the data waiter scans your retina, confirms you have appropriate access authority, and then delivers the data to your preferred location. You prepare your report and your CEO is so delighted that she promotes you to the senior management team… Happy days! Scene ends.

What services would you like from “The best Data Governance in the world”?

For more about “Trust in data”, see my blog post “The growing demand for food and data provenance”.

This article originally appeared on LinkedIn Pulse

BCBS 239 compliance D-Day – Data Quality Risk Checklist

It’s that time of year again, when Santa Claus, the original Data Quality Risk Manager, makes his list and checks it twice.

[Image: risk signpost – BCBS 239 requires Data Quality Risk to be included in a bank’s overall Risk Framework]

For the largest 30 banks in the world (known as G-SIBs), teams of experts are making final preparations ahead of the BCBS 239 compliance D-Day, which is 1st Jan 2016.

Based on the BCBS 239 document, I’ve put together a “Data Quality Risk Checklist” that the bank’s board and senior management should sign off, after checking it twice:

  1. We have updated our Risk Management Framework to include the identification, assessment and management of data quality risks
  2. We consider risk data accuracy requirements analogous to accounting materiality
  3. Our data quality risk controls surrounding risk data are as robust as those we apply to accounting data
  4. We reconcile our risk data with our sources, including our accounting data sources where appropriate, to ensure that our risk data is accurate
  5. We’ve established data taxonomies across the banking group, which include information on the characteristics of the data (metadata), as well as the use of single identifiers and/or unified naming conventions for data including legal entities, counterparties, customers and accounts
  6. We have defined our data consistently across our organisation and we hold the concepts we use and our data definitions in a “dictionary”
  7. We’ve established roles and responsibilities as they relate to the ownership and quality of risk data and information
  8. Our business owners ensure that data is correctly entered by the relevant front office unit (at source), kept current and aligned with the data definitions
  9. We measure and monitor the completeness, accuracy, timeliness and integrity of all material risk data, and we have appropriate escalation channels and action plans in place to rectify poor data quality (a minimal sketch of such a check follows this list)
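
To make items 4 and 9 a little more tangible, here is a minimal Python sketch of the kind of reconciliation and escalation check a bank might run. It is purely illustrative: the portfolio names, figures and 0.5% materiality tolerance are my own assumptions, not anything prescribed by BCBS 239.

```python
# Illustrative sketch only: reconcile aggregated risk data against the
# accounting source (item 4) and flag material differences for escalation
# (items 2 and 9). Figures, field names and the tolerance are assumptions.

MATERIALITY_TOLERANCE = 0.005  # assumed 0.5% relative difference threshold

risk_totals = {"corporate_loans": 1002.5, "mortgages": 745.0, "derivatives": 310.2}
accounting_totals = {"corporate_loans": 1000.0, "mortgages": 745.0, "derivatives": 330.0}


def reconcile(risk: dict, accounting: dict, tolerance: float) -> list:
    """Return (item, issue) pairs where risk data is missing or materially different."""
    breaches = []
    for item, accounting_value in accounting.items():
        risk_value = risk.get(item)
        if risk_value is None:
            breaches.append((item, "missing from risk data"))  # completeness gap
            continue
        relative_diff = abs(risk_value - accounting_value) / accounting_value
        if relative_diff > tolerance:
            breaches.append((item, f"{relative_diff:.2%} difference vs. accounting"))
    return breaches


for item, issue in reconcile(risk_totals, accounting_totals, MATERIALITY_TOLERANCE):
    print(f"ESCALATE {item}: {issue}")  # feeds the escalation channel in item 9
```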

BCBS 239 is a paradigm shift in Data Quality Risk management thinking.

Note: Major banks outside the top 30 in the world (known as D-SIBs) have a little more breathing space. They will be required to comply with BCBS 239 within three years of being designated as a D-SIB by their national supervisor. They have the opportunity to learn from the experience of the first wave.

Socrates “beginning of wisdom” meets “fork handles” British humour

Socrates, the ancient Greek philosopher, tells us that “The beginning of wisdom is the definition of terms”. Perhaps Socrates was the first “Data Steward”, since the first step in data stewardship is also the definition of terms. I’m sure you’ve often seen instances of the confusion caused when the same term is used to describe different things. What does the term “Monthly Sales” mean? Is it the value or the volume of sales? It is impossible to tell. However, it is easy to clarify by adding a simple qualifier (a small glossary sketch follows the list below):

  1. Monthly Sales Value
  2. Monthly Sales Volume
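
A business glossary makes such qualifiers explicit. The snippet below is only an illustrative sketch of how the two qualified terms might be recorded; the structure, the owner role and the wording of the definitions are my own assumptions rather than any standard.

```python
# Illustrative business glossary entries; structure, owner and wording are assumptions.
glossary = {
    "Monthly Sales Value": {
        "definition": "Total monetary value of sales invoiced in the calendar month.",
        "unit": "EUR",
        "owner": "Head of Sales Operations",  # hypothetical data steward
    },
    "Monthly Sales Volume": {
        "definition": "Total number of units sold in the calendar month.",
        "unit": "units",
        "owner": "Head of Sales Operations",  # hypothetical data steward
    },
}

for term, entry in glossary.items():
    print(f"{term}: {entry['definition']} [unit: {entry['unit']}]")
```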

For a light-hearted look at the confusion that “unqualified” terms can bring, watch the British humour sketch below from “The Two Ronnies” – called “Fork Handles”. See if you can keep a straight face!

Data is the new oil – what grade is yours?

Bill Bryson’s book “One Summer: America 1927” provides a fascinating insight into the world of aviation in the “roaring 20s”. Aviators were vying to be the first to cross the Atlantic from New York to Paris, a challenge that claimed many lives, most of them European.

Bryson tells us: “The American flyers also had an advantage over their European counterparts that nobody yet understood. They all used aviation fuel from California, which burned more cleanly and gave better mileage. No one knew what made it superior because no one yet understood octane ratings – that would not come until the 1930s – but it was what got most American planes across the ocean while others were lost at sea.”

Once octane ratings were understood, fuel quality was measured and lives were saved.

We’ve all heard that data is the new oil. To benefit from this “new oil”, you must ensure you use “top grade” only. It can make the difference between business success and failure. It is also a prerequisite for regulatory compliance (Solvency II, FATCA, Dodd-Frank, Basel III, BCBS 239, etc.). Thankfully, like octane ratings, we know how to measure data quality using six primary dimensions: completeness, validity, accuracy, uniqueness, timeliness and consistency. For more details see my post: Major step forward in Data Quality Measurement.
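
To stretch the octane analogy, one could imagine rolling per-dimension quality scores up into a single “grade” for a dataset. The sketch below is purely illustrative; the scores, the simple average and the grade boundaries are my own assumptions, not an industry standard.

```python
# Purely illustrative "data octane rating": roll per-dimension scores (0-100)
# up into a coarse grade. Scores, weighting and boundaries are assumptions.

dimension_scores = {
    "completeness": 98.0,
    "validity": 95.0,
    "accuracy": 92.0,
    "uniqueness": 99.5,
    "timeliness": 90.0,
    "consistency": 96.0,
}


def data_grade(scores: dict) -> str:
    """Average the dimension scores and map the result to a coarse grade."""
    overall = sum(scores.values()) / len(scores)
    if overall >= 95:
        return f"Premium ({overall:.1f})"
    if overall >= 85:
        return f"Regular ({overall:.1f})"
    return f"Below spec ({overall:.1f})"


print(data_grade(dimension_scores))  # prints "Premium (95.1)"
```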

I also explore this topic in my post Russian Gas Pipe and Data Governance.

What happens in your organisation? Do you measure the quality of your most critical data, or do you fly on a wing and a prayer? Please add your comments below.

Major step forward in Data Quality Measurement

How tall are you?
What is the distance between Paris and Madrid?
How long should one cook a 4.5 kg turkey for – and at what temperature?

Quality data is key to a successful business. To manage data quality, you must measure it.

We can answer the above questions thanks to “standard dimensions”:

Height: Metres / Feet
Distance: Kilometres / Miles
Time: Hours & Minutes
Temperature: Degrees Celsius / Fahrenheit

Life would be impossible without the standard dimensions above, even though the presence of “alternative” standards such as metric vs. imperial can cause complexity.

We measure things for a reason. Based on the measurements, we can make decisions and take action. Knowing our neck size enables us to decide which shirt size to choose. Knowing our weight and our waist size may encourage us to exercise more and perhaps eat less.

We measure data quality because poor data quality has a negative business impact that affects the bottom line.  Rectifying data quality issues requires more specific measurement than anecdotal evidence that data quality is “less than satisfactory”.

The great news is that 2013 marked a major step forward in the agreement of standard dimensions for data quality measurement.

In October 2013, following an 18-month consultative process, DAMA UK published a white paper called “DAMA UK DQ Dimensions White Paper R3 7”.

The white paper lists six standard data quality dimensions and provides worked examples (a small illustrative measurement sketch follows the list below). The six are:

1. Completeness
2. Uniqueness
3. Timeliness
4. Validity
5. Accuracy
6. Consistency
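
To make a few of the dimensions concrete, here is a small Python sketch that measures completeness, uniqueness and validity over a handful of invented customer records. The records, the email rule and the calculations are illustrative assumptions only; the white paper’s own worked examples remain the reference.

```python
import re

# Invented sample records for illustration only.
customers = [
    {"id": 1, "email": "anne@example.com", "country": "IE"},
    {"id": 2, "email": "", "country": "FR"},
    {"id": 3, "email": "brian@example", "country": "ES"},
    {"id": 3, "email": "brian@example.com", "country": "ES"},  # duplicate id
]

EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # simplistic validity rule

total = len(customers)

# Completeness: proportion of records with a populated email.
completeness = sum(1 for c in customers if c["email"]) / total

# Uniqueness: proportion of distinct customer ids across all records.
uniqueness = len({c["id"] for c in customers}) / total

# Validity: proportion of populated emails that satisfy the (assumed) format rule.
populated = [c["email"] for c in customers if c["email"]]
validity = sum(1 for e in populated if EMAIL_PATTERN.match(e)) / len(populated)

print(f"Completeness: {completeness:.0%}")  # 75%
print(f"Uniqueness:   {uniqueness:.0%}")    # 75%
print(f"Validity:     {validity:.0%}")      # 67%
```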

The dimensions are not new. I referred to five of them in a 2009 blog post, “There is little understanding among senior management of what ‘Data Quality’ means”.
The good news is that this white paper pulls together the thinking of many DQ professionals and provides a full explanation of the dimensions. More importantly, it emphasises the criticality of assessing the organisational impact of poor data quality. I include a quote below:

“Examples of organisational impacts could include:
• incorrect or missing email addresses would have a significant impact on any marketing campaigns
• inaccurate personal details may lead to missed sales opportunities or a rise in customer complaints
• goods can get shipped to the wrong locations
• incorrect product measurements can lead to significant transportation issues i.e. the product will not fit into a lorry, alternatively too many lorries may have been ordered for the size of the actual load
Data generally only has value when it supports a business process or organisational decision making.”

I would like to thank DAMA UK for publishing this white paper. I expect to refer to it regularly in my day-to-day work. It will help me build upon my thoughts in my blog post Do you know what’s in the data you’re consuming?

Hopefully regulators worldwide will refer to this paper when considering data quality management requirements.

Some excellent articles / blog posts / videos referring to this white paper include:

Nicola Askham – Data Quality Dimensions

3-2-1 Start Measuring Data Quality

Great Data Debate (2) Danger in Dimensions, Kenneth MacKinnon

How do you expect this paper will affect your work? Please share your thoughts. 

Opportunity to apply lessons learnt in my new job

This week I started a new job as Head of Customer Information at Bank of Ireland in Dublin. I am excited at the prospect of applying the lessons I have learnt for the benefit of our customers.

I would like to take this opportunity to thank my fellow data management professionals worldwide for generously sharing their experience with me. I started to write this blog in 2009. My objective was to “Share my experience and seek to learn from the experience of others”. I have certainly learnt from the experience of others, and I hope to continue to do so.

The opinions I express on this blog will continue to be my own. I look forward to continuing to hear yours.

FSA imposes £2.4 million fine for inadequate risk reporting systems

London, 18th March 2013 – FSA imposes £2.4 million fine for inadequate risk reporting systems, which led to a failure to keep investors informed ahead of a profit warning that wiped 57% off the company’s share price. (See London Evening Standard: “Watchdog gets tougher as oil-rig firm Lamprell is fined £2.4 million over stock market breach”.)

Oil services group Lamprell is not a bank. However, Lamprell could have avoided this fine if it had implemented the new BCBS principles for effective risk data aggregation and risk reporting practices (BCBS 239), published in January 2013; principles which I describe in a previous post as Data aggregation and reporting principles – applied common sense.

I include below some quotes from the article, and in parentheses, the relevant text from the BCBS 239 principles (a small illustrative timeliness check follows the quotes):

  • “The FSA said that monthly reports to the board had been totally inadequate for a company of its size and that such reports were delivered late.”
    (Principle 5: Timeliness. Paragraph 44 “A bank’s risk data aggregation capabilities should ensure that it is able to produce aggregate risk information on a timely basis to meet all risk management reporting requirements.”)
  • “It also said the takeover of a rival in 2011, which doubled Lamprell’s size, had left the company using too many different reporting systems.”
    (Principle 1: Governance. Paragraph 29. “A bank’s risk data aggregation capabilities and risk reporting practices should be… Considered as part of any new initiatives, including acquisitions and/or divestitures… When considering a material acquisition, a bank’s due diligence process should assess the risk data aggregation capabilities and risk reporting practices of the acquired entity, as well as the impact on its own risk data aggregation capabilities and risk reporting practices. The impact on risk data aggregation should be considered explicitly by the board and inform the decision to proceed. The bank should establish a timeframe to integrate and align the acquired risk data aggregation capabilities and risk reporting practices within its own framework.”)
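
As a trivial illustration of the timeliness point (not something the FSA or BCBS prescribes), the sketch below flags monthly board reports that were delivered late or not at all; the reporting calendar and dates are my own invention.

```python
from datetime import date

# Invented example: flag monthly board reports delivered after their due date.
reports = [
    {"period": "2012-01", "due": date(2012, 2, 7), "delivered": date(2012, 2, 6)},
    {"period": "2012-02", "due": date(2012, 3, 7), "delivered": date(2012, 3, 26)},
    {"period": "2012-03", "due": date(2012, 4, 6), "delivered": None},  # never delivered
]

for report in reports:
    delivered = report["delivered"]
    if delivered is None:
        print(f"{report['period']}: MISSING report - escalate immediately")
    elif delivered > report["due"]:
        days_late = (delivered - report["due"]).days
        print(f"{report['period']}: LATE by {days_late} days")
    else:
        print(f"{report['period']}: delivered on time")
```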

Tracey McDermott, FSA director of enforcement and financial crime, said: “Lamprell’s systems and controls may have been adequate at an earlier stage, but failed to keep pace with its growth. As a result they were seriously deficient for a listed company of its size and complexity, meaning it was unable to update the market on crucial financial information in a timely manner.”

The moral of the story… ensure your organisation, regardless of your industry, applies the common sense set out in: “Data aggregation and reporting principles (BCBS 239) – applied common sense“.