Data is the new oil – what grade is yours?

February 25, 2014

Bill Bryson’s most recent book, “One Summer: America 1927”, provides a fascinating insight into the world of aviation in the “Roaring Twenties”. Aviators were vying to be the first to cross the Atlantic from New York to Paris, a challenge that took many lives, most of them European.

Bryson tells us: “The American flyers also had an advantage over their European counterparts that nobody yet understood. They all used aviation fuel from California, which burned more cleanly and gave better mileage. No one knew what made it superior because no one yet understood octane ratings – that would not come until the 1930s – but it was what got most American planes across the ocean while others were lost at sea.”

Once octane ratings were understood, fuel quality was measured and lives were saved.

We’ve all heard that data is the new oil. To benefit from this “new oil”, you must ensure you use “top grade” only. It can make the difference between business success and failure. It is also a prerequisite for regulatory compliance (Solvency II, FATCA, Dodd-Frank, Basel III, etc.). Thankfully, like octane ratings, we know how to measure data quality using 6 primary dimensions: completeness, validity, accuracy, uniqueness, timeliness and consistency. For more details, see my post “Major step forward in Data Quality Measurement”.

I also explore this topic in my post “Russian Gas Pipe and Data Governance”.

What happens in your organisation? Do you measure the quality of your most critical data, or do you fly on a wing and a prayer? Please add your comments below.


Major step forward in Data Quality Measurement

January 2, 2014

How tall are you?
What is the distance between Paris and Madrid?
How long should one cook a 4.5 kg turkey for – and at what temperature?

Quality data is key to a successful business. To manage data quality, you must measure it.


We can answer the above questions thanks to “standard dimensions”:

Height: Metres / Feet
Distance: Kilometres / Miles
Time: Hours & Minutes
Temperature: Degrees Celsius / Fahrenheit

Life would be impossible without the standard dimensions above, even though the presence of “alternative” standards, such as metric vs. imperial, can cause complexity.

We measure things for a reason. Based on the measurements, we can make decisions and take action. Knowing our neck size enables us to decide which shirt size to choose. Knowing our weight and our waist size may encourage us to exercise more and perhaps eat less.

We measure data quality because poor data quality has a negative business impact that affects the bottom line.  Rectifying data quality issues requires more specific measurement than anecdotal evidence that data quality is “less than satisfactory”.

The great news is that 2013 marked a major step forward in the agreement of standard dimensions for data quality measurement.

In October 2013, following an 18-month consultative process, DAMA UK published a white paper called “DAMA UK DQ Dimensions White Paper R3 7”.

The white paper lists 6 standard data quality dimensions and provides worked examples; a simple measurement sketch follows the list below. The 6 are:

1. Completeness
2. Uniqueness
3. Timeliness
4. Validity
5. Accuracy
6. Consistency
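
To make these dimensions concrete, here is a minimal sketch in Python of how four of them might be scored against a small set of customer records. The records, the email format rule and the 30-day timeliness window are my own invented illustrations, not examples from the white paper; accuracy and consistency are omitted because they normally require comparison against a reference source or a second system.

    import re
    from datetime import date

    # Hypothetical customer extract; field names and values are invented for illustration.
    customers = [
        {"id": 1, "email": "anna@example.com",  "updated": date(2013, 12, 30)},
        {"id": 2, "email": None,                "updated": date(2013, 11, 2)},
        {"id": 3, "email": "brian@example",     "updated": date(2013, 12, 28)},
        {"id": 3, "email": "brian@example.com", "updated": date(2013, 12, 28)},
    ]

    total = len(customers)
    as_of = date(2013, 12, 31)
    email_rule = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # simple format rule, not full validation

    # Completeness: proportion of records with a populated email address.
    completeness = sum(1 for c in customers if c["email"]) / total

    # Uniqueness: distinct customer ids relative to the number of records.
    uniqueness = len({c["id"] for c in customers}) / total

    # Validity: proportion of populated emails that satisfy the agreed format rule.
    populated = [c["email"] for c in customers if c["email"]]
    validity = sum(1 for e in populated if email_rule.match(e)) / len(populated)

    # Timeliness: proportion of records refreshed within an agreed 30-day window.
    timeliness = sum(1 for c in customers if (as_of - c["updated"]).days <= 30) / total

    for name, score in [("Completeness", completeness), ("Uniqueness", uniqueness),
                        ("Validity", validity), ("Timeliness", timeliness)]:
        print(f"{name}: {score:.0%}")

Run against these sample records, the scores point directly at the remediation effort required: a duplicate customer id drags uniqueness down to 75%, and one malformed email address leaves validity at 67%.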

The dimensions are not new. I referred to 5 of them in a 2009 blog post, “There is little understanding among senior management of what ‘Data Quality’ means”.
The good news is that this white paper pulls together the thinking of many DQ professionals and provides a full explanation of the dimensions. More importantly, it emphasises the criticality of assessing the organisational impact of poor data quality. I include a quote below:

“Examples of organisational impacts could include:
• incorrect or missing email addresses would have a significant impact on any marketing campaigns
• inaccurate personal details may lead to missed sales opportunities or a rise in customer complaints
• goods can get shipped to the wrong locations
• incorrect product measurements can lead to significant transportation issues i.e. the product will not fit into a lorry, alternatively too many lorries may have been ordered for the size of the actual load
Data generally only has value when it supports a business process or organisational decision making.”

I would like to thank DAMA UK for publishing this white paper. I expect to refer to it regularly in my day-to-day work. It will help me build upon my thoughts in my blog post “Do you know what’s in the data you’re consuming?”.

Hopefully regulators worldwide will refer to this paper when considering data quality management requirements.

Some excellent articles, blog posts and videos referring to this white paper include:

Nicola Askham – Data Quality Dimensions

3-2-1 Start Measuring Data Quality

Great Data Debate (2) Danger in Dimensions, Kenneth MacKinnon

How do you expect this paper will affect your work? Please share your thoughts. 


Opportunity to apply lessons learnt in my new job

April 13, 2013

This week I started a new job as Head of Customer Information at Bank of Ireland in Dublin. I am excited at the prospect of applying the lessons I have learnt for the benefit of our customers.

I would like to take this opportunity to thank my fellow data management professionals worldwide for generously sharing their experience with me. I started to write this blog in 2009. My objective was to “Share my experience and seek to learn from the experience of others”. I have certainly learnt from the experience of others, and I hope to continue to do so.

The opinions I express on this blog will continue to be my own. I look forward to continuing to hear yours.


FSA imposes £2.4 million fine for inadequate risk reporting systems

March 18, 2013

London, 18th March 2013 – The FSA has imposed a £2.4 million fine for inadequate risk reporting systems, which led to a failure to keep investors informed ahead of a profit warning that wiped 57% off the company’s share price (see London Evening Standard: “Watchdog gets tougher as oil-rig firm Lamprell is fined £2.4 million over stock market breach”).

Oil services group Lamprell is not a bank. However, Lamprell could have avoided this fine if it had implemented the new BCBS principles for effective risk data aggregation and risk reporting, published in January 2013. I describe these principles in a previous post, “Data aggregation and reporting principles – applied common sense”.

I include below some quotes from the article, and in parentheses, the relevant text from the BCBS principles:

  • “The FSA said that monthly reports to the board had been totally inadequate for a company of its size and that such reports were delivered late.”
    (Principle 5: Timeliness. Paragraph 44: “A bank’s risk data aggregation capabilities should ensure that it is able to produce aggregate risk information on a timely basis to meet all risk management reporting requirements.”)

  • “It also said the takeover of a rival in 2011, which doubled Lamprell’s size, had left the company using too many different reporting systems.”
    (Principle 1: Governance. Paragraph 29: “A bank’s risk data aggregation capabilities and risk reporting practices should be… Considered as part of any new initiatives, including acquisitions and/or divestitures… When considering a material acquisition, a bank’s due diligence process should assess the risk data aggregation capabilities and risk reporting practices of the acquired entity, as well as the impact on its own risk data aggregation capabilities and risk reporting practices. The impact on risk data aggregation should be considered explicitly by the board and inform the decision to proceed. The bank should establish a timeframe to integrate and align the acquired risk data aggregation capabilities and risk reporting practices within its own framework.”)

Tracey McDermott, FSA director of enforcement and financial crime, said: “Lamprell’s systems and controls may have been adequate at an earlier stage, but failed to keep pace with its growth. As a result they were seriously deficient for a listed company of its size and complexity, meaning it was unable to update the market on crucial financial information in a timely manner.”

The moral of the story: ensure your organisation, regardless of industry, applies the common sense set out in “Data aggregation and reporting principles – applied common sense”.

 


The growing demand for food and data provenance

February 26, 2013

Last November, I presented in London at the Data Management and Information Quality Europe 2012 conference. My presentation was called “Do you know what’s in the data you’re consuming?”.

In the presentation, I compare the data supply chain with the food supply chain.

I believe that data consumers have the right to be provided with facts about the content of the data they are consuming, just as food consumers are provided with facts about the food they are buying. The presentation provides guidelines on how you can improve your data supply chain.

Little did I realise in November that within three months the term “provenance” would be hitting the headlines as the European horsemeat scandal grew.

There’s a silver lining in this food scandal for data quality management professionals. As financial regulators increasingly demand evidence of the provenance of the data provided to them, it is now easier for data quality management professionals to explain to their business colleagues and senior management what “data provenance” means, and what it requires. Retailers, such as Tesco, must have controls in their supply chain to ensure that the food they sell to consumers contains only “what it says on the tin”. Similarly, financial services organisations providing data to financial regulators must have controls in their data supply chain that ensure the quality of the data they provide can be trusted. Regulators are now asking these organisations to demonstrate that their data supply chain can be trusted: to provide evidence of data provenance for their critical or material data.

But what exactly is “data provenance”? The best definition I have seen comes from Michael Brackett in his excellent book “Data Resource Simplexity”.

“Data Provenance is provenance applied to the organisation’s data resource. The data provenance principle states that the source of data, how the data were captured, the meaning of the data when they were first captured, where the data were stored, the path of those data to the current location, how the data were moved along that path, and how those data were altered along that path must be documented to ensure the authenticity of those data and their appropriateness for supporting the business”.
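
For readers who think in data structures, the sketch below is my own illustration of Brackett’s definition, not something taken from his book; the field names, system names and timings are invented. It simply records each element of the definition alongside a critical data item.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class ProvenanceStep:
        """One hop in the data's journey: where it moved, how, and what was altered."""
        moved_to: str       # target system or data store
        moved_how: str      # e.g. "nightly batch ETL", "manual upload"
        altered: str        # description of any transformation applied, or "none"
        when: datetime

    @dataclass
    class ProvenanceRecord:
        """Documentation of a data element's provenance, element by element."""
        source: str                 # where the data originally came from
        captured_how: str           # how the data were captured
        meaning_at_capture: str     # what the data meant when first captured
        original_store: str         # where the data were first stored
        path: List[ProvenanceStep] = field(default_factory=list)

    # Illustrative only: system names and timings are invented.
    customer_address = ProvenanceRecord(
        source="Branch account-opening form",
        captured_how="Keyed into the branch front-office system by branch staff",
        meaning_at_capture="Customer's residential address at the time of account opening",
        original_store="FrontOfficeDB.customer_address",
        path=[
            ProvenanceStep("CustomerMDM.address", "nightly batch ETL",
                           "standardised to postal-authority format", datetime(2013, 2, 1, 2, 0)),
            ProvenanceStep("RegulatoryMart.address", "weekly extract",
                           "none", datetime(2013, 2, 4, 6, 0)),
        ],
    )

However the record is kept, the principle is the same: if any step in that chain is undocumented, the authenticity of the data, and its appropriateness for supporting the business, cannot be demonstrated.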

Enjoy your “beef” burger!


The link between horse meat in beef burgers and data quality management

January 16, 2013

It was reported today that horse DNA was detected in tests performed on frozen “beef” burgers sold in a number of UK and Irish supermarkets. This has come as a shock to consumers, who assumed that quality controls were in place to ensure that food contains only what it says on the label. It appears the quality controls did not include a specific test for the presence of horse meat.


In an earlier blog post, I asked “Do you know what’s in the data you’re consuming?” In that post, I proposed that, as data consumers, we have the right to expect facts about the business-critical data we consume – just as food consumers are provided with nutritional facts. Today’s news reminds us to be clear about the data quality facts we ask for.

The old adage applies: “If you don’t measure, you can’t manage”.


Basel Committee issues “Principles for effective risk data aggregation and risk reporting – final document”

January 9, 2013

Today, 9th January 2013, the Basel Committee on Banking Supervision issued a press release announcing the final document, “Principles for effective risk data aggregation and risk reporting”.

I wrote two posts on the consultation paper, when it was issued in June 2012.

1. Data aggregation and reporting principles – applied common sense

2. Risk data aggregation and risk reporting – Board and senior management responsibilities

I’m pleased to see that the final document retains the applied common sense approach set out in the June 2012 version. Working together with other independent data professionals, I provided feedback to the Basel Committee on the June consultation paper. Our feedback, together with that from banking organisations worldwide, was also published today – see feedback and comments. The final document has taken on board the feedback from all of the contributors and provided additional clarification where requested. For example, Annex 1 has been expanded considerably to explain all of the “Terms used in the document”.

Banks need to start working on compliance plans immediately. To quote paragraph 87 of the document: “National banking supervisors will start discussing implementation of the Principles with G-SIB’s senior management in early 2013. This will ensure that banks develop a strategy to meet the Principles by 2016.”

While compliance with the principles is mandatory for G-SIBs (Globally Systemically Important Banks), the document makes it clear that all banks should implement the principles. Paragraph 15 states: “It is strongly suggested that national supervisors also apply these Principles to banks identified as D-SIBs (Domestic systemically important banks) by their national supervisors three years after their designation as D-SIBs.”

As I point out in “Data aggregation and reporting principles – applied common sense”, all organisations in all industries would benefit from applying these principles. Simply remove the word “Risk”, and one has “Principles for effective data aggregation and reporting”.

What do you think? Please post your views below. Join the debate.

