Automate or Obliterate, That is the Question

March 18, 2015

Originally posted on Liliendahl on Data Quality:

Back in 1990, Michael Hammer wrote a famous article called “Reengineering Work: Don’t Automate, Obliterate”.

Indeed, while automation is a much sought-after outcome of Master Data Management (MDM) implementations and many other IT-enabled initiatives, you should always consider the alternative: eliminating (or simplifying) the process altogether. This often means thinking outside the box.

As an example, I stumbled today upon the Wikipedia article on Business Process Mapping. The example used there is how to make breakfast (the food part):

[Figure: business process map for making breakfast]

You could think about different Business Process Re-engineering opportunities for that process. But you could also realize that this is an English / American breakfast. What about making a French breakfast instead? It would be as simple as:

Input money > Buy croissant > Fait accompli

PS: From the data quality and MDM world one example of making French breakfast instead of English / American breakfast is examined in the post The…



Socrates “beginning of wisdom” meets “fork handles” British humour

January 14, 2015

Socrates, the ancient Greek philosopher, tells us that “The beginning of wisdom is the definition of terms”. Perhaps Socrates was the first “Data Steward”, since the first step in data stewardship is also the definition of terms. I’m sure you’ve often seen instances of the confusion caused when the same term is used to describe different things. What does the term “Monthly Sales” mean? Is it the value or the volume of sales? It is impossible to tell. However, it is easy to clarify by adding a simple qualifier (a small illustrative sketch follows the list):

  1. Monthly Sales Value
  2. Monthly Sales Volume
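
For illustration only, here is a minimal sketch (in Python, with invented terms and definitions) of how a business glossary might record qualified terms, so that an unqualified “Monthly Sales” is flagged rather than guessed at:

```python
# A tiny, illustrative business glossary: each qualified term gets an
# agreed definition and a unit of measure (all entries are invented).
glossary = {
    "Monthly Sales Value": {
        "definition": "Total monetary value of sales invoiced in the calendar month",
        "unit": "EUR",
    },
    "Monthly Sales Volume": {
        "definition": "Total number of units sold in the calendar month",
        "unit": "units",
    },
}

def define(term: str) -> str:
    """Return the agreed definition of a term, or flag it as unqualified."""
    entry = glossary.get(term)
    if entry is None:
        return f"'{term}' is not a defined term - please add a qualifier."
    return f"{term}: {entry['definition']} ({entry['unit']})"

print(define("Monthly Sales"))        # unqualified - ambiguous, so flagged
print(define("Monthly Sales Value"))  # qualified - unambiguous
```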

For a light-hearted look at the confusion that “unqualified” terms can bring, watch the British humour sketch below from “The Two Ronnies” – called “Fork Handles”. See if you can keep a straight face!


Santa’s secret tips for a successful “Secret Santa”

December 21, 2014

If you’ve ever organised a “Secret Santa” you’ll know that “data quality” is critical to its smooth running. Santa is the acknowledged world leader in data quality management, given his success managing the names and addresses of billions of children worldwide. He coined the data quality industry motto “Make a list, then check it twice”, which is a Critical Success Factor (CSF) for his “Naughty” and “Nice” segmentation process.

Santa has kindly shared some of his secret tips… In risk management terms, he tells us that we need to “manage the risk that the critical data required for the success of the (Secret Santa) programme is not fit for purpose”.

He suggests that we apply 4 of his 6 favourite data quality dimensions (a small illustrative check in code follows the list):

  1. Completeness: Ensure you put a name on your gift
  2. Accuracy: Ensure you put the correct (accurate) name on your gift (check against the slip of paper you pulled out)
  3. Uniqueness: Ensure you put First Name and Surname on your gift (just in case there are two Johns, or Marvins or Oprahs)
  4. Timeliness: Ensure you deliver your gift, with its associated critical data, to the Secret Santa organiser in good time
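
Purely by way of illustration, here is a minimal Python sketch of those four checks applied to a gift list; the records, field names and deadline below are invented, not Santa’s:

```python
from datetime import date

# Invented example data: one record per gift handed to the organiser.
gifts = [
    {"first_name": "John", "surname": "Smith", "name_on_gift": "John Smith",
     "slip_name": "John Smith", "delivered_on": date(2014, 12, 18)},
    {"first_name": "John", "surname": None, "name_on_gift": "John",
     "slip_name": "John Murphy", "delivered_on": date(2014, 12, 23)},
]
deadline = date(2014, 12, 19)  # invented hand-in date

for gift in gifts:
    issues = []
    # 1. Completeness: a name must be on the gift.
    if not gift["name_on_gift"]:
        issues.append("no name on gift")
    # 2. Accuracy: the name must match the slip of paper drawn.
    if gift["name_on_gift"] != gift["slip_name"]:
        issues.append("name does not match the slip")
    # 3. Uniqueness: first name and surname are both needed to tell two Johns apart.
    if not (gift["first_name"] and gift["surname"]):
        issues.append("first name and surname not both present")
    # 4. Timeliness: the gift must reach the organiser by the deadline.
    if gift["delivered_on"] > deadline:
        issues.append("delivered too late")
    print(gift["name_on_gift"] or "<unnamed>", "->", issues or "all checks passed")
```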

Remember, Santa “knows if you’ve been bad or good”; it’s central to his “Data Governance programme”… So please… be good for goodness sake.

Happy Christmas!

Ken


Data is the new oil – what grade is yours?

February 25, 2014

Bill Bryson’s most recent book “One Summer: America 1927” provides a fascinating insight into the world of aviation in the Roaring Twenties. Aviators were vying to be the first to cross the Atlantic from New York to Paris, a challenge that took many lives, most of them European.

Bryson tells us: “The American flyers also had an advantage over their European counterparts that nobody yet understood. They all used aviation fuel from California, which burned more cleanly and gave better mileage. No one knew what made it superior because no one yet understood octane ratings – that would not come until the 1930s – but it was what got most American planes across the ocean while others were lost at sea.”

Once octane ratings were understood, fuel quality was measured and lives were saved.

We’ve all heard that data is the new oil. To benefit from this “new oil”, you must ensure you use “top grade” only. It can make the difference between business success and failure. It is also a prerequisite for regulatory compliance (Solvency II, FATCA, Dodd-Frank, Basel III, etc.). Thankfully, like octane ratings, we know how to measure data quality using 6 primary dimensions: completeness, validity, accuracy, uniqueness, timeliness and consistency. For more details see my post: Major step forward in Data Quality Measurement.

I also explore this topic in my post Russian Gas Pipe and Data Governance.

What happens in your organisation? Do you measure the quality of your most critical data, or do you fly on a wing and a prayer? Please add your comments below.


Major step forward in Data Quality Measurement

January 2, 2014

How tall are you?
What is the distance between Paris and Madrid?
How long should one cook a 4.5kg turkey for – and at what temperature?

Quality data is key to a successful business. To manage data quality, you must measure it.


We can answer the above questions thanks to “standard dimensions”:

Height: Metres / Feet
Distance: Kilometres / Miles
Time: Hours & Minutes
Temperature: Degrees Celsius / Fahrenheit

Life would be impossible without the standard dimensions above, even though the presence of “alternate” standards such as metric vs imperial can cause complexity.

We measure things for a reason. Based on the measurements, we can make decisions and take action. Knowing our neck size enables us to decide which shirt size to choose. Knowing our weight and our waist size may encourage us to exercise more and perhaps eat less.

We measure data quality because poor data quality has a negative business impact that affects the bottom line.  Rectifying data quality issues requires more specific measurement than anecdotal evidence that data quality is “less than satisfactory”.

The great news is that 2013 marked a major step forward in the agreement of standard dimensions for data quality measurement.

In October 2013, following an 18-month consultative process, DAMA UK published a white paper called “DAMA UK DQ Dimensions White Paper R3 7”.

The white paper lists 6 standard data quality dimensions and provides worked examples; a small illustrative calculation follows the list. The 6 are:

1. Completeness
2. Uniqueness
3. Timeliness
4. Validity
5. Accuracy
6. Consistency
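
To show how such dimensions become numbers, here is a minimal Python sketch that profiles an invented customer file for two of them, completeness and uniqueness; the records, field names and exact scoring rules are illustrative assumptions, not taken from the white paper:

```python
from collections import Counter

# Invented sample records: a small customer file to be profiled.
customers = [
    {"id": 1, "email": "anne@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "anne@example.com"},
]

total = len(customers)

# Completeness: proportion of records where the critical field is populated.
populated = [c["email"] for c in customers if c["email"]]
completeness = len(populated) / total                       # 2 of 3 records -> ~67%

# Uniqueness: proportion of populated values that appear only once in the file.
counts = Counter(populated)
unique = sum(1 for value in populated if counts[value] == 1)
uniqueness = unique / len(populated) if populated else 1.0  # duplicated address -> 0%

print(f"Completeness (email): {completeness:.0%}")
print(f"Uniqueness (email):   {uniqueness:.0%}")
```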

The dimensions are not new. I referred to 5 of them in a 2009 blog post: There is little understanding among senior management of what “Data Quality” means.
The good news is that this white paper pulls together the thinking of many DQ professionals and provides a full explanation of the dimensions. More importantly, it emphasises the criticality of assessing the organisational impact of poor data quality. I include a quote below:

“Examples of organisational impacts could include:
• incorrect or missing email addresses would have a significant impact on any marketing campaigns
• inaccurate personal details may lead to missed sales opportunities or a rise in customer complaints
• goods can get shipped to the wrong locations
• incorrect product measurements can lead to significant transportation issues i.e. the product will not fit into a lorry, alternatively too many lorries may have been ordered for the size of the actual load
Data generally only has value when it supports a business process or organisational decision making.”

I would like to thank DAMA UK for publishing this whitepaper. I expect to refer to it regularly in my day to day work. It will help me build upon my thoughts in my blog post Do you know what’s in the data you’re consuming?

Hopefully regulators worldwide will refer to this paper when considering data quality management requirements.

Some excellent articles / blog posts / videos referring to this whitepaper include:

Nicola Askham – Data Quality Dimensions

3-2-1 Start Measuring Data Quality

Great Data Debate (2) Danger in Dimensions, Kenneth MacKinnon

How do you expect this paper will affect your work? Please share your thoughts. 


Opportunity to apply lessons learnt in my new job

April 13, 2013

This week I started a new job as Head of Customer Information at Bank of Ireland in Dublin. I am excited at the prospect of applying the lessons I have learnt for the benefit of our customers.

I would like to take this opportunity to thank my fellow data management professionals worldwide for generously sharing their experience with me. I started to write this blog in 2009. My objective was to “Share my experience and seek to learn from the experience of others”. I have certainly learnt from the experience of others, and I hope to continue to do so.

The opinions I express on this blog will continue to be my own. I look forward to continuing to hear yours.


FSA imposes £2.4 million fine for inadequate risk reporting systems

March 18, 2013

London, 18th March 2013 – the FSA imposes a £2.4 million fine for inadequate risk reporting systems, which led to a failure to keep investors informed ahead of a profit warning that wiped 57% off the company’s share price. (See London Evening Standard: “Watchdog gets tougher as oil-rig firm Lamprell is fined £2.4 million over stock market breach”.)

Oil services group Lamprell is not a bank. However, Lamprell could have avoided this fine if it had implemented the new BCBS principles for effective risk data aggregation and risk reporting practices, published in January 2013 – principles which I describe in a previous post as “Data aggregation and reporting principles – applied common sense”.

I include below some quotes from the article, and in parentheses, the relevant text from the BCBS principles:

  • “The FSA said that monthly reports to the board had been totally inadequate for a company of its size and that such reports were delivered late.”
    (Principle 5: Timeliness. Paragraph 44: “A bank’s risk data aggregation capabilities should ensure that it is able to produce aggregate risk information on a timely basis to meet all risk management reporting requirements.”)

  • “It also said the takeover of a rival in 2011, which doubled Lamprell’s size, had left the company using too many different reporting systems.”
    (Principle 1: Governance. Paragraph 29: “A bank’s risk data aggregation capabilities and risk reporting practices should be… Considered as part of any new initiatives, including acquisitions and/or divestitures… When considering a material acquisition, a bank’s due diligence process should assess the risk data aggregation capabilities and risk reporting practices of the acquired entity, as well as the impact on its own risk data aggregation capabilities and risk reporting practices. The impact on risk data aggregation should be considered explicitly by the board and inform the decision to proceed. The bank should establish a timeframe to integrate and align the acquired risk data aggregation capabilities and risk reporting practices within its own framework.”)

Tracey McDermott, FSA director of enforcement and financial crime, said: “Lamprell’s systems and controls may have been adequate at an earlier stage, but failed to keep pace with its growth. As a result they were seriously deficient for a listed company of its size and complexity, meaning it was unable to update the market on crucial financial information in a timely manner.”

The moral of the story… ensure your organisation, regardless of your industry, applies the common sense set out in: “Data aggregation and reporting principles – applied common sense”.

 

