If Carlsberg did Data Governance

In this part of the world, we’re treated to wonderful TV ads from Carlsberg, with the theme “If Carlsberg did…, it would probably be the best… in the world”. One of my favourites is “If Carlsberg did Haircuts”.

This led me to think, what if Carlsberg did Data Governance?
Picture the scene… Your CEO is looking for a new report, and she has tasked you with providing it by close of business tomorrow! Where do you start?

In steps the “Data Waiter”, who presents you with a complete menu of the data in your organisation, suggests the data required for the new report, and then prompts you with friendly questions like these (I’ve sketched the resulting “data order” after the menu):

  • How would you like your data sliced?
    Would you like it by Geography, or Business line? Perhaps by Product, or by Customer Type?
  • What time period would you like your data to cover?
    For a light snack, may I recommend a “Point in Time” snapshot? Or perhaps you would like to include the past month? I can also recommend the house special, which is a “Trend over time” for the past year.
  • How up to date would you like your data to be?
    The early-bird menu has some lovely data we captured 2 years ago. For a $10 supplement you can have data from 1 year ago. On the a la carte menu, you can choose from a wide range, from 6 months old to near-real-time.
  • How often would you like your data?
    Would you prefer a once-off or perhaps a weekly / monthly data extract? We do a lovely daily extract, or perhaps you would like real-time data-streaming?
  • What level of trust does your CEO need in the report you’re preparing?
    The early-bird menu has a fresh slice of big data. It’s a beautiful visualisation that looks really pretty – your CEO will love it. I’ve been assured that the data was so big that there’s no need to worry about its quality. (Editor’s note: beware of Big Data Bullshit – look up “Veracity”, the critical, but often overlooked, 4th “V” of Big Data.)
    If your CEO needs a higher level of trust in your report, we have a complete selection of data that we’ve traced from data entry to our own reporting data warehouse, complete with data quality metrics along the data supply chain.
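
If the Data Waiter wrote your order down, it might look something like the sketch below. This is purely illustrative: `DataOrder` and all of its field names and values are hypothetical labels for the menu choices above, not any real product’s API.

```python
from dataclasses import dataclass, field

# A hypothetical "order ticket" capturing the Data Waiter's questions.
# Every name here is invented for illustration.

@dataclass
class DataOrder:
    dataset: str                                       # which item from the data menu
    slice_by: list[str] = field(default_factory=list)  # e.g. Geography, Product
    period: str = "point_in_time"                      # or "past_month", "trend_past_year"
    freshness: str = "near_real_time"                  # or "6_months_old", "1_year_old", ...
    frequency: str = "once_off"                        # or "daily", "weekly", "streaming"
    trust_level: str = "traced"                        # "traced" = full lineage + quality metrics

order = DataOrder(
    dataset="sales",
    slice_by=["Geography", "Customer Type"],
    period="trend_past_year",
    frequency="once_off",
)
print(order)
```

The real point of the metaphor: each of these choices should be an explicit, recorded part of the request, rather than an assumption buried in an email thread.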

Having selected the data you need, the data waiter scans your retina, confirms you have appropriate access authority, and then delivers the data to your preferred location. You prepare your report and your CEO is so delighted that she promotes you to the senior management team… Happy days! Scene ends.

What services would you like from “The best Data Governance in the world”?

For more about “Trust in data” see my blog post “The growing demand for food and data provenance”.

This article originally appeared on LinkedIn Pulse

Gravitational Collapse in the PIM Space

All organisations have critical business processes. In today’s data-driven world, business success depends on the quality of data that underpins your critical business processes, and flows between them.

On 12th Feb 2016, the UK Prudential Regulation Authority (PRA) published a “Data Review” of how 50 firms in the Insurance Industry assess the quality of data that underpins the validity and integrity of one of their most important business processes: their Solvency II Internal Model.

Regardless of the industry you are in, I recommend that you read the review. It contains examples of good practice, but also many “war stories” showing where firms got it wrong. One recurring theme is the failure of firms to realise the dependency between upstream and downstream business processes, and the data flow between them.

On 21st Feb 2016, I read an excellent post by Henrik Liliendahl, which I have re-blogged here. Henrik’s post deals with Product Information Management (PIM) and is completely independent of the Insurance Industry and Solvency II. Yet Henrik has brilliantly summarised the relationship between upstream and downstream business processes (business partners), and the need for a “Data Broker Service” between them.

Liliendahl on Data Quality

The previous post on this blog was called Gravitational Waves in the MDM World. Building further on space science, I would like to use the concept of gravitational collapse, which is the process that happens when a star or other space object is born. In this process, a myriad of smaller objects are gathered into a more dense object.

PIM (Product Information Management) is part of the larger MDM (Master Data Management) world. The PIM solutions offered today serve very well the requirements for organizing and supporting the handling of product information inside each organization.

However, there is an instability when observing two trading partners. Today, the most common means of sharing product data is to exchange one or several spreadsheets with product identification and product attributes (sometimes also called properties or features). Such spreadsheets may also contain links to digital assets such as product images, line drawing documents, installation…


BCBS 239 compliance D-Day – Data Quality Risk Checklist

It’s that time of year again, when Santa Claus, the original Data Quality Risk Manager, makes his list and checks it twice.

BCBS 239 requires Data Quality Risk to be included in a bank’s overall Risk Framework.

For the largest 30 banks in the world (known as G-SIBs), teams of experts are making final preparations ahead of the BCBS 239 compliance D-Day, which is 1st Jan 2016.

Based on the BCBS 239 document, I’ve put together a “Data Quality Risk Checklist” that the bank’s board and senior management should sign off, after checking twice (a minimal measurement sketch follows the checklist):

  1. We have updated our Risk Management Framework to include the identification, assessment and management of data quality risks
  2. We consider risk data accuracy requirements analogous to accounting materiality
  3. Our data quality risk controls surrounding risk data are as robust as those we apply to accounting data
  4. We reconcile our risk data with our sources, including our accounting data sources where appropriate, to ensure that our risk data is accurate
  5. We’ve established data taxonomies across the banking group, which include information on the characteristics of the data (metadata), as well as the use of single identifiers and/or unified naming conventions for data including legal entities, counterparties, customers and accounts
  6. We have defined our data consistently across our organisation and we hold the concepts we use and our data definitions in a “dictionary”
  7. We’ve established roles and responsibilities as they relate to the ownership and quality of risk data and information
  8. Our business owners ensure that data is correctly entered by the relevant front office unit (at source), kept current and aligned with the data definitions
  9. We measure and monitor the completeness, accuracy, timeliness and integrity of all material risk data and we have appropriate escalation channels and action plans in place to rectify poor data quality
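
As a flavour of what item 9 could look like in practice, here is a minimal sketch in Python, assuming risk records arrive as simple dicts. The field names, the one-day age limit and the 98% escalation threshold are illustrative assumptions of mine; BCBS 239 does not prescribe them.

```python
from datetime import date, timedelta

REQUIRED_FIELDS = ["counterparty_id", "exposure", "as_of"]
MAX_AGE = timedelta(days=1)      # hypothetical timeliness requirement
ESCALATION_THRESHOLD = 0.98      # hypothetical tolerance before escalating

records = [
    {"counterparty_id": "CP001", "exposure": 1_250_000.0, "as_of": date.today()},
    {"counterparty_id": None,    "exposure": 400_000.0,   "as_of": date.today()},
    {"counterparty_id": "CP003", "exposure": 75_000.0,
     "as_of": date.today() - timedelta(days=30)},
]

def completeness(rows):
    """Share of rows where every required field is populated."""
    ok = sum(all(r.get(f) is not None for f in REQUIRED_FIELDS) for r in rows)
    return ok / len(rows)

def timeliness(rows):
    """Share of rows whose as-of date is within the allowed age."""
    ok = sum(date.today() - r["as_of"] <= MAX_AGE for r in rows)
    return ok / len(rows)

for name, metric in [("completeness", completeness), ("timeliness", timeliness)]:
    score = metric(records)
    status = "OK" if score >= ESCALATION_THRESHOLD else "ESCALATE"
    print(f"{name}: {score:.0%} -> {status}")
```

Both metrics fail on the sample data, which is exactly when the escalation channels and action plans in item 9 should kick in.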

BCBS 239 is a paradigm shift in Data Quality Risk management thinking.

Note: Major banks outside the top 30 in the world (known as D-SIBs) have a little more breathing space. They will be required to comply with BCBS 239 within three years of being designated as a D-SIB by their national supervisor. They have the opportunity to learn from the experience of the first wave.

How my passion can help you

Does your data tick the right boxes?

Would you base important business decisions on data you knew to be incomplete, inappropriate or inaccurate? Of course not. But how confident are you that your business-critical data is as complete, appropriate and accurate as you need it to be?

I’m passionate about the benefits that applying common-sense data quality management brings to the end-to-end data supply chain.

I help organisations to maximise the business value they derive from their business-critical data. I help identify which data generates the most value. I then help them to measure the quality of critical data, and to correct data quality problems that are impacting revenue, costs, customer experience or regulatory compliance.

Data Quality Management presents the same challenges to all businesses in all industries. The challenges include:

  • What is our most critical data?
  • Where is the data, where does it come from?
  • Who owns the data?
  • What should it contain?
  • What does it contain?
  • What should we do if our most critical data does not contain what it should?

I have faced and solved the above challenges and more; a small profiling sketch of the “What does it contain?” question follows.
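
A first practical step towards answering “What does it contain?” is simple data profiling. The sketch below is illustrative only; the sample records and column names are invented.

```python
from collections import Counter

# Tiny profiler: how populated is each column, and what does it hold?
records = [
    {"customer_type": "Retail",    "country": "IE"},
    {"customer_type": "Retail",    "country": None},
    {"customer_type": "Corporate", "country": "UK"},
]

for column in ["customer_type", "country"]:
    values = [r[column] for r in records if r[column] is not None]
    print(f"{column}: {len(values)}/{len(records)} populated, "
          f"most common: {Counter(values).most_common(2)}")
```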

Contact me today to see how I may be of help.

Automate or Obliterate, That is the Question

Liliendahl on Data Quality

Back in 1990, Michael Hammer wrote a famous article called Reengineering Work: Don’t Automate, Obliterate.

Indeed, while automation is a much-wanted outcome of Master Data Management (MDM) implementations and many other IT-enabled initiatives, you should always consider the alternative: eliminating (or simplifying). This often means thinking outside the box.

As an example, today I stumbled upon the Wikipedia explanation of Business Process Mapping. The example used is how to make breakfast (the food part):

[Image from the original post: Wikipedia’s business process mapping example for making breakfast]
You could think about different Business Process Re-engineering opportunities for that process. But you could also realize that this is an English / American breakfast. What about making a French breakfast instead? It would be as simple as:

Input money > Buy croissant > Fait accompli

PS: From the data quality and MDM world, one example of making a French breakfast instead of an English / American breakfast is examined in the post The…


Socrates “beginning of wisdom” meets “fork handles” British humour

Socrates, the ancient Greek philosopher, tells us that “The beginning of wisdom is the definition of terms”. Perhaps Socrates was the first “Data Steward”, since the first step in data stewardship is also the definition of terms. I’m sure you’ve often seen instances of the confusion caused when the same term is used to describe different things. What does the term “Monthly Sales” mean? Is it the value or the volume of sales? It is impossible to tell. However, it is easy to clarify by adding a simple qualifier (a small glossary sketch follows the list):

  1. Monthly Sales Value
  2. Monthly Sales Volume
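
In data stewardship terms, those qualified terms belong in a business glossary or data dictionary. Here is a toy sketch of one in Python. The definitions are invented for illustration; the useful behaviour is that an unqualified term is rejected as ambiguous rather than silently guessed at.

```python
# A toy business glossary in the spirit of Socrates: define your terms.
GLOSSARY = {
    "Monthly Sales Value": "Total monetary value of sales invoiced in the calendar month.",
    "Monthly Sales Volume": "Total number of units sold in the calendar month.",
}

def define(term: str) -> str:
    """Return a term's definition; refuse to guess at ambiguous ones."""
    if term in GLOSSARY:
        return GLOSSARY[term]
    matches = [t for t in GLOSSARY if t.startswith(term)]
    if matches:
        raise ValueError(f"{term!r} is ambiguous - did you mean one of {matches}?")
    raise KeyError(f"{term!r} is not in the glossary.")

print(define("Monthly Sales Value"))

try:
    define("Monthly Sales")   # exactly the confusion Socrates warned about
except ValueError as error:
    print(error)
```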

For a light-hearted look at the confusion that “unqualified” terms can bring, watch the British humour sketch below from “The Two Ronnies” – called “Fork Handles”. See if you can keep a straight face!

Santa’s secret tips for a successful “Secret Santa”

If you’ve ever organised a “Secret Santa” you’ll know that “data quality” is critical to its smooth running. Santa is the acknowledged world leader in data quality management, given his success managing the names and addresses of billions of children worldwide. He coined the data quality industry motto “Make a list, then check it twice”, which is a Critical Success Factor (CSF) to his “Naughty” and “Nice” segmentation process.

Santa has kindly shared some of his secret tips… In risk management terms, he tells us that we need to “manage the risk that the critical data required for the success of the (Secret Santa) programme is not fit for purpose”.

He suggests that we apply 4 of his 6 favourite data quality dimensions (a playful sketch in code follows the list):

  1. Completeness: Ensure you put a name on your gift
  2. Accuracy: Ensure you put the correct (accurate) name on your gift (check against the slip of paper you pulled out)
  3. Uniqueness: Ensure you put First Name and Surname on your gift (just in case there are two Johns, or Marvins or Oprahs)
  4. Timeliness: Ensure you deliver your gift, with its associated critical data, to the secret santa organiser in good time
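
For fun, here is how Santa’s four dimensions might look as a single check on a gift label. A throwaway sketch: the field names and the delivery deadline are invented for the occasion.

```python
from datetime import date

DEADLINE = date(2015, 12, 18)   # hypothetical organiser's cut-off

def check_gift(gift: dict, drawn_name: str) -> list[str]:
    """Return data quality findings for one Secret Santa gift label."""
    findings = []
    name = gift.get("recipient_name")
    if not name:
        findings.append("Completeness: put a name on your gift")
    elif name != drawn_name:
        findings.append("Accuracy: the name does not match the slip you drew")
    elif len(name.split()) < 2:
        findings.append("Uniqueness: add a surname, in case of two Johns")
    if gift.get("delivered_on") is None or gift["delivered_on"] > DEADLINE:
        findings.append("Timeliness: deliver to the organiser in good time")
    return findings

gift = {"recipient_name": "John", "delivered_on": date(2015, 12, 20)}
for finding in check_gift(gift, drawn_name="John Smith"):
    print(finding)
```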

Remember, Santa “knows if you’ve been bad or good”; it’s central to his “Data Governance programme”… So please… be good, for goodness’ sake.

Happy Christmas!