How to cost effectively deliver a Single Customer View

Many have tried, and many have failed, to deliver a “Single Customer View”.  Well, now it’s a regulatory requirement – at least for UK Deposit Takers (banks, building societies, etc.).

The requirement to deliver a Single Customer View of eligible deposit holders indirectly affects every man, woman and child in the UK.  Their deposits, large or small, are covered by the UK Deposit Guarantee Scheme.  This scheme played a key role in maintaining confidence in the banking system during the dark days of the world financial crisis.

UK Deposit Takers must not only deliver the required Single Customer View (SCV) data, they must also provide clear evidence of the data quality processes and controls they use to deliver and verify that data.

The deadline for compliance is challenging.  Plans must be submitted to the regulator by July 2010, and the SCV must be built and verified by January 2011.

To help UK Deposit Takers, I have written an e-book explaining how to cost effectively deliver a Single Customer View.  You may download it free from the DataQualityPro website:

While the document specifically addresses the UK Financial Services requirement for a Single Customer View, the process steps will help anyone planning a major data migration or data population project.

If you are in any doubt about the need for good data quality management processes when delivering any new system (e.g. Single Customer View, Solvency II), read the excellent Phil Simon interview on DataQualityPro about why new systems fail.

Common Enterprise wide Data Governance Issues – #12. No Enterprise wide Data Dictionary.

This post is one of a series dealing with common Enterprise Wide Data Governance Issues.    Assess the status of this issue in your Enterprise by clicking here:  Data Governance Issue Assessment Process

Anyone know what this acronym means?

An excellent series of blog posts from Phil Wright (Balanced approach to scoring data quality) prompted me to restart this series.  Phil tells us that in his organisation, “a large amount of time and effort has been applied to ensure that the business community has a definitive business glossary, containing all the terminology and business rules that they use within their reporting and business processes. This has been published, and highly praised, throughout the organisation.” I wish other organisations were like Phil’s.

Not only do some organisations lack “a definitive business glossary” as Phil describes above, complete with business rules – some organisations have no Enterprise wide Data Dictionary at all.  Worse still, there is no appreciation within senior management of the need for an Enterprise wide Data Dictionary (and therefore no budget to develop one).

Impact(s):

  • No business definition, or contradictory business definitions, of the intended content of critical fields.
  • Over-dependence on a small number of staff with detailed knowledge of particular databases.
  • Incorrect or non-ideal sources of required data are identified, because the source is determined by personnel with expertise in specific systems only.
  • New projects that depend on existing data are left ‘flying blind’.  The impact is similar to landing in a foreign city with no map and without speaking the language.
  • Repeated re-invention of the wheel and duplication of work, with the associated costs.

Solution:

CIO to define and implement the following Policy (in addition to the policies listed for Data Governance Issue #10):

  • An Enterprise wide Data Dictionary will be developed covering critical Enterprise wide data, in accordance with industry best practice.
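
To make the policy concrete, here is a minimal sketch of what a single dictionary entry might capture.  This is an illustrative assumption in Python, not a prescribed standard – the attribute names and example values are all hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class DataDictionaryEntry:
    """One entry in an Enterprise wide Data Dictionary (illustrative sketch)."""
    field_name: str       # physical name, e.g. "CUST_DOB"
    business_name: str    # agreed business term
    definition: str       # the single, agreed business definition
    source_system: str    # authoritative source of the data
    owner: str            # accountable data owner / steward
    business_rules: list = field(default_factory=list)  # validation rules

# Example entry - all values are hypothetical
dob_entry = DataDictionaryEntry(
    field_name="CUST_DOB",
    business_name="Customer Date of Birth",
    definition="The customer's date of birth as evidenced by identity documents.",
    source_system="Core Banking System",
    owner="Head of Retail Banking Operations",
    business_rules=["Must not be in the future", "Customer age must be under 120"],
)
```

The key point is that every critical field gets one agreed definition, one authoritative source, and one accountable owner, recorded somewhere the whole Enterprise can find it.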

Does your organisation have an “Enterprise wide Data Dictionary” – if so, how did you achieve it?  If not, how do new projects that depend on existing data begin the process of locating that data?  Please share your experience.

Plug and Play Data – The future for Data Quality

The excellent IAIDQ World Quality Day webinar looked at what the Data Quality landscape might be like in five years’ time, in 2014.  This got me thinking.  Dylan Jones’ excellent article on The perils of procrastination made me think some more…

Plug and Play Data

I believe that we data quality professionals need a paradigm shift in the way we think about data.  We need to make “Get data right first time” and “Data Quality by Design” such no-brainers that procrastination is not an option.  We need to promote a vision of the future in which all data is reusable and interchangeable – a world of “Plug and Play Data”.

Everybody, even senior business management, understands the concepts of “plug and play” devices and reusable play blocks.  For “plug and play” to succeed, interconnecting parts must be complete and fully moulded, and must conform to clearly defined standards.  Hence “plug and play data” must be complete and fully populated, and must conform to clearly defined standards (business rules).

How can organisations “get it right first time” and create “plug and play data”?
It is now relatively simple to invoke cloud-based verification from any part of a system through which data enters.

For example, when opening a new “Student” bank account, cloud-based verification might prompt the bank assistant with a message like: “Mr. Jones’ date of birth suggests he is 48 years old.  Is his date of birth correct?  Is a ‘Student Account’ appropriate for Mr. Jones?”
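
As a sketch of the kind of rule such a verification service might apply at the point of entry – the age threshold, function name, and dates below are illustrative assumptions, not any particular vendor’s API:

```python
from datetime import date

# Illustrative threshold - a real rule would come from the agreed business rules
STUDENT_MAX_AGE = 30

def check_student_account(date_of_birth: date, today: date) -> list:
    """Return prompts for the bank assistant if the entered data looks questionable."""
    # Compute age, adjusting if the birthday has not yet occurred this year
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    prompts = []
    if age > STUDENT_MAX_AGE:
        prompts.append(
            f"Date of birth suggests the customer is {age} years old. "
            "Is the date of birth correct? Is a 'Student Account' appropriate?"
        )
    return prompts

# Mr. Jones, born in 1961, opening a Student account in late 2009
print(check_student_account(date(1961, 5, 14), date(2009, 11, 20)))
```

The check does not block the transaction; it prompts a human to confirm the data at the moment it is cheapest to correct – the point of entry.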

In conclusion:

We Data Quality Professionals need to educate both Business and IT on the need for, and the benefits of, “plug and play data”.  We need to explain to senior management that data is no longer needed or used by only one application.  We need to explain that even tactical solutions within Lines of Business must consider Enterprise demands for data, such as:

  1. Data feed into regulatory systems (e.g. Anti Money Laundering, Basel II, Solvency II)
  2. Access from or data feed into CRM system
  3. Access from or data feed into Business Intelligence system
  4. Ad hoc provision of data to satisfy regulatory requests
  5. Increasingly – feeds to and from other organisations in the supply chain
  6. Ultimate replacement of the application with a newer-generation system

We must educate the business on the increasingly dynamic information requirements of the Enterprise – which can only be satisfied by getting data “right first time” and by creating “plug and play data” that can be easily reused and interconnected.

What do you think?

Common Enterprise wide Data Governance Issues #11: No ownership of Cross Business Unit business rules

This post is one of a series dealing with common Enterprise Wide Data Governance Issues.  Assess the status of this issue in your Enterprise by clicking here:  Data Governance Issue Assessment Process

I'm right, he's wrong!

Different Business Units sometimes use different business rules to perform the same task.

Within retail banking, for example, Business Unit A might use “Account Type” to distinguish personal accounts from business accounts, while Business Unit B might use “Account Fee Rate”.
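
A minimal sketch of how such a discrepancy plays out – the field names, codes, and fee-rate threshold below are illustrative assumptions:

```python
# Two business units classifying the SAME account differently

account = {"account_type": "PERSONAL", "fee_rate": 0.75}

def is_business_account_unit_a(acct: dict) -> bool:
    # Business Unit A: rely on the Account Type code
    return acct["account_type"] == "BUSINESS"

def is_business_account_unit_b(acct: dict) -> bool:
    # Business Unit B: infer from the Account Fee Rate
    # (assume business accounts pay a fee rate above 0.5)
    return acct["fee_rate"] > 0.5

print(is_business_account_unit_a(account))  # False - treated as personal
print(is_business_account_unit_b(account))  # True  - treated as business
```

The same account is personal under one rule and business under the other; the impacts below all flow from exactly this kind of disagreement.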


Impact(s) can include:

  1. Undercharging of Business Accounts mistakenly identified as Personal Accounts, resulting in loss of revenue.
  2. Overcharging of Personal Accounts mistakenly identified as Business Accounts, which could lead to a fine or other sanctions from the Financial Regulator.
  3. Anti Money Laundering (AML) system generates false alerts on Business Accounts mistakenly identified as Personal Accounts.
  4. AML system fails to generate alert on suspicious activity (e.g. large cash lodgements) on a personal account misidentified as a Business Account, which could lead to a regulatory fine.
  5. Projects dependent on existing data (e.g. AML, CRM, BI) discover that the business rules they require are inconsistent.

Solution:
Agree and implement the following Policy (in addition to the policies listed for Data Governance Issue #10):

  • Responsibility for resolving cross business unit business rule discrepancies lies with the Enterprise Data Architect.

For further details on Business rules – see Business Rules Case Study.

Your experience:
Have you faced a situation in which different business units use different business rules?   Please share your experience by posting a comment – Thank you – Ken.

My interview with Dylan Jones

Dylan Jones of DataQualityPro interviews me about the process I use to assess common Enterprise wide data issues. Use this process to assess the status of data governance within your organisation or that of a client.

Data Quality Pro interview with Ken O'Connor Data Consultant

Russian Gas Pipe and Data Governance

As you know, Russia supplies gas to many European countries.

Do you know what’s in your critical data pipelines?

Could you imagine Italy purchasing gas from Russia without checking what exactly was flowing through the pipe?  I’m no expert on gas pipelines, but I know that before completing the agreement to purchase the gas, Italy and Russia would have agreed metrics such as:

  • Volume of Gas
  • Calorific value (Energy content)
  • etc.

So what? What else would one expect?  Applied common sense… yes?

Why is it that such common sense is often lacking in Data Migration and Data Population projects?  Why do some Enterprises continue to perform data population of, and ongoing data entry to, critical data repositories without fully understanding the data they are pumping into the repository?

A simple example involves Date of Birth.  The business ask the IT function to populate Date of Birth in the new AML / Basel II / CRM / other repository.  Some time later, when data population is complete, the business begin to express concerns:

  • “We never realised we had so many customers aged over 100???”
  • “I thought we had more Student customers.”
  • “How come so many of our customers share the same birthday?”
  • “These are not the results we expected.”
  • etc.

Performing data population on the basis of what the source data “should contain”, without analysing what it actually does contain, is known as the ‘Load and Explode’ approach to Data Population.  I cover this Enterprise Wide Data Issue in more detail here.
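
The antidote is to profile the source data before loading it.  Here is a minimal sketch of the kind of Date of Birth profiling that would surface the surprises above – the field names and sample records are illustrative assumptions:

```python
from collections import Counter
from datetime import date

# Illustrative source extract - in practice this comes from the source system
source_records = [
    {"customer_id": 1, "date_of_birth": date(1900, 1, 1)},  # suspicious default?
    {"customer_id": 2, "date_of_birth": date(1900, 1, 1)},
    {"customer_id": 3, "date_of_birth": date(1985, 6, 30)},
]

def profile_dates_of_birth(records, today=date(2009, 11, 20)):
    """Profile Date of Birth BEFORE loading, instead of 'Load and Explode'."""
    ages = [today.year - r["date_of_birth"].year for r in records]
    over_100 = sum(1 for age in ages if age > 100)
    most_common_dob, count = Counter(
        r["date_of_birth"] for r in records
    ).most_common(1)[0]
    print(f"Customers aged over 100: {over_100} of {len(records)}")
    print(f"Most common date of birth: {most_common_dob} ({count} customers)")

profile_dates_of_birth(source_records)
```

Run against the real source extract, a profile like this turns “we never realised” conversations into decisions made before the load, not after it.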

We in the “Data Governance” and “Data Quality” industry need to educate the business community on the “common sense” parts of data governance, and on the need to engage Data Governance Professionals to ensure that “Data Quality Common Sense” is actually applied.

Feedback welcome – Ken