Opportunity to apply lessons learnt in my new job

This week I started a new job as Head of Customer Information at Bank of Ireland in Dublin. I am excited at the prospect of applying the lessons I have learnt for the benefit of our customers.

I would like to take this opportunity to thank my fellow data management professionals worldwide for generously sharing their experience with me. I started to write this blog in 2009. My objective was to “Share my experience and seek to learn from the experience of others”. I have certainly learnt from the experience of others, and I hope to continue to do so.

The opinions I express on this blog will continue to be my own. I look forward to continuing to hear yours.

The Queen’s Speech and Data Governance

Queen Elizabeth II made a historic and welcome visit to Ireland in 2011. She delivered a memorable speech at the Irish State banquet, in which she said “With the benefit of historical hindsight, we can all see things which we wish had been done differently, or not at all”.

In real life, we cannot change the past.  The same does not apply to data created in the past.  Regulators now expect financial institutions to:

  • Identify data quality mistakes made in the past
  • Correct material mistakes
  • Implement data governance controls to prevent recurrences

I quote from the UK Financial Regulator’s requirement that all deposit holding financial institutions deliver a single customer view (SCV) of deposit holders:  “There may be a number of reasons why SCV data is not 100% accurate. This might be due to defects in the systems used to compile the SCV, but we would expect such defects to be picked up and rectified during the course of the systems’ development.”

Dodd-Frank, Solvency II, FATCA, BASEL III and many more regulations impose similar requirements. Use this checklist to see whether your organisation suffers from any common Enterprise-Wide Data Governance Issues.

What data quality mistakes have you uncovered from the past, and how have you corrected them? I’d love to hear about them.

What is your undertaking-wide common understanding of data quality?

Do you have an undertaking-wide common understanding of data quality?  If not – I suggest you read on…

When a serious “data” problem arises in your organisation, how is it discussed? (By “serious”, I mean a data problem that has cost, or could cost, so much money that it has come to the attention of the board).

What Data Quality KPIs does your board request or receive to enable board members to understand problems with the quality of the data? What data quality controls does your board expect to be in place to ensure that critical data is complete, appropriate and accurate?

If your board has delegated authority to a data governance committee, what is the data governance committee’s understanding of “Data Quality”?  Is it shared across your organisation?  Do you all speak the same language, and use the same terminology when discussing “Data Quality”?  In brief – are you all singing from the same “Data Quality Hymn Sheet”?

Why do I ask?

Solvency II – What is your undertaking-wide common understanding of Data Quality?

For the first time, a regulator has stated that organisations must have an “undertaking-wide common understanding of data quality”.

Solvency II requires insurance organisations to demonstrate that the data underpinning their solvency calculations is as complete, appropriate and accurate as possible. The guidance from the regulator goes further than that.

CP 56, paragraph 5.178 states:  “Based on the criteria of “accuracy”, “completeness” and “appropriateness”… the undertaking shall further specify its own concept of data quality.  Provided that undertaking-wide there is a common understanding of data quality, the undertaking shall also define the abstract concept of data quality in relation to the various types of data in use… The undertaking shall eventually assign to the different data sets specific qualitative and/or quantitative criteria which, if satisfied, qualify them for use in the internal model.”

Business Requirements should be clear, measurable and testable. Unfortunately, the SII regulator uses complex language that makes SII Data Quality Management and Governance requirements woolly, ambiguous and open to interpretation. My interpretation of the guidance is that the regulator will expect you to demonstrate your “undertaking-wide common understanding of data quality”.

What might a common understanding of data quality look like?

Within the Data Quality industry, commonly used dimensions of data quality include:

  • Completeness
    Is the data populated?
  • Validity
    Is the data within the permitted range of values?
  • Accuracy
    Does the data represent reality or a verifiable source?
  • Consistency
    Is the same data consistent across different files/tables?
  • Timeliness
    Is the data available when needed?
  • Accessibility
    Is the data easily accessible, understandable and usable?
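As an illustration, several of these dimensions lend themselves to simple automated checks. The Python sketch below measures completeness and validity as simple ratios; the records, field names and permitted values are purely hypothetical.

```python
from datetime import date

# Hypothetical customer records; the field names are illustrative only.
records = [
    {"customer_id": "C001", "date_of_birth": date(1970, 5, 1), "country": "IE"},
    {"customer_id": "C002", "date_of_birth": None, "country": "XX"},
]

VALID_COUNTRIES = {"IE", "GB", "FR"}  # the permitted range of values

def completeness(records, field):
    """Completeness: the share of records in which the field is populated."""
    populated = sum(1 for r in records if r.get(field) is not None)
    return populated / len(records)

def validity(records, field, permitted):
    """Validity: the share of populated values within the permitted range."""
    values = [r[field] for r in records if r.get(field) is not None]
    return sum(1 for v in values if v in permitted) / len(values)

print(completeness(records, "date_of_birth"))  # 0.5
print(validity(records, "country", VALID_COUNTRIES))  # 0.5
```

Measures as simple as these, agreed and shared across the undertaking, are one way of giving concrete meaning to the abstract dimensions above.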

Little did I know at the time I wrote the above blog post that a regulator would soon require organisations to demonstrate their understanding of data quality, and demonstrate that it is shared “undertaking wide”.

How might you demonstrate that your understanding of data quality is “undertaking-wide” and “common”?

You could demonstrate that multiple “data dependent” processes have a shared understanding of data quality (processes such as CRM, Anti Money Laundering, Anti Fraud, Single View of Customer etc.)

In the UK, the Pensions Regulator (tPR) has issued record-keeping requirements which require pension companies to measure and manage the quality of their schemes’ data. I believe the Solvency II “independent third party” will at least expect to see a common understanding of data quality shared between Solvency II and tPR programmes.

What do you think? Please share…

Charter of Data Consumer rights and responsibilities

Time for a charter of Data Consumer rights and responsibilities

There are many rights enshrined in law that benefit all of us. One example is the UN’s Universal Declaration of Human Rights. Another example is the “Consumer Rights” protection most countries enforce to guarantee us, the buying public, the right to expect goods and services that are of good quality and “fit for purpose”. As buyers of goods and services, we also have responsibilities. If you or I buy a “Rolex watch” for $10 from a casual street vendor, we cannot claim consumer protection rights if the watch stops working within a week. “Let the buyer beware”, or “Caveat Emptor”, is the common-sense responsibility that we, as consumers, must observe.

I have previously written about business users’ right to expect good data plumbing. Business users (of data) have responsibilities also. I believe it’s time to agree a charter of rights and responsibilities for them. Business users of data are “Data Consumers” – people who use data to perform their work, whatever that work may be. Data Consumers make decisions based on the data or information available to them. Examples range from a doctor prescribing medication based on the information in a patient’s health records, to a multinational chief executive deciding to buy a business based on the performance figures available, to an actuary developing an internal model to determine Solvency II Capital Requirements.

What rights and responsibilities should data consumers have?

Here’s my starter set:

  • The right to expect data that is “fit for purpose”, data that is complete, appropriate and accurate.
  • The responsibility to define what “fit for purpose” data means to them.
  • The right to expect guidance and assistance in defining what constitutes complete, appropriate and accurate data for them.
  • The responsibility to explain the impact that “sub-standard” data would have on the work they do.
  • The right to be informed of the actual quality of the data they use.
  • The right to expect controls in place that verify the quality of the data they use meets the standard they require.

What do you think? Please share your suggestions.

How to deal with Gobbledygook requirements

In my last post I had a bit of a rant about the Gobbledygook “eligibility requirements” provided by the UK Financial Services Compensation Scheme.

The reality is that business requirements can come from many places; they are often vague, and often overly complex. They are often imposed on you from outside, as in the case of regulatory requirements like the UK regulatory requirement to deliver a Single Customer View.

So… life can be tough – you have to get on with it, and deal with “less than perfect” requirements.

Well defined requirements are clear, measurable and testable.  You cannot expect business experts to be expert in defining requirements. As a Data Quality professional, one of your roles is to work with business experts to clarify and simplify their requirements.

Let us look at the “eligibility requirements” provided by the UK Financial Services Compensation Scheme.

In essence, some customer types are eligible for compensation, while others are not.  You must use your “parsing skills” to parse the overly complex rules – to “sort the apples from the oranges” so to speak.   Start by listing unique customer types, which include:

  • Sole Trader
  • Credit Union
  • Collective Investment Scheme
  • Trustee of a Collective Investment Scheme
  • Operator of a Collective Investment Scheme

Having done that, you can begin the task of finding out whether you can currently identify these customer types within your data.
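To make that exercise concrete, here is a minimal Python sketch of the kind of lookup you might build once the customer types are listed. The internal codes are entirely hypothetical; the point is simply that every internal customer-type code should map to one of the parsed customer types, and any unmapped code should surface for investigation.

```python
# Hypothetical mapping from internal customer-type codes to the customer
# types parsed from the FSCS eligibility rules. The codes are illustrative.
CUSTOMER_TYPE_BY_CODE = {
    "ST": "Sole Trader",
    "CU": "Credit Union",
    "CIS": "Collective Investment Scheme",
    "CIS-T": "Trustee of a Collective Investment Scheme",
    "CIS-O": "Operator of a Collective Investment Scheme",
}

def classify_customer(internal_code):
    """Translate an internal code to a customer type, flagging gaps."""
    return CUSTOMER_TYPE_BY_CODE.get(internal_code, "UNMAPPED - investigate")

print(classify_customer("ST"))   # Sole Trader
print(classify_customer("XYZ"))  # UNMAPPED - investigate
```

Running every internal code in your customer data through such a mapping quickly shows where your data can, and cannot, currently identify the customer types the rules demand.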

The above is just a starting point, I hope it helps.

Feedback welcome, as always.

Eligibility Business Rule – Gobbledygook!

About this time last year, I started a discussion about business rules.

Recent experience has prompted me to re-open the discussion, as UK deposit takers seek to address a regulatory compliance requirement, namely to deliver a Single Customer View.

In the UK, if a bank fails (goes bust), eligible deposit holders are guaranteed their money back, up to a limit of £50,000. The term “eligible” is critical. The UK Financial Services Compensation Scheme (UK FSCS) requires UK deposit takers to build a Single Customer View (SCV) that identifies “eligible” deposit holders and calculates their compensation entitlement.
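Stripped of its complexities, the entitlement calculation itself is simple: aggregate an eligible depositor’s balances and cap the total at the limit. A minimal sketch in Python, which deliberately ignores complications such as joint accounts and offsetting:

```python
COMPENSATION_LIMIT_GBP = 50_000  # the limit in force at the time of writing

def compensation_entitlement(balances):
    """Sum an eligible depositor's aggregated balances, capped at the limit."""
    return min(sum(balances), COMPENSATION_LIMIT_GBP)

print(compensation_entitlement([30_000, 15_000]))  # 45000
print(compensation_entitlement([40_000, 25_000]))  # 50000
```

The hard part, as the rest of this post shows, is not the arithmetic – it is determining who counts as “eligible” in the first place.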

UK FSCS Single Customer View Eligibility Rules

Deposit Takers must “flag eligible accounts”, and provide “An explanation of any code or keys used internally by the deposit taker, so that the FSCS can easily identify which accounts are held by eligible claimants”.

The above sounds reasonable… So, what is the UK FSCS business rule for determining “eligibility”?

In layman’s terms, personal customers (individuals) and small businesses are eligible for compensation.
I hope you’re still with me – because it gets crazy from here, as we attempt to find out the exact rules for eligibility.

I include a screenshot of just some of the rules as they were on Sep 6th 2010 (click to enlarge and read).  Alternatively you may view the rules as they are today at: http://fsahandbook.info/FSA/html/handbook/COMP/4/2

I’ve worked in Data Management for almost 30 years, and I have seldom seen such gobbledygook.  Hundreds of deposit taking firms are subject to this regulation, which they must implement by January 2011. Each one must wade through this gobbledygook, seeking to extract clear, measurable and testable business rules, capable of being implemented. This is an example of bureaucracy gone mad.

What do you think?

Solvency II mandates Data Governance

Welcome to part 3 of Solvency II Standards for Data Quality – common sense standards for all businesses.

Regardless of the industry you work in, you make critical business decisions based on the information available to you. You would like to believe the information is accurate. I suggest the CEIOPS standards for “Accuracy” apply to your business, and your industry, just as much as they apply to the insurance industry. I would welcome your feedback…

The CEIOPS (now renamed EIOPA) advice makes it clear that Solvency II requires you to have Data Governance in place (which CEIOPS / EIOPA refers to as “internal systems and procedures”).   The following sections of the document make this clear:

3.32 In order to ensure on a continuous basis a sufficient quality of the data used in the valuation of technical provisions, the undertaking should have in place internal systems and procedures covering the following areas:

• Data quality management;

• Internal processes on the identification, collection, and processing of data; and

• The role of internal/external auditors and the actuarial function.

3.1.4.1 Data quality management – Internal processes

3.33 Data quality management is a continuous process that should comprise the following steps:

a) Definition of the data;

b) Assessment of the quality of data;

c) Resolution of the material problems identified;

d) Monitoring data quality.
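The four steps in 3.33 describe a repeatable cycle, which can be sketched in a few lines of Python. This is purely illustrative: the field names and the “quarantine the failing records” resolution strategy are my own assumptions, not anything mandated by CEIOPS.

```python
# Illustrative sketch of the four-step data quality cycle in 3.33.
REQUIRED_FIELDS = ["policy_id", "premium"]   # a) definition of the data

def assess(records):
    """b) Assessment: return the records that fail the definitions."""
    return [r for r in records if any(r.get(f) is None for f in REQUIRED_FIELDS)]

def resolve(records, failing):
    """c) Resolution: here, simply quarantine the failing records."""
    return [r for r in records if r not in failing]

def monitor(records):
    """d) Monitoring: a recurring re-assessment, e.g. run on each data load."""
    return len(assess(records))

records = [
    {"policy_id": "P1", "premium": 100.0},
    {"policy_id": "P2", "premium": None},
]
failing = assess(records)
clean = resolve(records, failing)
print(monitor(clean))  # 0
```

In a real undertaking each step would of course be far richer – materiality thresholds, remediation workflows, audit trails – but the cycle itself is exactly this shape.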

I will explore the above further in my next post.  Meanwhile, what Data Quality Management processes do you have in place?  Do you suffer from common Enterprise-Wide Data Governance Issues?

How to deliver a Single Customer View

How to cost effectively deliver a Single Customer View

Many have tried, and many have failed to deliver a “Single Customer View”.  Well now it’s a regulatory requirement – at least for UK Deposit Takers (Banks, Building Societies, etc.).

The requirement to deliver a Single Customer View of eligible deposit holders indirectly affects every man, woman and child in the UK.  Their deposits, large or small, are covered by the UK Deposit Guarantee Scheme.  This scheme played a key role in maintaining confidence in the banking system during the dark days of the world financial crisis.

UK Deposit Takers must not only deliver the required Single Customer View data, they must provide clear evidence of the data quality processes and controls they use to deliver and verify the SCV data.

The deadline for compliance is challenging.  Plans must be submitted to the regulator by July 2010, and the SCV must be built and verified by Jan 2011.

To help UK Deposit Takers, I have written an E-book explaining how to cost-effectively deliver a Single Customer View. You may download it free from the Dataqualitypro website.

While the document specifically addresses the UK Financial Services Requirement for a Single Customer View, the process steps will help anyone planning a major data migration / data population project.

If you are in any doubt about the need for good data quality management processes to deliver any new system (e.g. Single Customer View, Solvency II, etc.), read the excellent Phil Simon interview on Dataqualitypro about why new systems fail.

Show me your Data Quality

Recently Marty Moseley explored how much data governance you should have, in a thought-provoking post called “How Taxing is Your Data Governance?”.

I added the following comment: “I agree with you – lightweight governance is the way to go – as you say, a ‘just formalized enough’ Data Governance framework, creating ‘good enough’ deliverables to record your decisions, alternatives, precedents, drivers, policies, procedures/processes, business rules, enforcements and metrics – and find them later when you invariably need to make changes – OR WHEN THE REGULATOR asks to see your audit trail.”

The fact is that regulators increasingly require evidence of the quality of the underlying data on which you base your regulatory submissions – not just evidence of your data quality management processes, i.e. your Data Governance.

The law now requires evidence of your data quality

For example, in the UK, Deposit Takers are required to build a Single Customer View (SCV) of depositors (UK FSA Single Customer View requirement). They must not only build an SCV; the regulation states “We… would need to verify that the SCV information being collated and stored by a deposit taker was adequate and fit for purpose”. Consultation Paper CP 09/3 about the SCV states “As part of their data cleansing, deposit-taking firms would be required to ensure the existence, completeness and accuracy of all data required for each depositor to facilitate fast payout”.
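What might such verification evidence look like in practice? One simple form is a per-check pass count across the SCV extract. The Python sketch below is illustrative only – the field names and the three checks are my assumptions, not the FSA’s actual field list.

```python
# Sketch of the kind of verification evidence a regulator might expect:
# pass counts per check over the SCV extract. Field names are hypothetical.
def scv_verification_report(records):
    checks = {
        "existence": lambda r: r.get("account_number") is not None,
        "completeness": lambda r: all(r.get(f) for f in ("name", "address")),
        "accuracy": lambda r: isinstance(r.get("balance"), (int, float))
                              and r["balance"] >= 0,
    }
    return {name: sum(1 for r in records if check(r))
            for name, check in checks.items()}

records = [
    {"account_number": "A1", "name": "Ann", "address": "Dublin", "balance": 120.0},
    {"account_number": None, "name": "Bob", "address": "", "balance": -5},
]
print(scv_verification_report(records))
# {'existence': 1, 'completeness': 1, 'accuracy': 1}
```

A report like this, produced on every extract and retained, is exactly the sort of audit trail that turns “we cleansed the data” from an assertion into evidence.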

A second example affects Insurance companies providing insurance in Europe (this also affects US Insurers operating in Europe).  EU Directive 2009/138/EC (Solvency II) – Article 82 states: “Member States shall ensure that insurance and reinsurance undertakings have internal processes and procedures in place to ensure the appropriateness, completeness and accuracy of the data used in the calculation of their technical provisions.” In other words – the regulator in each member state is required to review and approve the Data Governance processes of insurers.

Failure to comply with Regulatory requirements can prove expensive. Ask Standard Life.  It cost them over £100 million.  I presented details of the Standard Life case at the Information and Data Quality Seminar Series 2010 – Dublin. (For more details, see slides 11 and 12 in my presentation “Achieving Regulatory Compliance – The devil is in the data“.)

Shortly, working together with DataqualityPro, I will publish an E-Book on how to cost-effectively satisfy the UK FSA Single Customer View requirement. UK Deposit Takers must submit their SCV implementation plans to the FSA by the end of July 2010, and must complete their SCV by Jan 2011. Time is running short.

Are you aware of other laws requiring evidence of Data Governance and Data Quality? If so, please share them.

Common Enterprise wide Data Governance Issues – #14. No Enterprise wide Data Model

I was reading David Loshin’s excellent post How Do You Know What Data is Master Data? and I thought “I know – I’ve covered that question in my blog” – but I hadn’t.  So here it is.

Your “Enterprise Wide Data Model” tells you what data is Master Data.

Unfortunately, most organisations lack an Enterprise Wide Data Model. Worse still, there is often little appreciation among senior management of the need for an Enterprise wide Data Model.

Impact:
The absence of an Enterprise-wide Data Model makes it difficult for even technical experts to locate data. The data model would distinguish between Master Data and replicas, and would clarify whether the data in the model is currently in place, or planned for. Without an Enterprise-wide Data Model, data-dependent projects (e.g. BASEL II, Anti Money Laundering, Solvency II) must locate data (especially Master Data) from first principles, and face the risk of not finding the data, or of identifying inappropriate sources. New projects dependent on existing data take longer than necessary to complete, and face serious risk of failure.

Solution:
The CIO should define and implement the following Data policy:

An Enterprise wide Data Model will be developed covering critical Enterprise wide data, in accordance with industry best practice.

Time to sing from the same hymn sheet

One notable exception to the norm:
This is not a plug for IBM…. merely an observation based on my experience.

I worked in an IBM development lab in Dublin during the 1990s. At that time IBM developed a “Financial Services Data Model” (FSDM). Dublin was IBM’s “FSDM centre of excellence”. BASEL II turned FSDM into an “overnight success” – ten years after it was developed. Organisations that had adopted IBM’s FSDM found it relatively easy to locate the data required by their BASEL II compliance programme.

I foresee a future in which all financial services organisations, including the Financial Regulator(s), will use the same data model. “Singing from the same hymn sheet” will make communication far simpler, and less open to misinterpretation.

The lack of an Enterprise Wide Data Model is just one of the many data governance issues that affect organisations today.  Assess the status of this issue in your Enterprise by clicking here:  Data Governance Issue Assessment Process

Does your organisation have an “Enterprise wide Data Model” – if so, how did you achieve it?  Did you build it from scratch, or start with a vendor supplied model? Please share your experience.