Data is the new oil – what grade is yours?

Bill Bryson’s book “One Summer: America 1927” provides a fascinating insight into the world of aviation in the Roaring Twenties. Aviators were vying to be the first to cross the Atlantic from New York to Paris, a challenge that claimed many lives, most of them European.

Bryson tells us: “The American flyers also had an advantage over their European counterparts that nobody yet understood. They all used aviation fuel from California, which burned more cleanly and gave better mileage. No one knew what made it superior because no one yet understood octane ratings – that would not come until the 1930s – but it was what got most American planes across the ocean while others were lost at sea.”

Once octane ratings were understood, fuel quality was measured and lives were saved.

We’ve all heard that data is the new oil. To benefit from this “new oil”, you must ensure you use “top grade” only. It can make the difference between business success and failure. It is also a prerequisite for regulatory compliance (Solvency II, FATCA, Dodd-Frank, Basel III, BCBS 239, etc.). Thankfully, like octane ratings, we know how to measure data quality using six primary dimensions: completeness, validity, accuracy, uniqueness, timeliness and consistency. For more details see my post: Major step forward in Data Quality Measurement.
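To make the dimensions concrete, here is a minimal sketch (my own illustration, with invented field names and records, not from the post above) of how two of them, completeness and uniqueness, might be measured over a small customer dataset:

```python
# Hypothetical example: measuring two of the six data quality dimensions
# (completeness and uniqueness) over a toy customer dataset.
records = [
    {"customer_id": "C001", "email": "anne@example.com"},
    {"customer_id": "C002", "email": None},               # missing email
    {"customer_id": "C003", "email": "carl@example.com"},
    {"customer_id": "C003", "email": "carl@example.com"}, # duplicate id
]

def completeness(rows, field):
    """Share of rows in which the field is populated."""
    return sum(1 for r in rows if r.get(field) not in (None, "")) / len(rows)

def uniqueness(rows, field):
    """Share of populated values that are distinct."""
    values = [r[field] for r in rows if r.get(field) not in (None, "")]
    return len(set(values)) / len(values)

print(f"email completeness: {completeness(records, 'email'):.0%}")        # 75%
print(f"customer_id uniqueness: {uniqueness(records, 'customer_id'):.0%}") # 75%
```

The other four dimensions (validity, accuracy, timeliness, consistency) can be scored in the same ratio-style fashion once the relevant business rules are known.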

I also explore this topic in my post Russian Gas Pipe and Data Governance.

What happens in your organisation? Do you measure the quality of your most critical data, or do you fly on a wing and a prayer? Please add your comments below.

Know your data

You must know your data.

Do you know what’s in your data box of chocolates?

You must know where it is, what it should contain and what it actually contains.

When your data does not contain what it should, you must have a process for correcting it.

CEOs, CFOs and CROs often take the above as “given”.  They make business-critical decisions using information derived from data within their organisation.  After all, it’s applied common sense.

For the insurance industry, Solvency II requires evidence that you are applying common sense.

If you operate in the EU market or process the personal data of EU data subjects, you must comply with the EU General Data Protection Regulation (GDPR) or face severe fines. To comply, you must “know your (personal) data” and how you manage it.

In my experience, data is like a box of chocolates: “You never know what you’re gonna get.”

Do you know your data?

How to deliver a Single Customer View

How to cost effectively deliver a Single Customer View

Many have tried, and many have failed, to deliver a “Single Customer View”.  Well, now it’s a regulatory requirement – at least for UK Deposit Takers (Banks, Building Societies, etc.).

The requirement to deliver a Single Customer View of eligible deposit holders indirectly affects every man, woman and child in the UK.  Their deposits, large or small, are covered by the UK Deposit Guarantee Scheme.  This scheme played a key role in maintaining confidence in the banking system during the dark days of the world financial crisis.

UK Deposit Takers must not only deliver the required Single Customer View data, but must also provide clear evidence of the data quality processes and controls they use to deliver and verify the SCV data.

The deadline for compliance is challenging.  Plans must be submitted to the regulator by July 2010, and the SCV must be built and verified by Jan 2011.

To help UK Deposit Takers, I have written an E-book explaining how to cost effectively deliver a Single Customer View.  You may download it free from the Dataqualitypro website.

While the document specifically addresses the UK Financial Services Requirement for a Single Customer View, the process steps will help anyone planning a major data migration / data population project.

If you are in any doubt about the need for good data quality management processes to deliver any new system (e.g. Single Customer View, Solvency II, etc.), read the excellent Phil Simon interview on Dataqualitypro about why new systems fail.

My interview with Dylan Jones

Dylan Jones of DataQualityPro interviews me about the process I use to assess common Enterprise wide data issues. Use this process to assess the status of data governance within your organisation or that of a client.

Data Quality Pro interview with Ken O'Connor Data Consultant

Russian Gas Pipe and Data Governance

As you know, Russia supplies gas to many European countries.

Do you know what’s in your critical data pipelines?

Could you imagine Italy purchasing gas from Russia without checking what exactly was flowing through the pipe?  I’m no expert on gas pipelines, but I know that before completing the agreement to purchase the gas, Italy and Russia would have agreed on metrics such as:

  • Volume of Gas
  • Calorific value (Energy content)
  • etc.

So what? What else would one expect?  Applied common sense… yes?

Why is it that such common sense is often lacking in Data Migration and Data Population projects?  Why do some Enterprises continue to perform data population of, and ongoing data entry to, critical data repositories without fully understanding the data they are pumping into the repository?

A simple example involves Date of Birth.  The business ask the IT function to populate Date of Birth in the new AML / BASEL II / CRM / other repository. Some time later, when data population is complete, the business begin to express concerns:

  • “We never realised we had so many customers aged over 100?!”
  • “I thought we had more Student customers”
  • “How come so many of our customers share the same birthday?”
  • “These are not the results we expected”
  • etc.

Performing data population on the basis of what the source data “should contain”, without analysing what it actually does contain, is known as the ‘Load and Explode’ approach to Data Population.  I cover this Enterprise Wide Data Issue in more detail here.
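A little profiling before loading avoids these surprises. The sketch below is my own illustration (the sample dates and profiling date are invented) of how the “customers aged over 100” and “shared birthday” symptoms could be detected in the source data before population:

```python
# Hypothetical sketch: profile a Date of Birth field *before* loading it,
# to avoid the 'Load and Explode' surprises described above.
from datetime import date
from collections import Counter

# Invented sample of source dates of birth (note the suspicious defaults).
dobs = [date(1990, 5, 17), date(1901, 1, 1), date(1901, 1, 1),
        date(1901, 1, 1), date(1985, 11, 3), date(1875, 6, 2)]

today = date(2010, 1, 1)  # assumed profiling date

# Symptom 1: implausibly old customers (often a system default date).
over_100 = sum(1 for d in dobs if (today - d).days / 365.25 > 100)

# Symptom 2: many customers sharing the same birthday.
most_common_dob, count = Counter(dobs).most_common(1)[0]

print(f"Customers apparently aged over 100: {over_100}")
print(f"Most common DOB: {most_common_dob} ({count} occurrences)")
```

Sharing a profile like this with the business before population turns “Load and Explode” into an informed conversation about data quality.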

We in the Data Governance and Data Quality industry need to educate the business community on the “common sense” parts of data governance, and on the need to engage Data Governance Professionals to ensure that “Data Quality Common Sense” is actually applied.

Feedback welcome – Ken

Business Rules Case Study – Part II

In part one of this case study,  I  discussed questions like:

  1. Why are Business Rules necessary?
  2. What exactly is a Business Rule?
  3. What should happen if the data fails a Business Rule?

I would like to thank the following people for contributing to the discussion to date:

Jim Harris @ocdqblog shared his experience on data migration and data integration projects, and concluded “Sadly, the most common problem was that no business rules were defined at all and the data would be blindly migrated or integrated without even at least some superficial validation checks.” more here.

In Henrik Liliendahl Sørensen’s @hlsdk experience, Business rules divide into External and Internal Business Rules:

  • “External rules that are defined outside your organisation – mostly laws and other regulations you must follow when doing business in a given country (or group of countries like the EU).
  • Internal rules that are defined by and for your business alone – made to make your business competitive.” more here.

Marianne Colwell @emx5 shared recent wins on the project she is currently working on, in which they have captured business rules in a requirements management repository, more here.

Phil Allen would like to know the most popular software choices for recording Business Rules, and what experiences people have had, more here.

In part two, I plan to explore the controls you need to manage Business Rules, and where to look for them.

I will continue to use a case study from an Anti Money Laundering (AML) programme.  However, in my experience, all data migration / data population projects face the same challenges.

What controls should you have in place to manage Business Rules?

In Sarbanes-Oxley (SOX) terms: “If it’s not written down, it doesn’t exist”.  In my experience, you need the following controls to manage business rules:

  1. Business owner (Business responsibility)
    There must be a defined business owner (business area) with responsibility for the data item, and for the business rule(s) relating to it.  The definition must include details of who to contact (the title of a person) with queries regarding the data.
  2. Location of Business rule(s) (Business responsibility)
    The Business owner must identify where the Master business rules are formally documented, and subject to Change Management.  The business owner must also identify where copies of the business rules are held, since they must all be updated when the master copy is updated.
  3. Change Management process for the Master business rules, and copies. (Business responsibility).
    The Business owner must have a documented Change Management process for updates to the Master business rules, and for ensuring that all copies of the business rules are also updated.
  4. Location of source data (Business accountability – Technical responsibility)
    The Business owner must satisfy him/herself that the providers of IT services to the business have a control process in place that identifies where the actual data is held (i.e. the physical location).  If there are a number of physical locations, they should all be recorded, together with details of which is the Master source, which is a replica, and details of the replication process.
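As an illustration of the four controls above, a business-rule register entry might record the owner, master location, copies and physical sources in one place. The structure and field values below are my own invention, not a prescribed standard:

```python
# Hypothetical sketch: a minimal business-rule register entry capturing the
# four controls above (owner, master location, change management, data sources).
from dataclasses import dataclass, field

@dataclass
class BusinessRuleEntry:
    rule_id: str
    data_item: str
    business_owner: str                                  # a job title, not a named person
    master_location: str                                 # where the master rule is documented
    copy_locations: list = field(default_factory=list)   # replicas to update on change
    source_systems: list = field(default_factory=list)   # physical locations of the data
    change_log: list = field(default_factory=list)       # change management trail

rule = BusinessRuleEntry(
    rule_id="BR-042",
    data_item="Account Type",
    business_owner="Head of Retail Products",
    master_location="Product Operations Manual, section 4.2",
    copy_locations=["CRM help screens", "public website"],
    source_systems=["Core banking system (master)", "Marketing data warehouse (replica)"],
)
rule.change_log.append("2010-03-01: initial definition")
```

Even a simple register like this makes the controls auditable: for any rule you can answer who owns it, where the master lives, and which copies must change with it.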

Where should you look for Business Rules (if your Enterprise has no Master Business Rules Repository)?

Too often, I have worked on data migration/population projects for which there was no master business rules repository.  We had to research the business rules from first principles. If you have to research business rules from first principles, I suggest you consider the following locations.

  • Business Operations Manuals
    Most organisations have some form of operations manuals – in hard or soft copy.  Business rules are commonly embedded in this documentation.  Be careful: they are often out of date.
  • Computer System prompt screens / help screens
    The possible/permitted values for a given field are often provided on help screens.
  • Internet sites belonging to the Enterprise
    Internal and external websites are a rich source of business rules.  They can hold product details, fee rates, etc.
    Unfortunately, they are too often out of sync with the Master Business Rules (wherever they are).
  • Data Warehouse(s) within Enterprise
    If you are lucky enough to have a single “Enterprise Data Warehouse”, this is the logical place to find business rules.  In my experience, many enterprises have a number of data bases (often in the Marketing area), at least one of which is referred to as a ‘data warehouse’.
  • Data Protection Area
    In most countries, customers may request details of the data held about them by an Enterprise.  Many Enterprises have a “Data Protection Area” to coordinate gathering the details held about the customer.  Often, the details held contain internal codes, which the Data Protection Area must ‘translate’ into something meaningful for the customer.  In my experience, the “Data Protection Area” translation process is a rich source of Business Rules.
  • Business Rules are often coded into application systems such as:
    • Anti Money Laundering (AML)
    • BASEL II
    • CRM
    • Single view of customer database

The above are all potential sources of Business Rules. However, they share a common characteristic – they are all typically ‘copies’ or replicas of the master business rules.  My experience suggests the following (I look forward to reading your feedback on this):

  • The ‘Master Copy’ should be the copy used by the production application system (e.g. to apply an interest rate or to calculate fees due).
    Rationale:
    – The production application system copy dictates the customer experience (e.g. interest rate charged or given).
    – Production ‘Master copies’ are already subject to ‘IT Change Management Processes’ that ensure all changes are authorised by the business, and tested prior to going live.
  • Unfortunately, many production ‘IT Change Management Processes’ do not attempt to identify ‘replica copies’ of the product information, and I believe this is a ‘Gap’ in the process.
  • I recommend that production ‘Change Management Processes’ should be extended as follows:
    • Replica copies of business rules must be identified, together with the business owners of the replica copies.
      (This can be a once-off process).
    • The Business area requesting and authorising a change must contact the business owner of each replica copy, and receive confirmation that the proposed change is understood and accepted.
    • The change to the production ‘Master Copy’ must be synchronised with the change to all ‘replica copies’.  For example, if the interest rate on a product is changed from 3% to 4%, the product information on the website must change at the same time as the rate (probably within 24 hours).
    • Copy ‘owners’ should also perform a periodic control, every 6 or 12 months, to verify that changes made to the ‘production master’ have been reflected in their replica copies.
      (The copy owners require a means of displaying both the master and copy details).
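The synchronisation and periodic-verification steps above amount to a reconciliation check between the master and each replica copy. A minimal sketch, with invented product names and rates:

```python
# Hypothetical sketch of the periodic control described above: compare the
# production 'master' rates against each replica copy and report any drift.
master = {"SuperSaver": 0.04, "EasyAccess": 0.02}

replicas = {
    "public website": {"SuperSaver": 0.03, "EasyAccess": 0.02},  # stale rate
    "CRM system":     {"SuperSaver": 0.04, "EasyAccess": 0.02},
}

def reconcile(master, replicas):
    """Return (copy name, product, master value, copy value) for each mismatch."""
    mismatches = []
    for copy_name, copy_rules in replicas.items():
        for product, master_value in master.items():
            if copy_rules.get(product) != master_value:
                mismatches.append((copy_name, product, master_value,
                                   copy_rules.get(product)))
    return mismatches

for copy_name, product, want, got in reconcile(master, replicas):
    print(f"{copy_name}: {product} shows {got}, master is {want}")
```

Run every 6 or 12 months, a check like this gives the copy owners the side-by-side view of master and copy details that the control requires.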

What has all of the above got to do with an AML programme?

My most recent encounter with researching business rules from first principles was on an AML programme.  An AML programme is an “End of food-chain” programme, as are most Data Migration and Data Population programmes like Euro Changeover, Basel II, CRM and Single View of Customer programmes.

End of food-chain programmes share the following characteristics:

  • They depend on pre-existing data
  • They have no control over the quality of existing data they depend on
  • They have no control over the data entry processes by which the data they require is captured.
  • The data they require may have been captured many years previously.

What has your experience been?  Have you identified other places to look for business rules? Please share your experience by posting a comment.  Thank you, Ken.

Common Enterprise wide Data Governance Issues #10: No ‘Master’ repository of business rules

This post is one of a series dealing with common Enterprise Wide Data Governance Issues.  Assess the status of this issue in your Enterprise by clicking here: Data Governance Issue Assessment Process

Business rules provide critical details about data fields, including the ‘business’ name of the field, the business purpose of the field, the values it may hold, the business meaning of each value, and interdependencies with other data.

An example of a business rule could be: ‘Account Type must be consistent with Account Fee Rate: both must be BUSINESS, or both must be PERSONAL’.  Such a business rule would be critical on an Anti Money Laundering Programme, where you must apply different alert rules to personal and business accounts.
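A rule like this translates directly into a cross-field check. A minimal sketch of the example rule (my own illustration):

```python
# Hypothetical sketch of the consistency rule above: Account Type and
# Account Fee Rate must both be BUSINESS or both be PERSONAL.
def account_type_consistent(account_type, fee_rate_type):
    """True when the two classifications agree (a cross-field consistency check)."""
    return (account_type, fee_rate_type) in {
        ("BUSINESS", "BUSINESS"),
        ("PERSONAL", "PERSONAL"),
    }

print(account_type_consistent("PERSONAL", "PERSONAL"))  # True
print(account_type_consistent("PERSONAL", "BUSINESS"))  # False: misclassified
```

On an AML programme, accounts failing this check would need investigation before any alert rules are applied to them.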

In some organisations, there is no ‘Master’ repository of business rules.  Business rules are not easily researched, not formally documented, and not subject to change control.

Impact: Projects dependent on existing data must research the business rules governing that data from first principles, and face the risk of not finding them, or of finding inconsistent business rules.  This leads to duplication of effort, project delays, and the risk of making incorrect business decisions based on incorrect business rules (e.g. generating false Anti Money Laundering alerts on accounts you treat as PERSONAL, when in fact they are BUSINESS).

Solution:
Agree and implement the following Policies:

  1. Overall ownership for business rules governing data within the Enterprise lies with the CIO.
  2. Ownership for business rules within each Business Unit lies with the CIO and the head of the Business Unit.
  3. Business rules must be formally documented and subject to change control (Enterprise-wide, and Business Unit specific).
  4. The CIO must appoint a person (TitleX) with responsibility for Enterprise wide business rules.
  5. TitleX is responsible for the definition and maintenance of Enterprise-wide business rules, in consultation with business units.
  6. TitleX must provide a single point of contact to handle requests for business rule details.

Your experience:
Have you faced the above issue in your organisation, or while working with clients?  What did you do to resolve it?  Please share your experience by posting a comment – Thank you – Ken.