The truth, the whole truth, and nothing but the truth

I enjoyed the good-natured contest (i.e., a blog-bout) between Henrik Liliendahl Sørensen, Charles Blyth and Jim Harris. The contest was a Blogging Olympics of sorts, with Great Britain, the United States and Denmark competing for the Gold, Silver and Bronze medals in an event called “Three Single Versions of a Shared Version of the Truth.”

I read all three posts, and the excellent comments on them, and I then voted for Charles Blyth.  Here’s why:

I worked as a Systems Programmer in the ’80s in the airline industry. Remarkable as it sounds, systems programmers in each airline used to modify the IBM-supplied operating system, which was known as the Airline Control Program (ACP) and later renamed the Transaction Processing Facility (TPF).  Can you imagine each business in the world modifying Windows today? I’m not talking about a configuration change; I’m talking about an Assembler code change to the internals of the operating system. The late ’80s saw the development of Global “Computer Reservations Systems” (CRS systems), including AMADEUS and GALILEO.  I moved from Aer Lingus, a small Irish airline, to work in London on the British Airways systems, to enable the British Airways systems to share information and communicate with the new Global CRS systems. I learnt some very important lessons during those years:

  1. The criticality of standards
  2. The drive for interoperability of systems
  3. The drive towards information sharing
  4. The drive away from bespoke development

What has the above got to do with MDM and the single version of the truth?

In the ’70s and ’80s, each airline was “re-inventing” the wheel by taking the base IBM operating system and then changing it.  Each airline started with the same base, but then Darwin’s theory of evolution kicked in (as it always does in bespoke development environments).  This worked fine as long as each airline effectively worked in a standalone manner, and connecting flights required passengers to re-check in with a different airline, etc.  This model was blown away by the arrival of Global CRS systems. Interconnectivity of airline reservation systems became critical, and this required all airlines to adhere to common standards.

We are still in the bespoke era of Master Data Management, and this will continue for some time.  Breaking out of this mode will require a major breakthrough.  The Human Genome Project, in which DNA is being ‘deciphered’, is one of the finest examples of how the “open source” model can bring greater benefit to all.   The equivalent within the data world could be the opening up of proprietary data models.  IBM developed the Financial Services Data Model (FSDM), which became an ‘overnight success’ when BASEL II arrived: those Financial Institutions that had adopted the FSDM were in a position to find the data required by the Regulators relatively easily.

Imagine a world in which the Financial Regulator(s) used the same Data Model as the Financial Organisations.

Naturally, such a model would not be set in stone.  There would be incremental improvements, with new versions published on a regular basis (perhaps yearly, biannually, or every 5 years).

Back to the great “Truth” debate.  Charles’s view most closely aligns with mine, and I particularly liked his reference to granularity: keep going until one reaches the lowest level of granularity required.  Remember, one can always summarise up, but one cannot take apart what has already been summarised.

Most importantly, I would like to thank Henrik Liliendahl Sørensen, Charles Blyth and Jim Harris for holding this debate.  Debates like this signal the beginning of the move towards a “Single Version of the Truth”, and this one has already led to a major step forward.  Dean Groves suggested on Henrik’s blog that the word “version” be changed to “vision”.  Suddenly, we had agreement: we all aspire to a “Single VISION of the Truth”.

I look forward to many more debates of this nature.

You can see the results of the vote here.

Plug and Play Data – The future for Data Quality

The excellent IAIDQ World Quality Day webinar looked at what the Data Quality landscape might be like in 5 years’ time, in 2014.  This got me thinking.  Dylan Jones’ excellent article on The perils of procrastination made me think some more…

Plug and Play Data

I believe that we data quality professionals need a paradigm shift in the way we think about data.  We need to make “Get data right first time” and “Data Quality by Design” such no-brainers that procrastination is not an option.   We need to promote a vision of the future in which all data is reusable and interchangeable – a world of “Plug and Play Data”.

Everybody, even senior business management, understands the concepts of “plug and play” and reusable play blocks.  For “plug and play” to succeed, interconnecting parts must be complete, fully moulded, and conform to clearly defined standards.  Hence “plug and play data” must be complete, fully populated, and conform to clearly defined standards (business rules).
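As a minimal sketch of what “complete and fully populated” could mean in code (the required field names below are invented purely for illustration, not taken from any standard):

```python
# Illustrative only: these required fields are invented for the example.
REQUIRED_FIELDS = {"customer_name", "date_of_birth", "account_type"}

def is_plug_and_play(record: dict) -> bool:
    """True only if every required field is present and populated."""
    return (REQUIRED_FIELDS <= record.keys()
            and all(record[f] not in (None, "") for f in REQUIRED_FIELDS))

print(is_plug_and_play({"customer_name": "Mr. Jones",
                        "date_of_birth": "1961-05-01",
                        "account_type": "STUDENT"}))  # True
print(is_plug_and_play({"customer_name": "Mr. Jones"}))  # False: incomplete
```

In practice the rule set would also include format and cross-field checks, but even a simple completeness gate like this catches records that cannot safely be reused by other systems.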

How can organisations “get it right first time” and create “plug and play data”?
It is now relatively simple to invoke cloud-based verification from any point at which data enters a system.

For example, when opening a new “Student” bank account, cloud-based verification might prompt the bank assistant with a message like: “Mr. Jones’ date of birth suggests he is 48 years old.  Is his date of birth correct?  Is a ‘Student Account’ appropriate for Mr. Jones?”
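A hypothetical sketch of such a check (the age threshold and prompt wording are my assumptions, not any real bank’s rules):

```python
from datetime import date
from typing import Optional

STUDENT_MAX_AGE = 30  # assumed upper age for a "Student Account"

def age_on(dob: date, today: date) -> int:
    """Age in whole years as of `today`."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def verify_student_account(dob: date, today: date) -> Optional[str]:
    """Return a prompt for the bank assistant if the data looks inconsistent."""
    age = age_on(dob, today)
    if age > STUDENT_MAX_AGE:
        return (f"The date of birth suggests the customer is {age} years old. "
                "Is the date of birth correct? Is a 'Student Account' appropriate?")
    return None  # no issue detected; proceed

# A 48-year-old applying for a Student Account triggers a prompt
print(verify_student_account(date(1961, 5, 1), date(2009, 11, 14)))
```

The point is not the specific rule but where it runs: verification at the point of entry, before the bad data propagates to downstream systems.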

In conclusion:

We Data Quality Professionals need to educate both Business and IT on the need for, and the benefits of, “plug and play data”.   We need to explain to senior management that data is no longer needed or used by only one application.  We need to explain that even tactical solutions within Lines of Business need to consider Enterprise demands for data, such as:

  1. Data feeds into regulatory systems (e.g. Anti-Money Laundering, BASEL II, Solvency II)
  2. Access from or data feed into CRM system
  3. Access from or data feed into Business Intelligence system
  4. Ad hoc provision of data to satisfy regulatory requests
  5. Increasingly – feeds to and from other organisations in the supply chain
  6. Ultimate replacement of application with newer generation system

We must educate the business on the increasingly dynamic information requirements of the Enterprise – which can only be satisfied by getting data “right first time” and by creating “plug and play data” that can be easily reused and interconnected.

What do you think?

Common Enterprise wide Data Governance Issues #11: No ownership of Cross Business Unit business rules

This post is one of a series dealing with common Enterprise Wide Data Governance Issues.  Assess the status of this issue in your Enterprise by clicking here:  Data Governance Issue Assessment Process

Business Units often disagree

I'm right, he's wrong!

Different Business Units sometimes use different business rules to perform the same task.

Within retail banking, for example, Business Unit A might use “Account Type” to distinguish personal accounts from business accounts, while Business Unit B might use “Account Fee Rate”.
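A sketch of how such a discrepancy plays out (the field names and fee threshold are invented for illustration; they are not real banking rules):

```python
# Illustrative only: two business units classify the same account differently.

def is_business_account_unit_a(account: dict) -> bool:
    # Business Unit A relies on the "account_type" field
    return account["account_type"] == "BUSINESS"

def is_business_account_unit_b(account: dict) -> bool:
    # Business Unit B infers the answer from the fee rate
    return account["fee_rate"] >= 0.05  # assumed business-fee threshold

# A personal account that was mistakenly given a business fee rate:
account = {"account_type": "PERSONAL", "fee_rate": 0.05}
print(is_business_account_unit_a(account))  # False
print(is_business_account_unit_b(account))  # True - the two rules disagree
```

Each rule is internally consistent, yet any downstream system (AML, CRM, BI) that consumes both units’ data inherits the contradiction.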


Impact(s) can include:

  1. Undercharging of Business Accounts mistakenly identified as Personal Accounts, resulting in loss of revenue.
  2. Overcharging of Personal Accounts mistakenly identified as Business Accounts, which could lead to a fine or other sanctions from the Financial Regulator.
  3. Anti Money Laundering (AML) system generates false alerts on Business Accounts mistakenly identified as Personal Accounts.
  4. AML system fails to generate alert on suspicious activity (e.g. large cash lodgements) on a personal account misidentified as a Business Account, which could lead to a regulatory fine.
  5. Projects dependent on existing data (e.g. AML, CRM, BI) discover that the business rules they require are inconsistent.

Solution:
Agree and implement the following policy (in addition to the policies listed for Data Governance Issue #10):

  • Responsibility for resolving cross business unit business rule discrepancies lies with the Enterprise Data Architect.

For further details on Business rules – see Business Rules Case Study.

Your experience:
Have you faced a situation in which different business units use different business rules?   Please share your experience by posting a comment – Thank you – Ken.