The truth, the whole truth and nothing but the truth

I enjoyed the good-natured contest (i.e., a blog-bout) between Henrik Liliendahl Sørensen, Charles Blyth and Jim Harris. The contest was a Blogging Olympics of sorts, with Great Britain, the United States and Denmark competing for the Gold, Silver, and Bronze medals in an event called “Three Single Versions of a Shared Version of the Truth.”

I read all three posts, and the excellent comments on them, and I then voted for Charles Blyth.  Here’s why:

I worked as a Systems Programmer in the ’80s in the airline industry. Remarkable as it sounds, systems programmers in each airline used to modify the IBM-supplied operating system, known as the Airline Control Program (ACP) and later renamed Transaction Processing Facility (TPF). Could you imagine each business in the world modifying Windows today? I’m not talking about a configuration change; I’m talking about an Assembler code change to the internals of the operating system. The late ‘80s saw the development of global Computer Reservation Systems (CRS), including AMADEUS and GALILEO. I moved from Aer Lingus, a small Irish airline, to work in London on the British Airways systems, to enable them to share information and communicate with the new global CRS systems. I learnt very important lessons during those years:

  1. The criticality of standards
  2. The drive for interoperability of systems
  3. The drive towards information sharing
  4. The drive away from bespoke development

What has the above got to do with MDM and the single version of the truth?

In the ’70s and ’80s, each airline was “re-inventing” the wheel by taking the base IBM operating system and then changing it. Each airline started with the same base, but then Darwin’s theory of evolution kicked in (as it always does in bespoke development environments). This worked fine as long as each airline effectively worked in a standalone manner, and connecting flights required passengers to check in again with a different airline, and so on. This model was blown away with the arrival of global CRS systems. Interconnectivity of airline reservation systems became critical, and this required all airlines to adhere to common standards.

We are still in the bespoke era of Master Data Management, and this will continue for some time. Breaking out of this mode will require a major breakthrough. The human genome project, in which DNA is being ‘deciphered’, is one of the finest examples of how the “open source” model can bring greater benefit to all. The equivalent within the data world could be the opening up of proprietary data models. IBM developed the Financial Services Data Model (FSDM). The FSDM became an ‘overnight success’ when Basel II arrived: those financial institutions that had adopted the FSDM were in a position to find the data required by the regulators relatively easily.

Imagine a world in which the Financial Regulator(s) used the same Data Model as the Financial Organisations.

Naturally, such a model would not be set in stone. There would be incremental improvements, with new versions published on a regular cycle (perhaps yearly, every two years, or maybe every 5 years).
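To make the idea concrete, here is a minimal sketch in Python of what a tiny fragment of such a shared, versioned model might look like. The entity and attribute names here are purely illustrative assumptions of mine; they are not taken from the actual FSDM or from any regulatory standard.

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical fragment of a shared, versioned data model.
    # Names are illustrative only, not drawn from the IBM FSDM.
    MODEL_VERSION = "2.1"   # published on an agreed release cycle

    @dataclass
    class Party:
        party_id: str             # identifier agreed across institutions
        legal_name: str
        country_of_domicile: str  # ISO 3166-1 alpha-2 code

    @dataclass
    class Exposure:
        exposure_id: str
        party_id: str             # reference to a Party
        amount: float
        currency: str             # ISO 4217 code
        as_of_date: date

If regulators and financial organisations both worked from the same published version, a regulatory submission would simply be a set of Party and Exposure records, with no per-institution mapping exercise required.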

Back to the great “Truth” debate. Charles’ view most closely aligns with mine, and I particularly liked his reference to granularity – keep going until one reaches the lowest granularity required. Remember, one can always summarise up, but one cannot take apart what has been summarised up.
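The granularity point can be shown in a few lines of Python; the records and field names below are purely illustrative, not drawn from any real system.

    from collections import defaultdict

    # Keep the lowest grain (individual transactions): monthly totals
    # can always be derived from them, but the transactions can never
    # be recovered from the totals.
    transactions = [
        {"account": "A-100", "month": "2010-01", "amount": 120.50},
        {"account": "A-100", "month": "2010-01", "amount":  79.50},
        {"account": "A-100", "month": "2010-02", "amount": 200.00},
    ]

    monthly_totals = defaultdict(float)
    for t in transactions:
        monthly_totals[(t["account"], t["month"])] += t["amount"]

    print(dict(monthly_totals))
    # {('A-100', '2010-01'): 200.0, ('A-100', '2010-02'): 200.0}
    # Both months summarise to 200.0, yet they were built from different
    # transactions: summarising up is easy, taking apart is impossible.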

Most importantly, I would like to thank Henrik Liliendahl Sørensen, Charles Blyth and Jim Harris for holding this debate. Debates like this signal the beginning of the move towards a “Single Version of the Truth”, and this one has already led to a major step forward. Dean Groves suggested on Henrik’s blog that the word “version” be changed to “vision”, and suddenly we had agreement: we all aspire to a “Single VISION of the Truth”.

I look forward to many more debates of this nature.

You can see the results of the vote here.

4 thoughts on “The truth, the whole truth and nothing but the truth”

  1. Hi Ken,

    I can ‘imagine a world in which the Financial Regulator(s) used the same Data Model as the Financial Organisations’, and an Open Source data model too – great ideas here, something we should discuss in more detail.

    Thanks for voting for me. I might steal this statement: ‘one can always summarise up, but one cannot take apart what has been summarised up’. Granularity is the key.

  2. Great input, Ken. Your title was actually an early working title of mine. I should have kept it – perhaps I could have won your vote.

    Not re-inventing the wheel is so important, not least when it comes to Party Master Data (names and addresses). I have always had a problem with everyone constantly typing in and maintaining data about everyone else. I think we will see a change here in the years to come.

    I have also had great benefits from utilizing industry data models – right now I work with such a model within public transportation.

  3. Great post, Ken,

    I also liked the granularity aspect that Charles laid out in his post and definitely agree with you that not going down to the lowest grain from the very beginning will only cause major trouble later.

    I also like your remarks about how we are still in the bespoke era of MDM.

    Evolution (and our adventures) will only continue.

    Thanks and Best Regards,

    Jim

  4. As a Senior Consultant with Evaxyx, I found this piece fascinating and deeply relevant.
    At Evaxyx, we believe that information is at the heart of any modern enterprise, and that it must be used for business advantage. We always begin by constructing a model of the data used in an enterprise. Our models promote engagement over formality. Before any discussions on data can begin, it is essential that a common basis of understanding is achieved. There are always existing perspectives to accommodate. We do this by working collaboratively and intensely with our customers.

    Every customer we work with is different. Most have long histories of IT adoption, and many are leaders in their own domains. We systemise our approach using the Evaxyx Delivery Method, a knowledge management framework that allows us to capture, reflect and disseminate our diverse experiences. We seek to understand the methods used by our customers to create their data estates, and use this understanding to improve how they will exploit these estates to face the next generation of business challenges.

    We firmly believe that each business domain has its own unique data challenges, yet we have come to realise that there is much to be learned from cross-fertilising best practices across diverse business applications. We encourage our customers to use our wide-ranging experience, and learn from each other, to see how they can use their data in innovative and effective ways.
