Plug and Play Data – The future for Data Quality

The excellent IAIDQ World Quality Day webinar looked at what the Data Quality landscape might be like in five years' time, in 2014. This got me thinking. Dylan Jones' excellent article on The perils of procrastination made me think some more…

Plug and Play Data

I believe that we data quality professionals need a paradigm shift in the way we think about data. We need to make “Get data right first time” and “Data Quality by Design” such no-brainers that procrastination is not an option. We need to promote a vision of the future in which all data is reusable and interchangeable – a world of “Plug and Play Data”.

Everybody, even senior business management, understands the concepts of “plug and play” and reusable play blocks. For “plug and play” to succeed, interconnecting parts must be complete, fully moulded, and conform to clearly defined standards. Hence “plug and play data” must be complete, fully populated, and conform to clearly defined standards (business rules).
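To make “clearly defined standards” a little more concrete, here is a minimal sketch in Python. The field names, formats and rules are hypothetical assumptions for illustration; the point is that the standards are captured once, in a declarative and reusable form, rather than buried inside one application.

```python
# Hypothetical sketch: "plug and play data" standards expressed as explicit,
# reusable business rules that any capturing system can invoke.
import re
from datetime import date

# Each rule is a (description, check) pair. Field names are illustrative only.
CUSTOMER_RULES = [
    ("Surname must be populated",
     lambda rec: bool(rec.get("surname", "").strip())),
    ("Date of birth must be a real date in the past",
     lambda rec: isinstance(rec.get("date_of_birth"), date)
                 and rec["date_of_birth"] < date.today()),
    ("Postcode must match the agreed format",
     lambda rec: re.fullmatch(r"[A-Z0-9 ]{5,8}", rec.get("postcode", "")) is not None),
]

def validate(record: dict) -> list[str]:
    """Return the descriptions of every rule the record breaks."""
    return [desc for desc, check in CUSTOMER_RULES if not check(record)]

if __name__ == "__main__":
    record = {"surname": "Jones", "date_of_birth": date(1961, 5, 1), "postcode": "D02 XY45"}
    print(validate(record) or "Record conforms to the defined standards")
```

Because the rules live in one shared place, every system through which data enters can apply the same checks, which is exactly what makes the resulting data interchangeable.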

How can organisations “get it right first time” and create “plug and play data”?
It is now relatively simple to invoke cloud-based verification from any part of a system through which data enters.

For example, when opening a new “Student” bank account, cloud-based verification might prompt the bank assistant with a message like: “Mr. Jones’ date of birth suggests he is 48 years old. Is his date of birth correct? Is a ‘Student Account’ appropriate for Mr. Jones?”
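Here is a minimal sketch of what such a point-of-entry check might look like. The age threshold, product name and function names are assumptions for illustration; in a real deployment the check would be delegated to the cloud-based verification service rather than embedded locally.

```python
# Hypothetical sketch of a point-of-entry check that prompts the bank assistant
# before a questionable record is saved. Thresholds and names are illustrative.
from datetime import date

STUDENT_ACCOUNT_MAX_AGE = 30  # assumed business rule, for illustration only

def age_in_years(date_of_birth: date) -> int:
    today = date.today()
    years = today.year - date_of_birth.year
    if (today.month, today.day) < (date_of_birth.month, date_of_birth.day):
        years -= 1
    return years

def verify_account_opening(product: str, date_of_birth: date) -> list[str]:
    """Return prompts the assistant should confirm before the record is saved."""
    prompts = []
    age = age_in_years(date_of_birth)
    if product == "Student Account" and age > STUDENT_ACCOUNT_MAX_AGE:
        prompts.append(
            f"Date of birth suggests the customer is {age} years old. "
            "Is the date of birth correct? Is a 'Student Account' appropriate?"
        )
    return prompts

# Example: an applicant aged roughly 48 applying for a Student Account.
forty_eight_years_ago = date(date.today().year - 48, 1, 1)
for prompt in verify_account_opening("Student Account", forty_eight_years_ago):
    print(prompt)
```

The same check can be triggered from a branch application, a web form or a batch load, which is what “getting it right first time” at every point of entry looks like in practice.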

In conclusion:

We Data Quality Professionals need to educate both Business and IT on the need for, and the benefits of, “plug and play data”. We need to explain to senior management that data is no longer needed or used by only one application. We need to explain that even tactical solutions within Lines of Business need to consider Enterprise demands for data such as:

  1. Data feeds into regulatory systems (e.g. Anti-Money Laundering, Basel II, Solvency II)
  2. Access from, or data feeds into, the CRM system
  3. Access from, or data feeds into, the Business Intelligence system
  4. Ad hoc provision of data to satisfy regulatory requests
  5. Increasingly, feeds to and from other organisations in the supply chain
  6. Ultimate replacement of the application with a newer-generation system

We must educate the business on the increasingly dynamic information requirements of the Enterprise – which can only be satisfied by getting data “right first time” and by creating “plug and play data” that can be easily reused and interconnected.

What do you think?

2 thoughts on “Plug and Play Data – The future for Data Quality”

  1. Absolutely! Good post.

    This is especially important when 1/3 of all enterprise data originates outside the enterprise. Do you trust the entity sending it to you? Do you audit their processes to make sure they have quality data? Even if you do, do you want your systems to be corrupted by, or your decisions to be made based on, bad data?

    I like the idea of, and am a strong supporter of, cultural change in this area. However, I think the difficulty is in how we – in a multi-enterprise dependent world – get EVERYONE to define their data in the same way. That is very tough even in a closed loop within your four walls. What about in an open system?

    Figuring this out should be fun!

  2. Pingback: Semantic web and data quality « Ken O'Connor Data Consultant
