Applying “Lateral Thinking” to Data Quality

I am a fan of Edward de Bono, the originator of the concept of Lateral Thinking. One of my favourite examples of de Bono’s brilliance relates to dealing with the worldwide problem of river pollution.

Imagine if… the factory producing this sewage had to use this as its inflow…

De Bono suggested “each factory must be downstream of itself” – i.e., require that each factory’s water inflow pipe be positioned just downstream of its outflow pipe.

Suddenly, the water quality in the outflow pipe becomes a lot more important to the factory. Apparently, several countries have implemented this idea as law.

What has this got to do with data quality?

By applying the same principle to data entry, all downstream data users will benefit, and information quality will improve.

How could this be done?

It is part of human nature for each of us to take pride in what we do, pride in the outputs we produce. Measuring data quality at the point of entry, and providing real-time feedback to data entry personnel, tells them that their work matters and enables them to take greater pride in it.
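As a rough illustration of point-of-entry quality measurement, here is a minimal sketch in Python. The field names and rules are hypothetical, chosen only to show the pattern: each record is checked as it is keyed in, and the person entering it receives immediate, human-readable feedback rather than discovering problems downstream.

```python
import re

# Hypothetical per-field rules: (check function, feedback message).
RULES = {
    "email": (lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
              "email address looks invalid"),
    "postcode": (lambda v: v.strip() != "",
                 "postcode is required"),
    "date_of_birth": (lambda v: re.fullmatch(r"\d{4}-\d{2}-\d{2}", v) is not None,
                      "date of birth must be YYYY-MM-DD"),
}

def validate_entry(record):
    """Return a list of feedback messages; an empty list means the record is clean."""
    feedback = []
    for field, (check, message) in RULES.items():
        if not check(record.get(field, "")):
            feedback.append(f"{field}: {message}")
    return feedback
```

In practice the feedback would be surfaced in the data entry screen itself, so the person typing is, in effect, “just downstream” of their own outflow pipe.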

I believe the above can help solve Common Enterprise-Wide Data Governance Issue #2: the quality of data entered by front-end staff is not as high as desired.

I hope to revisit the “Data River” in future blog posts. Meanwhile, put your “Lateral Thinking” cap on. How might ideas from other spheres of life help organisations improve their Data Quality?

9 thoughts on “Applying “Lateral Thinking” to Data Quality”

  1. Nice Ken, this will go well with “the flow of data” drawings I use in presentations, workshops and such…

    I hope you don’t mind if I “repurpose” the idea that the outflow pipes are upstream of the inflow 🙂

  2. Ken,

    This also fits nicely with a swimming pool analogy that I use. In my example, like yours, the data is represented by the water, and poor data quality by dirty water. I usually suggest that organisations address two key areas: firstly, ensure the water entering the pool from business processes is clean; secondly, use data profiling etc. to clean the water already in the pool.

    Coupled with the above, an organisation needs to be aware that they are unlikely to get perfectly clean water/data, so should ensure that business decision making includes suitable controls to reflect actual water/data quality.

    Can’t wait for your next installment….


  3. Thanks, Garnie, Julian and Dylan, for the feedback.

    As we data quality professionals know, the “behind the scenes” processes that get data from its original source into downstream MIS, CRM, regulator reports, marketing material, etc. are often extremely costly, manual, labour-intensive, prone to error and wholly inefficient.

    Quite rightly, these processes are invisible to Senior Management. Senior Management simply assume “it is done”. I believe we need analogies like this “river pollution” one to explain the need for “Data Governance”, to ensure “it is done correctly”.

    Julian, I like the swimming pool analogy. In the next instalment I hope to explore “suitable controls to reflect actual water/data quality.”


