Common Enterprise Wide Data Governance Issues: #5 There is little understanding of what “Data Quality” means

This post is one of a series dealing with common Enterprise Wide Data Governance Issues.  Assess the status of this issue in your Enterprise by clicking here:  Data Governance Issue Assessment Process

When asked what ‘Data Quality’ means, senior management respond along the lines of ‘the data is either good (accurate) or bad (inaccurate)’.  There is little understanding of the commonly used dimensions of data quality.

  • Completeness
    Is the data populated?
  • Validity
    Is the data within the permitted range of values?
  • Accuracy
    Does the data represent reality or a verifiable source?
  • Consistency
    Is the same data consistent across different files/tables?
  • Timeliness
    Is the data available when needed?
  • Accessibility
    Is the data easily accessible, understandable and usable?
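To make the first few of these dimensions concrete, here is a minimal sketch of how completeness and validity might be measured on a toy customer dataset. The field names, sample records and validity rules are illustrative assumptions, not part of any standard.

```python
from datetime import datetime

# Toy dataset: hypothetical customer records with a missing and an invalid value.
customers = [
    {"id": 1, "dob": "1985-04-12", "country": "IE"},
    {"id": 2, "dob": None,         "country": "IE"},
    {"id": 3, "dob": "1990-13-40", "country": "XX"},  # invalid date, invalid code
]

VALID_COUNTRIES = {"IE", "GB", "US"}  # assumed permitted range of values

def completeness(records, field):
    """Completeness: share of records where the field is populated."""
    return sum(1 for r in records if r.get(field) is not None) / len(records)

def validity(records, field, is_valid):
    """Validity: share of populated values within the permitted range."""
    values = [r[field] for r in records if r.get(field) is not None]
    return sum(1 for v in values if is_valid(v)) / len(values)

def is_valid_dob(value):
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except ValueError:
        return False

print(round(completeness(customers, "dob"), 2))            # 0.67
print(round(validity(customers, "dob", is_valid_dob), 2))  # 0.5
print(round(validity(customers, "country",
                     lambda c: c in VALID_COUNTRIES), 2))  # 0.67
```

Accuracy, consistency, timeliness and accessibility generally need reference data, cross-system comparison or service-level information, so they are harder to reduce to a one-file check like this.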

Impact: Without a shared understanding of what “Data Quality” means:

  • It is practically impossible to have a meaningful discussion about the existing and required data quality within an Enterprise.
  • Senior management are not in a position to request specific Data Quality metrics, and if you don’t measure, you can’t manage.
  • Business users are not in a position to clearly state the level of data quality they require.

Agree and implement the following policy:

In discussing Data issues and requirements, data quality will be assessed using a standard set of quality dimensions across the Enterprise.

Your experience:
Have you faced the above issue in your organisation, or while working with clients?  What did you do to resolve it?  Please share your experience by posting a comment – Thank you – Ken.

In October 2013, following an 18-month consultative process, DAMA UK published a white paper explaining six primary data quality dimensions:

1. Completeness
2. Uniqueness
3. Timeliness
4. Validity
5. Accuracy
6. Consistency

For more details see my blog post, Major step forward in Data Quality Measurement.

14 thoughts on “Common Enterprise Wide Data Governance Issues: #5 There is little understanding of what “Data Quality” means”

  1. Hi Ken;

I respectfully disagree. The reason ‘data quality’ cannot currently be defined to the point where it can be measured is that the concept of data is not universally defined. As long as people use ‘data’ and ‘information’ as interchangeable constructs, this confusion will exist.

The ‘geometry’ of meaning represents data as a point, and information as a curve: a set of relationships between and among a set of points. Combinations of curves (and the positive space they create) synthesise knowledge. This is not a highly abstract concept…the graphic you display on the right side of this page is an excellent illustration.

    What needs to happen, IMHO, is for information practitioners to deal with data on its own (atomic) level and to treat information as predictable combinations. Data quality – and by extension information quality – is much easier to understand, manage and measure from there.

  2. John,

    Thank you for your comment. It is great to hear a dissenting voice. However, I suspect we may be speaking at cross purposes.

    The title of this post should read “There is little understanding AMONG SENIOR MANAGEMENT of what data quality means”.

I believe there are many levels of complexity regarding Data Quality. When we ‘information practitioners’ discuss “data quality”, we may need to take the complexity and depth of understanding to the level you suggest.

    Before we get a chance to do this within a given Enterprise, we must first convince Senior Management of the need for a Data Quality (or Information Quality) programme. Too many senior business managers do not understand “data quality” or “information quality” beyond the concepts of ‘good’ or ‘bad’. We need to explain, in business terms, the potential benefits to the Enterprise of a Data Quality Programme. To do this we need senior business management to at least understand ‘the commonly used dimensions of data quality’.

    When Senior Management has approved and funded the Data Quality Programme, we information practitioners can take the granularity of the quality dimensions etc. to the required level.


  3. Hi Ken;

    Completely agree with you…on all counts. I misunderstood the nature of the post, but now I see what you were driving at. I think the more granular and integrated things become, the easier the argument might be to make in favour of data/information quality. I also believe that business executives need to see a rock-solid and extensible framework for making that happen.

    John O’

  4. Hi Ken,

    As those of us considered to be data quality experts cannot agree on what data quality means either, what chance is there for this term in the business world?

    We tend to get ourselves involved in the data versus information controversy, but I think we shouldn’t overlook the “quality” part of the phrase. You talk about dimensions of data quality. Wouldn’t it be easier and more useful for middle management to reduce the use of the term “data quality” and start using more easily understood and defined terms instead: Data Completeness, Data Validity, Data Accuracy, Data Consistency, Data Timeliness, Data Accessibility …?

    • Hi Graham,

      Thanks for joining the debate!
I agree with your suggestion; it makes perfect sense. I have one caveat – that all six of the dimensions should be measured to give a ‘balanced’ measurement of “Data Quality”.


  5. Hi Ken

    Great topic for discussion.

    I think one of the big issues is that as a profession, we’re not doing enough to demonstrate the business impact of data quality.

This came out again in the recent IAIDQ/UALR survey as a concern from the 200 or so professionals who responded.

    So, with that in mind I think that any enterprise-wide dimensions must include some kind of performance metric that relates to something the business values.

A big problem I have found is that senior management can often glaze over when presented with stats on completeness, for example. When you link it to something that matters to them, they suddenly sit up.

    Another issue is that not all data is equal so there needs to be a weighting. For example, financial data needs to be timely, other data less so.

I personally would prefer to see a scorecard which measures data against critical-to-quality data quality rules; this way each business unit could define what it needs from the data, which could indeed include the standard set of dimensions you mention, but far more service-oriented measures also.

    • Hi Dylan,

      You have hit the nail on the head – Business managers focus on the bottom line. They focus on the ‘business impact’ of each decision they make.

      Business managers will fund a new initiative based on ROI, whether the initiative is the launch of a new product or a Data Quality programme.

      Data Quality dimensions are the “What” – We need to provide Business managers with the “So What”.

      For example, telling the Head Of Compliance in a bank that “Customer Date of Birth” is only 90% populated is useless. Telling the Head of Compliance that the bank is at risk of being fined by the regulator because the bank cannot reliably identify customers under 18 years of age, and therefore cannot perform Anti-Money Laundering monitoring is a different matter.

      You are also right Dylan, that not all data is equal. In fact, sometimes the same data can be more critical to some business units than others. Taking the above date of birth example: The marketing department may be disappointed that they cannot reliably target 10% of customers by age – but it is not critical to them, whereas date of birth may be critical to the compliance department.

      I like your idea for a scorecard that would include the standard set of data quality dimensions, but more importantly, would include for each business unit that uses the data: details of the “Business Rules” used by that business unit. “Business rules” provide critical details about a data field, including the ‘business’ name of the field, the business purpose of the field, the values it may hold, the business meaning of each value, and interdependencies with other data. In my experience, getting each business unit to document the “business rules” they apply highlights discrepancies between how business units use data.

      Thanks again for participating in the debate – Ken
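The scorecard idea discussed above – each business unit defining and weighting the rules that matter to it, as in the date-of-birth example where compliance and marketing value the same field differently – could be sketched along these lines. The unit names, rules, weights and sample data are all invented for illustration; a real scorecard would be driven by each unit's documented business rules.

```python
# Hypothetical per-business-unit data quality scorecard.
# The same dataset is scored per unit using that unit's weighted rules.

customers = [
    {"id": 1, "dob": "1985-04-12", "email": "a@example.com"},
    {"id": 2, "dob": None,         "email": "b@example.com"},
    {"id": 3, "dob": "2010-07-01", "email": "c@example.com"},
]

def pass_rate(records, rule):
    """Share of records satisfying a single business rule."""
    return sum(1 for r in records if rule(r)) / len(records)

# Each unit lists (rule name, rule function, weight). The weights
# reflect that the same field can be critical to one unit (date of
# birth for compliance) and merely useful to another (marketing).
scorecards = {
    "compliance": [
        ("dob populated",   lambda r: r["dob"] is not None,   0.9),
        ("email populated", lambda r: r["email"] is not None, 0.1),
    ],
    "marketing": [
        ("dob populated",   lambda r: r["dob"] is not None,   0.3),
        ("email populated", lambda r: r["email"] is not None, 0.7),
    ],
}

def unit_score(records, rules):
    """Weighted average pass rate across a unit's rules."""
    total_weight = sum(w for _, _, w in rules)
    return sum(w * pass_rate(records, fn) for _, fn, w in rules) / total_weight

for unit, rules in scorecards.items():
    print(unit, round(unit_score(customers, rules), 2))
# compliance 0.7, marketing 0.9 – the same data quality problem
# (one missing date of birth) hurts compliance far more than marketing.
```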

  6. Agree with a lot of the article / comments. We have grappled not just with communications with senior management but also with business people very close to data usage and issues with data quality. On a recent CDI project, after months of meaningless debate, we ended up defining a set of (40-odd) measurable metrics around customer data which worked well. Looking at them, I’d suggest they would not be completely transferable to other companies or projects (apart from the intent perhaps).

  7. Hi Glenn,

Thanks for your comment. You’ve touched on something important. It is relatively easy for us data professionals to talk in terms of what Steve Bennett refers to as “‘Classic’ data quality criteria. Things like accuracy, relevance, availability, etc.” in his excellent blog post (via @SmartDataCo).

    The challenge is to make this meaningful to business managers. Steve goes on to recommend that in addition to using the “Classic data quality criteria” we must also define “The impact data quality has on the business. Things like dollars lost if data is wrong, the value of increased sales if data is correct, etc.”

    You did as Steve recommends when you defined your set of 40-odd measurable metrics.

    The lesson we can take from your experience is to avoid months of meaningless debate by defining the ‘business impact’ earlier.


  8. Pingback: Process for assessing status of common Enterprise-Wide Data Issues « Ken O'Connor Data Consultant

  9. Pingback: What is your undertaking-wide common understanding of data quality? « Ken O'Connor Data Consultant

  10. Pingback: The dog and the frisbee and data quality management « Ken O'Connor Data Consultant

  11. Pingback: Major step forward in Data Quality Measurement | Ken O'Connor Data Consultant

  12. Wow, good debate and comments. This is getting old now, from 2009, but sadly still so relevant today in 2016. The industry hasn’t moved forward enough on Data Quality. Sad. I think Dylan hit the nail on the head with “commercial measures/impact” – that’s where the rubber hits the road. If we can’t demonstrate value, we’ll never be able to engage the business in the change (technical/cultural).
