Incomplete loan data puts €8.2 billion at risk

Ireland’s leading business newspaper, The Sunday Business Post, reported on 13 November 2011 that incomplete loan documentation data could complicate banks’ ability to take security on €8.2 billion worth of loans in the event of a default. (Click here to see the full article.)

Data quality measurement can detect incomplete data

Central bank researchers, preparing a paper for a conference on the Irish mortgage market on 13 October 2011, discovered incomplete data in 78,000 of the 688,000 loans surveyed. They found that 10,094 loans lacked a property identifier, 35,044 had no initial valuation, 15,413 had no valuation date, and 18,628 had no geographic data.

Similar issues with bad loan data led to greater haircuts for the banks when the National Asset Management Agency (NAMA) transferred billions in assets in 2009 and 2010.  In the US, banks have been stopped from pursuing delinquent borrowers where loan data was incomplete or missing.

How could such a situation arise?  How can similar problems be prevented?

Front-line staff are often under pressure to complete a sale and “sort out the details later” (previously discussed here). Hence even the most robust and rigorous data validation processes often provide a “bypass” facility. This is normal business practice, and perfectly acceptable. In many instances, critical documentation for a loan (or other product) may not be available at the time of data entry. Problems only arise if no one goes back to “sort out the details later”. One or two loans with incomplete data may not pose a major risk, but incomplete data in 10% of a loan book spells serious trouble.

Common sense data quality management steps can prevent similar problems arising in your organisation. Data validation alone is insufficient: data quality measurement and ongoing data quality monitoring are also required.

In the case study reported above, central bank researchers used data quality measurement to detect the incomplete loan data. Similar data quality measurement can and should be incorporated into all business-critical systems. Regular monitoring could generate an alert when the percentage of loans with incomplete data exceeds a threshold, say 2%. Alternatively, monitoring could generate an alert when the time limit for “sorting the details out later” has been exceeded.
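For illustration, here is a minimal sketch of what such monitoring might look like in practice. The field names (property_id, initial_valuation, valuation_date, geo_data), the 2% threshold, the 30-day grace period, and the record layout are assumptions made for the example, not details from the central bank study or any particular loan system.

```python
# Minimal data quality monitoring sketch for a loan book.
# Assumptions: each loan is a dict; field names, threshold and grace period
# are illustrative placeholders, not values from the case study.

from datetime import date, timedelta

REQUIRED_FIELDS = ["property_id", "initial_valuation", "valuation_date", "geo_data"]
COMPLETENESS_THRESHOLD = 0.02      # alert if more than 2% of loans are incomplete
GRACE_PERIOD = timedelta(days=30)  # time allowed to "sort out the details later"


def missing_fields(loan):
    """Return the required fields that are absent or empty on this loan."""
    return [f for f in REQUIRED_FIELDS if not loan.get(f)]


def monitor(loans, today=None):
    """Measure completeness across the loan book and return any alerts."""
    today = today or date.today()
    incomplete = [loan for loan in loans if missing_fields(loan)]
    alerts = []

    # Alert 1: the overall rate of incomplete loans exceeds the threshold.
    rate = len(incomplete) / len(loans) if loans else 0.0
    if rate > COMPLETENESS_THRESHOLD:
        alerts.append(
            f"{rate:.1%} of loans have incomplete data "
            f"(threshold {COMPLETENESS_THRESHOLD:.0%})"
        )

    # Alert 2: individual loans still incomplete after the grace period.
    for loan in incomplete:
        if today - loan["entered_on"] > GRACE_PERIOD:
            alerts.append(
                f"Loan {loan['loan_id']} incomplete for more than "
                f"{GRACE_PERIOD.days} days: missing {missing_fields(loan)}"
            )

    return alerts


# Example usage with two illustrative loan records:
loans = [
    {"loan_id": 1, "entered_on": date(2011, 9, 1), "property_id": "P123",
     "initial_valuation": 250000, "valuation_date": date(2011, 8, 30),
     "geo_data": "Dublin"},
    {"loan_id": 2, "entered_on": date(2011, 9, 15), "property_id": None,
     "initial_valuation": None, "valuation_date": None, "geo_data": None},
]

for alert in monitor(loans, today=date(2011, 11, 13)):
    print(alert)
```

The point is not the specific code, but that measurement and monitoring run continuously against production data, rather than once at data entry as validation does.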

This case study highlights the difference between data validation and data quality measurement.  I will deal with this topic in my next post.

Feedback, as always, most welcome.
