FSA imposes £2.4 million fine for inadequate risk reporting systems

London, 18th March 2013 – The FSA has imposed a £2.4 million fine for inadequate risk reporting systems, which led to a failure to keep investors informed ahead of a profit warning that wiped 57% off the company’s share price. (See London Evening Standard: “Watchdog gets tougher as oil-rig firm Lamprell is fined £2.4 million over stock market breach”.)

Oil services group Lamprell is not a bank. However, Lamprell could have avoided this fine if it had implemented the new BCBS principles for effective risk data aggregation and risk reporting (BCBS 239), published in January 2013; principles which I describe in a previous post, Data aggregation and reporting principles – applied common sense.

I include below some quotes from the article, and in parentheses, the relevant text from the BCBS 239 principles:

  • “The FSA said that monthly reports to the board had been totally inadequate for a company of its size and that such reports were delivered late.”
    (Principle 5: Timeliness. Paragraph 44: “A bank’s risk data aggregation capabilities should ensure that it is able to produce aggregate risk information on a timely basis to meet all risk management reporting requirements.”)
  • “It also said the takeover of a rival in 2011, which doubled Lamprell’s size, had left the company using too many different reporting systems.”
    (Principle 1: Governance. Paragraph 29: “A bank’s risk data aggregation capabilities and risk reporting practices should be… considered as part of any new initiatives, including acquisitions and/or divestitures… When considering a material acquisition, a bank’s due diligence process should assess the risk data aggregation capabilities and risk reporting practices of the acquired entity, as well as the impact on its own risk data aggregation capabilities and risk reporting practices. The impact on risk data aggregation should be considered explicitly by the board and inform the decision to proceed. The bank should establish a timeframe to integrate and align the acquired risk data aggregation capabilities and risk reporting practices within its own framework.”)

Tracey McDermott, FSA director of enforcement and financial crime, said: “Lamprell’s systems and controls may have been adequate at an earlier stage, but failed to keep pace with its growth. As a result they were seriously deficient for a listed company of its size and complexity, meaning it was unable to update the market on crucial financial information in a timely manner.”

The moral of the story… ensure your organisation, regardless of your industry, applies the common sense set out in: “Data aggregation and reporting principles (BCBS 239) – applied common sense”.

The growing demand for food and data provenance

In November 2012, I presented at the Data Management and Information Quality Europe 2012 conference in London. My presentation was called “Do you know what’s in the data you’re consuming?”

In the presentation, I compare the data supply chain with the food supply chain.

I believe that data consumers have the right to be provided with facts about the content of the data they are consuming, just as food consumers are provided with facts about the food they are buying. The presentation provides guidelines on how you can improve your data supply chain.

Little did I realise that within 3 months the term “provenance” would be hitting the headlines due to the European horsemeat scandal.

There’s a silver lining in this food scandal for data quality management professionals. As financial regulators increasingly demand evidence of the provenance of the data provided to them, it is now easier for data quality management professionals to explain to their business colleagues and senior management what “data provenance” means, and what it requires. Retailers such as Tesco must have controls in their supply chain that ensure the food they sell to consumers contains only “what it says on the tin”. Similarly, financial services organisations providing data to financial regulators must have controls in their data supply chain that ensure the quality of the data they provide can be trusted. Regulators are now asking these organisations to demonstrate evidence that their data supply chain can be trusted; in other words, to demonstrate evidence of their data provenance, as applied to their critical or material data.

But what exactly is “data provenance”? The best definition I have seen comes from Michael Brackett in his excellent book “Data Resource Simplexity”.

“Data Provenance is provenance applied to the organisation’s data resource. The data provenance principle states that the source of data, how the data were captured, the meaning of the data when they were first captured, where the data were stored, the path of those data to the current location, how the data were moved along that path, and how those data were altered along that path must be documented to ensure the authenticity of those data and their appropriateness for supporting the business”.
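
Brackett’s definition lends itself to a simple checklist. Here is a minimal sketch, in Python, of a provenance record capturing each element of that definition; the class and field names are my own illustration, not any standard.

    from dataclasses import dataclass

    @dataclass
    class ProvenanceRecord:
        """Illustrative provenance metadata: one field per element of
        Brackett's definition. Field names are hypothetical, not a standard."""
        source: str              # the source of the data
        capture_method: str      # how the data were captured
        original_meaning: str    # the meaning of the data when first captured
        storage_locations: list  # where the data were stored
        movement_path: list      # the path of the data to its current location
        alterations: list        # how the data were altered along that path

    record = ProvenanceRecord(
        source="core banking system (customer master)",
        capture_method="branch data entry, validated against identity documents",
        original_meaning="customer legal name and residential address",
        storage_locations=["core_db", "risk_warehouse"],
        movement_path=["core_db -> nightly extract -> risk_warehouse"],
        alterations=["addresses standardised to postal format"],
    )

A record like this, maintained alongside each critical dataset, is the documentation the principle calls for: anyone consuming the data can see where it came from and what happened to it along the way.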

Enjoy your “beef” burger!

Basel Committee issues “Principles for effective risk data aggregation and risk reporting (BCBS 239) – final document”

Today, 9th January 2013, the Basel Committee on Banking Supervision issued a press release announcing the final document “Principles for effective risk data aggregation and risk reporting” (BCBS 239).

I wrote two posts on the consultation paper when it was issued in June 2012.

1. Data aggregation and reporting principles (BCBS 239) – applied common sense

2. Risk data aggregation and risk reporting (BCBS 239) – Board and senior management responsibilities

I’m pleased to see that the final document retains the applied common sense approach set out in the June 2012 version. Working together with other independent data professionals, I provided feedback to the Basel Committee on the June consultation paper. Our feedback, together with that from banking organisations worldwide, was also published today – see feedback and comments. The final document has taken on board the feedback from all of the contributors and provided additional clarification where requested. For example, Annex 1 has been expanded considerably to explain all of the “Terms used in the document”.

Banks need to start working on compliance plans immediately. To quote paragraph 87 of the document: “National banking supervisors will start discussing implementation of the Principles with G-SIBs’ senior management in early 2013. This will ensure that banks develop a strategy to meet the Principles by 2016.”

While compliance with the principles is mandatory for G-SIBs (global systemically important banks), the document makes it clear that all banks should implement the principles. Paragraph 15 states: “It is strongly suggested that national supervisors also apply these Principles to banks identified as D-SIBs (domestic systemically important banks) by their national supervisors three years after their designation as D-SIBs.”

As I point out in Data aggregation and reporting principles (BCBS 239) – applied common sense, all organisations in all industries would benefit by applying these principles. Simply remove the word “Risk”, and one has “Principles for effective data aggregation and reporting”.

What do you think? Please post your views below. Join the debate.

Risk data aggregation and risk reporting (BCBS 239) – Board and senior management responsibilities

Post #2 in my series on Data aggregation and reporting principles (BCBS 239) – applied common sense

I was saddened to hear of the death on July 16th of Stephen Covey, author of The Seven Habits of Highly Effective People. I have found the 7 habits very useful in my work as a data consultant.

Two of the habits apply directly to this blog post.

  • Habit 1: Be Proactive
  • Habit 2: Begin with the End in Mind

I imagine the authors of BCBS 239, “Principles for effective risk data aggregation and risk reporting”, are also familiar with the 7 habits, since the principles appear to be based on them.

Habit 1: Be Proactive

Regulatory supervisors expect the board and senior management to “be proactive” in taking responsibility for risk data aggregation and risk reporting.  The following quotes from the document illustrate my point:

Section I. “Overarching governance and infrastructure”

Paragraph 20: “… In particular, a bank’s board and senior management should take ownership of implementing all the risk data aggregation and risk reporting principles and have a strategy to meet them within a timeframe agreed with their supervisors… by 2016 at the latest.”

Paragraph 21: “A bank’s board and senior management should promote the identification, assessment and management of data quality risks as part of its overall risk management framework…. A bank’s board and senior management should review and approve the bank’s group risk data aggregation and risk reporting framework and ensure that adequate resources are deployed.”

Habit 2: Begin with the End in Mind

I advise my clients to “Begin with the end in mind” – by defining clear, measurable and testable requirements.

The authors of the Basel principles appear to agree. The board and senior management are the people who must assess the risks faced by the financial institution; therefore, they are the people who must specify the information they want in the risk reports. Don’t take my word for it – the following quotes from the document illustrate my point:

Principle 9: Clarity

Paragraph 53: “As one of the key recipients of risk management reports, the bank’s board is responsible for determining its own risk reporting requirements.”

Paragraph 55: “Senior management is one of the key recipients of risk reports and is also responsible for determining its own risk reporting requirements.”

What is the impact of the above? 

Regulators will expect to see evidence of documented risk reporting requirements, signed off by the board and senior management.
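
What might such documented, testable requirements look like? Here is a minimal sketch in Python, assuming a hypothetical report, owner and thresholds; the point is simply that a requirement written this way, rather than as vague prose, can be tested automatically.

    # Illustrative only: a board-level risk reporting requirement expressed
    # as data, with hypothetical names and thresholds.
    requirement = {
        "id": "RRR-001",
        "report": "Group credit risk exposure report",
        "owner": "Board Risk Committee",
        "frequency": "monthly",
        "max_delivery_days_after_month_end": 5,
        "min_completeness_pct": 99.0,
    }

    def is_met(delivery_days: int, completeness_pct: float, req: dict) -> bool:
        """Check a delivered report against the documented requirement."""
        return (delivery_days <= req["max_delivery_days_after_month_end"]
                and completeness_pct >= req["min_completeness_pct"])

    print(is_met(delivery_days=4, completeness_pct=99.6, req=requirement))  # True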

Where are yours?

Data aggregation and reporting principles – applied common sense

Principles for effective risk data aggregation and risk reporting

[Image: Basel Consultative Document – Data aggregation and reporting principles (BCBS 239)]

Those of you familiar with my blog will know that I am a fan of common sense.

I believe that data quality management requires you to apply common sense principles and processes to your data, and that the same common sense principles apply regardless of the industry you are in.

Your data will be unique, but the common sense questions you must ask yourself will be the same.  They include:

  • What MI reports do we need to run our business?
  • What critical data do we need in our MI reports?
  • Who owns and is responsible for gathering the critical data we need in our MI reports?
  • What should our critical data contain?
  • What metrics do we have to verify our critical data contains what it should? (see the sketch below)
  • etc…
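
To make the metrics question concrete, here is a minimal sketch of two basic checks, completeness and format validity, against a hypothetical critical field; the data, field names and expected format are illustrative assumptions.

    import re

    # Hypothetical critical data: customer records feeding an MI report.
    customers = [
        {"id": 1, "date_of_birth": "1975-03-14", "country": "IE"},
        {"id": 2, "date_of_birth": None,         "country": "GB"},
        {"id": 3, "date_of_birth": "14/03/1975", "country": "GB"},
    ]

    ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # expected format (assumed)

    populated = sum(1 for c in customers if c["date_of_birth"] is not None)
    valid = sum(1 for c in customers
                if c["date_of_birth"] and ISO_DATE.match(c["date_of_birth"]))

    print(f"date_of_birth completeness:    {100 * populated / len(customers):.0f}%")
    print(f"date_of_birth format validity: {100 * valid / len(customers):.0f}%")

Simple percentages like these, published regularly for each critical data item, are often enough to start the conversation between data producers and data consumers.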

Click on the image to see a document that lists what I regard as “common sense” data aggregation and reporting principles. They were published as a consultative document on 26th June 2012 by the Basel Committee on Banking Supervision (BCBS). The principles are commonly known as BCBS 239. The committee invited comments from interested parties, which are available at http://www.bis.org/publ/bcbs222/comments.htm. I co-operated with a group of fellow independent data professionals to comment, and you may see our comments at http://www.bis.org/publ/bcbs222/idpg.pdf. You may see the final version at http://www.bis.org/publ/bcbs239.pdf. The largest banks in the world (known as global systemically important banks, or G-SIBs) must comply by January 2016. Other banks, known as domestic systemically important banks (D-SIBs), must reach compliance three years after the date on which they were designated as such, which varies by bank; many received their designation during 2014.

While the document is targeted at risk management within the banking industry, the principles apply to all industries. The document explicitly refers to “risk data aggregation and risk reporting” – I suggest you ignore the word “risk” and read it as “data aggregation and reporting principles”.

Over the next while I plan to explore some of the principles proposed in the document, and the practical challenges that arise when one seeks to implement common sense data quality management principles. I welcome your input. If you have a specific question, let me know and I will do my best to answer it.

Risk data aggregation and risk reporting – Board and senior management responsibilities

The Queen’s Speech and Data Governance

The Queen of England made an historic and welcome visit to Ireland in 2011.  She delivered a memorable speech at the Irish State banquet, in which she said “With the benefit of historical hindsight, we can all see things which we wish had been done differently, or not at all”.

In real life, we cannot change the past.  The same does not apply to data created in the past.  Regulators now expect financial institutions to:

  • Identify data quality mistakes made in the past
  • Correct material mistakes
  • Implement data governance controls to prevent recurrences

I quote from the UK Financial Regulator’s requirement that all deposit holding financial institutions deliver a single customer view (SCV) of deposit holders:  “There may be a number of reasons why SCV data is not 100% accurate. This might be due to defects in the systems used to compile the SCV, but we would expect such defects to be picked up and rectified during the course of the systems’ development.”
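
As a sketch of what picking up such defects might look like in practice, one could validate every SCV record before submission. The field names and rules below are illustrative assumptions, not the regulator’s actual SCV schema.

    # Illustrative SCV record validation; fields and rules are assumptions.
    REQUIRED_FIELDS = ["customer_id", "full_name", "address", "aggregate_balance"]

    def scv_defects(record: dict) -> list:
        """Return a list of defects found in a single customer view record."""
        defects = [f"missing {f}" for f in REQUIRED_FIELDS
                   if record.get(f) in (None, "")]
        balance = record.get("aggregate_balance")
        if isinstance(balance, (int, float)) and balance < 0:
            defects.append("negative aggregate balance")
        return defects

    record = {"customer_id": "C-001", "full_name": "A. Smith",
              "address": "", "aggregate_balance": 12500.00}
    print(scv_defects(record))  # ['missing address']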

Dodd-Frank, Solvency II, FATCA, Basel III and many more regulations all impose similar requirements. Use this checklist to check whether your organisation suffers from any common Enterprise-Wide Data Governance Issues.

What data quality mistakes have you uncovered from the past, and how have you corrected them? I’d love to hear about them.

Do you know what’s in the data you’re consuming?

Standard facts are provided about the food we buy

These days, food packaging includes ingredients and a standard set of nutrition facts.  This is required by law in many countries.

Food consumers have grown accustomed to seeing this information, and now expect it. It enables them to make informed decisions about the food they buy, based on a standard set of facts.

Remarkable as it may seem, data consumers are seldom provided with facts about the data feeding their critical business processes.

Most data consumers assume the data input to their business processes is “right”, or “OK”.  They often assume it is the job of the IT function to ensure the data is “right”.  But only the data consumer knows the intended purpose for which they require the data.  Only the data consumer can decide whether the data available satisfies their specific needs and their specific acceptance criteria. To make an informed choice, data consumers need to be provided with facts about the data content available.

Data Consumers have the right to make informed decisions based on standard data content facts

The IT function, or a data quality function, can, and should, provide standard “data content facts” about all critical data, such as the facts shown in the example.

In the sample shown, a Marketing Manager wishing to mailshot customers in the 40-59 age range might find that the data content facts satisfy his/her data quality acceptance criteria.

The same data might not satisfy the acceptance criteria for a manager in the Anti Money Laundering (AML) area requesting an ETL process to populate a new AML system.
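
As a minimal sketch, here is how such data content facts might be produced for the age example above; the data and the plausibility threshold are illustrative assumptions.

    from datetime import date

    # Hypothetical customer data with some quality problems.
    customers = [
        {"id": 1, "birth_year": 1970},
        {"id": 2, "birth_year": None},   # missing
        {"id": 3, "birth_year": 1985},
        {"id": 4, "birth_year": 1880},   # implausible
    ]

    this_year = date.today().year
    total = len(customers)
    populated = [c for c in customers if c["birth_year"] is not None]
    plausible = [c for c in populated if this_year - c["birth_year"] <= 110]
    aged_40_59 = [c for c in plausible if 40 <= this_year - c["birth_year"] <= 59]

    print(f"birth_year populated: {100 * len(populated) / total:.0f}% of {total} records")
    print(f"birth_year plausible: {100 * len(plausible) / total:.0f}% of {total} records")
    print(f"customers aged 40-59: {len(aged_40_59)}")

The Marketing Manager might happily accept these facts; the AML manager, needing every record, would not. The facts themselves let each consumer apply their own acceptance criteria.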

Increasing regulation means that organisations must be able to demonstrate the quality and trace the origin of the data they use in critical business processes.

In Europe, Solvency II requires insurance and re-insurance undertakings to demonstrate that the data they use for solvency calculations is as complete, appropriate and accurate as required for the intended purpose. Other regulatory requirements, such as Dodd-Frank in the USA, Basel III and BCBS 239, also seek increasing transparency regarding the quality of the data underpinning our financial system.

While regulation may be a strong driving force for providing standard data content facts, an even stronger one is the business benefit to be gained from being informed. Some time ago, Gartner research showed that approximately 70% of CRM projects failed. I wonder whether the business owners of those CRM systems were shown data content facts about the data available to populate them.

In years to come, we will look back on those crazy days when data consumers were not shown data content facts about the data they were consuming.