Solvency II: Data Quality and Consistency

The introduction of Solvency II on 1 January 2016 will mark a critical step in the regulation of the insurance sector in the EU. In the second of four blogs outlining the issues, we consider the data quality and consistency challenges confronting insurers seeking compliance with the new regulations.

As well as defining, documenting, implementing and monitoring processes that hold data management to rigorous standards, insurers are responsible for the quality of the data itself.

The Solvency II directive cites three key criteria for data quality which insurers must measure: accuracy, completeness and appropriateness. Legal responsibility for data quality ultimately lies with the insurer, regardless of how the data is sourced or aggregated.
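The directive leaves the measurement of these criteria to insurers. As a minimal sketch of what completeness and accuracy checks on an asset data feed might look like (the field names, tolerance level, and check logic here are illustrative assumptions, not anything prescribed by the directive):

```python
# Illustrative data-quality checks for an asset data feed.
# Field names and the tolerance are hypothetical examples, not
# requirements taken from the Solvency II directive.

REQUIRED_FIELDS = ["isin", "market_value", "asset_class", "rating"]

def completeness(records):
    """Fraction of records in which every required field is populated."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in REQUIRED_FIELDS)
    )
    return complete / len(records)

def accuracy_exceptions(records, reference_values, tolerance=0.01):
    """Return ISINs whose market value deviates from an independent
    reference source by more than the given relative tolerance."""
    exceptions = []
    for r in records:
        ref = reference_values.get(r.get("isin"))
        if ref and r.get("market_value"):
            if abs(r["market_value"] - ref) / ref > tolerance:
                exceptions.append(r["isin"])
    return exceptions
```

Checks of this kind produce the quantitative evidence — error rates, exception lists — that an insurer can point to when demonstrating its data governance to a regulator.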

There are a number of challenges and risks associated with these stringent data governance requirements which insurers will need to address:

  • Stretched resources – Most insurers lack the infrastructure and expertise to manage vast volumes of data and will need to outsource aspects of data quality management, creating an oversight risk.
  • No clear metrics for data quality – There are currently no specific quantitative guidelines in place in relation to acceptable error rates; for instance, how might tolerance levels for critical and immaterial data differ?
  • Data usage must be consistent – A siloed approach to addressing the three Pillars (Capital Requirements, Governance and Supervision, Transparency and Reporting) often overlooks the need for consistency of data across all three.
  • The cost of poor or inconsistent data – Regulators can require insurers to set aside significant additional cash to satisfy the Pillar 1 Solvency Capital Requirement (SCR) if they have not been provided with the required data to classify or assess an asset correctly.
  • Additional regulatory risk – If an insurer fails to achieve detailed and consistent transparency of investment data, the regulator may specify improvements and require ad hoc reporting until these have been fully implemented.

It is vital to take steps to address these challenges in order to minimise the associated business risks. If not directly managing data quality themselves, insurers should be able to refer to documentation that details the robust and systematic data management frameworks their data vendors have in place. What’s more, given that data supplied by third parties is unlikely to be 100% accurate, insurers should have a data remediation plan in place, particularly when aggregating data from a number of sources.
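One concrete element of such a remediation plan is reconciling the same holding across vendor feeds and flagging discrepancies for follow-up. A sketch, assuming feeds keyed by ISIN and a hypothetical 0.5% tolerance (neither is taken from the regulations):

```python
# Illustrative reconciliation of two vendor feeds, keyed by ISIN.
# Field names and the 0.5% tolerance are hypothetical assumptions.

def reconcile(feed_a, feed_b, tolerance=0.005):
    """Compare market values for holdings present in both feeds and
    return a remediation list of (isin, value_a, value_b) mismatches."""
    by_isin_b = {r["isin"]: r for r in feed_b}
    mismatches = []
    for r in feed_a:
        other = by_isin_b.get(r["isin"])
        if other is None:
            # Missing from feed B: a completeness issue, logged separately.
            continue
        a, b = r["market_value"], other["market_value"]
        if b and abs(a - b) / b > tolerance:
            mismatches.append((r["isin"], a, b))
    return mismatches
```

The output is a worklist: each mismatch is investigated, traced back to its source, and corrected before the data flows into capital calculations or reporting.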

Having sourced and aggregated high quality data, insurers should make certain that internal data flows are set up to use this data consistently across the three Pillars of Solvency II. Accuracy and consistency will minimise the risk of regulators tying up potential investment capital by imposing add-ons to insurers’ Solvency Capital Requirement. Should an insurer fail to meet the required standards for quality, an ability to process and report data rapidly will help to address regulators’ concerns quickly.

For more on this topic, please see our latest whitepaper “Solvency II: Understanding the Data Implications”. Look out for next week’s post, which will discuss differing methodologies for calculating Solvency Capital Requirements under Solvency II.
