Software requirements for Solvency II


Accurate transfer of data is critical in the highly complex and fast-moving Financial Services sector. Since its introduction, the Solvency II directive has increased the need for the timely transfer of data to the regulatory authorities. This guide outlines how data quality and data integration software can be optimised to meet the Solvency II requirements.

A quick overview of Solvency II

Solvency II redefines the required capital for insurance companies to operate in the EU. The aim is to reduce the risk of insolvency and increase consumer protection. The framework has three main pillars:

  • Pillar 1 defines the quantitative requirements, such as the amount of capital an insurer should hold;
  • Pillar 2 outlines the requirements for the governance and risk management of insurance companies;
  • Pillar 3 focuses on disclosure and transparency requirements.

For a company to adhere to the Solvency II requirements, its data strategy for the three pillars must be detailed and robust. Low data quality and poor transfer strategies increase the potential of inaccurate risk capital calculations.

Pillar 1 directives

There are two possible approaches to collating the information needed to calculate Pillar 1’s Technical Provisions (TP) and Solvency Capital Requirement (SCR):

  • Wrap the data around the directives. Centralising information into a single data store allows calculations and audits to be performed against the store;
  • Wrap the directives around the data. This involves ad hoc consolidation of information, as required, to perform the necessary calculations and audits.

Both approaches have similar software requirements. The main difference is that the ad hoc option requires the data to be migrated and consolidated on a frequent basis to allow the calculation of the TP and SCR.

Data migration requirements

The optimal data migration software capabilities include:

  • Data connections to the various silos, with clean, transparent mapping logic to define business rules and system-level rules;
  • Access to all data source types, including databases, XML, flat files, Excel and custom structures, with a generic adapter capability;
  • Manipulation of data and structures, including:
      • De-duplication of records based on surrogate or natural keys (illustrated in the sketch after this list);
      • Addition and manipulation of relationships within the data models, for access to data and performance gains;
  • Data cleansing capabilities.
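
As an illustration, the sketch below shows what field mapping followed by de-duplication on a natural key might look like in Python. The record layout, field names and mapping rules are hypothetical and not taken from any particular system.

    # Illustrative only: map hypothetical source records to a target layout,
    # then de-duplicate on the natural key (the policy number).
    from typing import Dict, List

    def map_record(source: Dict) -> Dict:
        """Apply simple, hypothetical mapping rules to one source record."""
        return {
            "policy_number": source["POLICY_NO"].strip(),
            "holder_name": source["HOLDER"].title(),
            "premium": float(source["PREMIUM"]),
        }

    def deduplicate(records: List[Dict]) -> List[Dict]:
        """Keep the first record seen for each natural key."""
        seen, unique = set(), []
        for rec in records:
            if rec["policy_number"] not in seen:
                seen.add(rec["policy_number"])
                unique.append(rec)
        return unique

    raw = [
        {"POLICY_NO": " P-1001 ", "HOLDER": "jane doe", "PREMIUM": "120.50"},
        {"POLICY_NO": "P-1001", "HOLDER": "JANE DOE", "PREMIUM": "120.50"},
    ]
    consolidated = deduplicate([map_record(r) for r in raw])  # one record remains

In practice, mapping and de-duplication rules would normally be held as configuration in the migration tool rather than hard-coded.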

Data quality requirements

Data quality checks should be carried out prior to migrations to assess the current quality of the data and highlight any issues. Software should be able to perform data verification during the migration, validating the data against business and system-level rules. Ideally, the software should also use these rules to validate data as it is transferred between internal data silos or exchanged with external third parties.

Example rules (an illustrative encoding follows the list):

  • The value date of a policy must be after the birth date of a customer;
  • The value must match an enumerated list, such as the gender of a customer being male, female or a company;
  • An insurance policy book code must match an entry in the deal book table;
  • The type of policy must be consistent with the value of the premium periodicity.
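
A minimal sketch of how the first three rules might be encoded in Python; the field names and reference values are hypothetical, and a real rule set would be driven by configuration rather than hard-coded:

    # Illustrative validation rules with hypothetical field names.
    from datetime import date

    VALID_GENDERS = {"male", "female", "company"}
    DEAL_BOOK_CODES = {"DB01", "DB02"}  # in practice, loaded from the deal book table

    def validate_policy(policy: dict) -> list:
        """Return the list of rule violations for one policy record."""
        errors = []
        if policy["value_date"] <= policy["birth_date"]:
            errors.append("value date must be after customer birth date")
        if policy["gender"] not in VALID_GENDERS:
            errors.append("gender must be one of the enumerated values")
        if policy["book_code"] not in DEAL_BOOK_CODES:
            errors.append("book code not found in deal book table")
        return errors

    violations = validate_policy({
        "value_date": date(2013, 5, 1),
        "birth_date": date(1980, 2, 14),
        "gender": "male",
        "book_code": "DB99",
    })
    # violations == ["book code not found in deal book table"]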

Data requires continuous verification: information can contain a multitude of anomalies when entered by users, or when transferred between systems or from third parties. Accuracy, range, and structural verification are a few of the checks required to maintain fit-for-purpose data.

Data reconciliation requirements

Ideally, the software should provide reconciliation by consolidating information directly from the data stores and comparing it against the official data records.
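
For example, a reconciliation step might aggregate figures from the consolidated store and compare them with the officially reported totals, flagging any break for investigation. A minimal sketch with made-up data:

    # Illustrative reconciliation: compare per-book premium totals computed
    # from the consolidated store against the official reported figures.
    from collections import defaultdict

    store_rows = [("DB01", 120.50), ("DB01", 80.00), ("DB02", 45.25)]
    official_totals = {"DB01": 200.50, "DB02": 45.25}

    computed = defaultdict(float)
    for book_code, premium in store_rows:
        computed[book_code] += premium

    breaks = {
        code: (computed.get(code, 0.0), total)
        for code, total in official_totals.items()
        if abs(computed.get(code, 0.0) - total) > 0.01
    }
    # breaks is empty here; any entry would be a mismatch to investigate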

Logging and error reporting requirements

During data migration and data verification, logging and error reporting are essential. They should give the user detailed information about any data that has failed to adhere to the business and system-level rules.
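
As a sketch, failed records might be reported through Python's standard logging module, recording which rule rejected which record; the record identifier and message below are hypothetical:

    # Illustrative error reporting for records that fail validation.
    import logging

    logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s")
    log = logging.getLogger("migration")

    def report_failures(record_id: str, errors: list) -> None:
        """Log each rule violation so the failing record can be traced and fixed."""
        for err in errors:
            log.error("record %s rejected: %s", record_id, err)

    report_failures("P-1001", ["book code not found in deal book table"])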

Pillar 2 directives

Using specialist software to implement parts of the enterprise management system demonstrates to the authorities that the company handles its data with dedicated tools and has a stable platform for data migration, data verification and data cleansing.

For maximum efficiency, it is important that the software provides future-proofing by:

  • Updating model structures as new versions of central stores or the various data silos are created;
  • Making change requests to the business logic in accordance with future directives;
  • Connecting to new systems when new data silos are brought online, or when existing silos need to be joined to access more data as the directives progress (see the sketch after this list).
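
One way to picture this flexibility is an adapter registry, where each new silo needs only a new reader rather than changes to the core pipeline. The sketch below is hypothetical and not any particular product's design:

    # Illustrative adapter registry: new data silos plug in as new readers
    # without changes to the core extraction logic.
    from typing import Callable, Dict, Iterable

    ADAPTERS: Dict[str, Callable[[str], Iterable[dict]]] = {}

    def register(source_type: str):
        """Decorator that registers a reader function for one source type."""
        def wrap(reader: Callable[[str], Iterable[dict]]):
            ADAPTERS[source_type] = reader
            return reader
        return wrap

    @register("csv")
    def read_csv(path: str) -> Iterable[dict]:
        import csv
        with open(path, newline="") as fh:
            yield from csv.DictReader(fh)

    def extract(source_type: str, location: str) -> Iterable[dict]:
        """Core entry point: unchanged when new adapters are registered."""
        return ADAPTERS[source_type](location)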

Using a specialised tool to provide data management services creates a high level of confidence in the processes being created and implemented.

Pillar 3 directives

The data management software should provide a clear, stable platform for delivering reports to the regulatory body. For instance, our Transformation Manager software uses its migration functionality to generate XBRL files composed of data from the centralised store or workflow data. This involves either (illustrated after the list):

  • Consolidating data from data stores to populate the reports and automatically formulating it into XBRL based on business rules; or
  • Formulating data into XBRL directly from a single source based on business rules.
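
The sketch below is not Transformation Manager's API; it simply illustrates, with Python's standard library and made-up element names, what formulating consolidated figures into an XBRL-style XML instance involves. A real submission would follow the relevant Solvency II taxonomy, with proper contexts and units.

    # Illustrative only: build a minimal XBRL-style instance document.
    # Element names, context and values are made up and do not follow
    # any real Solvency II taxonomy.
    import xml.etree.ElementTree as ET

    figures = {
        "TechnicalProvisions": "1500000",
        "SolvencyCapitalRequirement": "900000",
    }

    ET.register_namespace("xbrli", "http://www.xbrl.org/2003/instance")
    root = ET.Element("{http://www.xbrl.org/2003/instance}xbrl")
    for name, value in figures.items():
        fact = ET.SubElement(root, name, contextRef="reporting-period")
        fact.text = value

    ET.ElementTree(root).write("report.xbrl", xml_declaration=True, encoding="utf-8")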

Using a specialised tool like Transformation Manager reduces risk by addressing low-level core component issues directly. This enables timely intervention and correction, preventing costly reruns further down the line.
