Data assurance

30 September 2014


Data integrity is not a new requirement, but it is becoming increasingly important and trustees must have confidence in scheme data as:

  • sponsors push for de-risking strategies (e.g. commutation of trivial pensions, PIE exercises and buy-in/out);
  • potentially significant numbers of DB members seek transfers into DC arrangements; and
  • contracting-out ceases, generating a need for GMP reconciliations.

Such activities are expected to surge early next year, and members and trustees will make irrevocable financial decisions; scheme liabilities and benefits must therefore be calculated on complete, up-to-date and accurate data comprising:

  • common data (the 11 basic data requirements);
  • conditional data (all the data required to generate benefits and contingent benefits reliably);
  • member status (active, deferred, pensioner, dependant, transferred-out, dead); and
  • member addresses (you have to find people to be able to contact them).

What trustees should be asking

  • Are we sure our data is complete and up-to-date?
  • If not, do we know how we are going to rectify it, and how long that will take?
  • What should we do now to be ready?

The lead time for data audit and cleansing can be long and will rely on resources within third parties, e.g. HMRC.

Reconciling GMPs with HMRC can be long and laborious. In 2016, contracting-out will end and there will be a rush to complete GMP reconciliation exercises. The longer you leave it, the more costly and onerous it will be.

If your data has not been fully reviewed for over a year, do it now and ensure that the depth of review is appropriate. For example, a tPR-style data review might not satisfy a buy-out provider and thus might not secure a good price. Some providers will refuse to quote if you cannot demonstrate current, deep and high levels of data assurance.

A good data review process will tell you how good your data is and give options and recommendations for cleansing. Cleansing does not have to be manually intensive. Modern pensions systems should afford a range of options for bulk cleansing if you have access to data manipulation and programming skills. Manual cleansing should be limited to data that genuinely require it, e.g. legacy member data stored in archive boxes. Time spent critiquing the process up front should ensure efficiency, cost-effectiveness and thus value for money.
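By way of illustration only, the following is a minimal sketch of how a bulk completeness check might be scripted against a member data extract. The file name, field names and validation rules are hypothetical; any real exercise would be tailored to your administration system and to the scheme's benefit structure.

```python
import csv
import re
from datetime import date, datetime

# Hypothetical extract and column names; real administration system extracts will differ.
NI_PATTERN = re.compile(r"^[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]$")  # basic NI number format check
VALID_STATUSES = {"active", "deferred", "pensioner", "dependant", "transferred-out", "dead"}

def check_record(row):
    """Return a list of data issues found in one member record."""
    issues = []
    ni = row.get("ni_number", "").replace(" ", "").upper()
    if not NI_PATTERN.match(ni):
        issues.append("invalid or missing NI number")
    try:
        dob = datetime.strptime(row.get("date_of_birth", ""), "%Y-%m-%d").date()
        if not (date(1900, 1, 1) <= dob <= date.today()):
            issues.append("implausible date of birth")
    except ValueError:
        issues.append("missing or malformed date of birth")
    if not row.get("address_line_1", "").strip() or not row.get("postcode", "").strip():
        issues.append("incomplete address")
    if row.get("status", "").lower() not in VALID_STATUSES:
        issues.append("unrecognised member status")
    return issues

# Build an exception report: only records with problems are listed,
# so manual effort is targeted rather than applied wholesale.
with open("member_extract.csv", newline="") as f:
    exceptions = [
        (row.get("member_id", "?"), problems)
        for row in csv.DictReader(f)
        if (problems := check_record(row))
    ]

for member_id, problems in exceptions:
    print(member_id, "; ".join(problems))
```

An exception report of this kind can then be split between issues suitable for further bulk correction (e.g. standardising postcodes) and those needing manual investigation.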

In addition, do not forget the value of tracing exercises, which can be quick and painless and help you ensure, for example, that you have captured deaths (particularly among your deferred population), addresses and spouses/dependants. Sophisticated services can find dependants and their dates of birth, thereby enhancing data integrity and reducing the need to apply assumptions.

Once you have improved your data, look at ways of keeping quality high through future-proofing activities, ranging from better administrative controls to regular search services, e.g. death alerts and tracing.

Remember, if your data is not reliable, you cannot be confident that you will deliver the right benefits to the right people at the right time, whether historically, now or in the future.

Contact
Lorraine Harper, Director
lorraine_harper@jltgroup.com
020 7895 7822