By Phil Flood, Global Business Development Director, Regulatory and STP Services at Gresham Technologies
Tolerance for poor data quality and reporting errors is waning fast; regulators are cracking down. And with regulatory complexity and scrutiny on the rise, the time for turning a blind eye is over.
But despite the increased regulatory focus, the desired effect – reduced reporting errors and higher data quality – has not been achieved. This raises the question: is fear by itself enough to compel financial institutions to up their game when it comes to data quality?
Ensuring high-quality, accurate data has never been more important to the reporting process – or less straightforward – for financial institutions across the globe. As a result of manual processes, legacy systems and silos, complexity is rife and mistakes are commonplace. And these problems are only exacerbated by low confidence in data, driven by regulatory divergence across jurisdictions.
There are two main approaches for motivating firms to achieve regulatory compliance: the carrot and the stick.
The stick often comes in the form of fines. According to ESMA’s Sanctions report, fines imposed by National Competent Authorities (NCAs) under MiFID II more than quadrupled in value in 2020, reaching an aggregated €8.4 million (comprising 613 sanctions and measures), compared to just €1.8 million (371 sanctions and measures) the year prior.
Yet data integrity and reliability have not improved in parallel. ESMA’s EMIR and SFTR 2020 data quality report, released in April 2021, highlighted this fact in detail for the first time since the European Market Infrastructure Regulation (EMIR) came into effect over seven years ago.
Under EMIR requirements, around 7% of daily submissions are currently being reported late by counterparties. Additionally, up to 11 million open derivatives did not receive daily valuation updates, and there were between 3.2 and 3.7 million open non-reported derivatives on any given reference date during 2020. Approximately 47% of open derivatives (totalling circa 20 million) remain unpaired.
And these problems are further compounded by firms’ attempts to re-use existing legacy solutions already prone to data quality issues. This can be seen in many firms’ approach to SFTR, a regulation that many have viewed as a close enough cousin to EMIR to simply hit copy and paste.
One thing, however, is clear from these ongoing challenges: while the stick has a role to play, on its own it is not enough to address the issue of low data quality.
Rather than simply penalising financial institutions for poor data quality and regulatory reporting, regulators may find it more effective to help firms realise the potential that strong data integrity can bring – such as reduced costs, increased efficiencies, and a more competitive business offering – thereby encouraging the prioritisation of data integrity at the C-suite level. This is where the carrot comes in.
That’s not to say that the carrot and the stick must be mutually exclusive. Indeed, to ensure financial institutions adhere to high data quality standards, regulators should employ both in parallel to maximise results.
By proving the benefits of high data quality, regulators can help financial institutions get on the front foot in cases where the threat of fines is not enough. Above all, regulators must clearly convey that the businesses that will prove successful will not be those acting out of fear, but out of ambition.