In the early days of June, the United Nations Capital Development Fund (UNCDF) hosted a webinar series focusing on enhancing the benefits of remittances through data technology, inviting Marta Kuczyńska, Head of Business Development at BR-AG, to share insights on using SupTech solutions for effective supervision.
What is Supervisory Technology (SupTech)?
SupTech is high on the agendas of regulators around the world. But what exactly lies behind the term? Supervisory Technology, more often referred to by its shorter name, SupTech, encompasses all technology solutions that can make the supervision of regulated entities more effective.
Supervision spans broad areas, and one of the most prominent is data reporting, from collection to analysis. Requesting data from supervised entities can foster better analytics for decision-making and policymaking.
At BR-AG, we have seen this come to life through many of our projects. At the same time, we also want to focus on what’s under the surface of better data management. Data standards are what lie under the hood of SupTech designs and connect all the “data dots”.
Each SupTech project, such as building a transaction reporting system or payment system, tells a story (with local requirements and context) coded in policies and regulations. And regulations call for specific data.
When introducing policy changes or new regulations (such as those regarding payments and remittances), it is important to distil the information layer and translate policies and legal provisions into data requirements. This leads to the design of “fit for purpose” data ecosystems, and data standards are their foundation – for data description, classification, and exchange. Analysing supervisors’ requirements, we find a shared objective: the need for trusted, high-quality data.
Data standards as the backbone of the financial ecosystem
When looking at technology solutions to automate data flows and deepen data insights, supervisors are tempted to start right away with “buzzwordy” technologies, such as AI, or with tools for automated reporting. However, the devil is in the details – our advice would be to first build a consistent, inclusive data reporting ecosystem.
It’s important to look in parallel at the building blocks provided by existing standards, which are often open and come with reusable models. Such standards ultimately help central banks and supervisors introduce automation for collection and analytics, and they foster connectivity between systems and transparency for new entrants, such as new payment service providers, because relying on a common standard enables interoperability across different platforms and providers.
When it comes to cross-border payments, achieving interoperability is said to be expensive and to drive up the cost of processing transactions. Shared standards, though, may ultimately make the whole ecosystem more inclusive and cost-effective.
In Europe, payments in EUR – more specifically, payments that go through the European Central Bank’s TARGET2 system – will soon be exchanged only in ISO 20022, an international standard for electronic messages between financial institutions. The Bank of England is taking a similar path in the UK.
Apart from enabling interoperability of systems and lowering transaction costs, the ISO 20022 standard carries a lot of potential for analytics of payments data. The standard defines building blocks and design patterns for payment messages, and there is a catalogue of existing messages that can be reused.
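To make this concrete, here is a heavily simplified (and not schema-valid) sketch of a single transaction block inside an ISO 20022 pacs.008 credit transfer message. The element names follow the standard’s naming conventions, but all values are invented for illustration:

```xml
<!-- Simplified excerpt of an ISO 20022 pacs.008 (FI-to-FI customer credit
     transfer) transaction block; values are illustrative, not real data. -->
<CdtTrfTxInf>
  <PmtId>
    <EndToEndId>REMIT-2021-0001</EndToEndId>  <!-- end-to-end reference -->
  </PmtId>
  <IntrBkSttlmAmt Ccy="EUR">250.00</IntrBkSttlmAmt>
  <Dbtr><Nm>Sender Name</Nm></Dbtr>           <!-- originator -->
  <Cdtr><Nm>Recipient Name</Nm></Cdtr>        <!-- beneficiary -->
  <Purp><Cd>SALA</Cd></Purp>                  <!-- purpose code (here: salary) -->
</CdtTrfTxInf>
```

Because every implementation shares these building blocks, any system that understands the standard can read the same message – which is precisely what makes the catalogue of messages reusable.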
Importantly, the data carried in such standardised payment messages can be enriched: existing messages can be made more precise to match the data needs of a given jurisdiction, country, or region.
Payments initiatives around the world
One example that is useful for analytics is the purpose code. The ‘purpose code’ field allows payment originators to input a specific code denoting the reason for the payment. There are also messages relevant specifically for remittances. For cross-border payments and reporting, it is worth having a look at the set of usage guidelines called Cross-Border Payments and Reporting Plus (CBPR+).
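As a hedged sketch of how a supervisor’s collection pipeline might pull such analytics-relevant fields out of incoming messages (this is not an official parser – the fragment and values below are invented, though the element names follow the ISO 20022 pacs.008 structure):

```python
import xml.etree.ElementTree as ET

# Simplified ISO 20022-style credit transfer fragment. Illustrative values;
# real pacs.008 messages are schema-validated and carry many more elements.
SAMPLE = """\
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08">
  <CdtTrfTxInf>
    <IntrBkSttlmAmt Ccy="EUR">250.00</IntrBkSttlmAmt>
    <Purp><Cd>SALA</Cd></Purp>
  </CdtTrfTxInf>
</Document>
"""

def extract_purpose_and_amount(xml_text: str):
    """Return a (purpose_code, currency, amount) tuple per transaction."""
    root = ET.fromstring(xml_text)
    results = []
    # The '{*}' prefix matches any XML namespace (Python 3.8+), so the same
    # code works across message versions whose namespace URIs differ.
    for tx in root.findall('.//{*}CdtTrfTxInf'):
        purpose = tx.find('./{*}Purp/{*}Cd')
        amount = tx.find('./{*}IntrBkSttlmAmt')
        results.append((purpose.text if purpose is not None else None,
                        amount.get('Ccy') if amount is not None else None,
                        float(amount.text) if amount is not None else None))
    return results

print(extract_purpose_and_amount(SAMPLE))  # [('SALA', 'EUR', 250.0)]
```

Once purpose codes are extracted in this structured way, aggregating payment flows by reason – salaries, goods, remittances – becomes a straightforward analytics task.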
Guidelines such as CBPR+ provide a common language for cross-border payments. One promising way to apply this lingua franca is through regional cross-border payments initiatives. Many regions have focused on developing and enhancing regional payment schemes and systems to support trade and other forms of economic and financial integration, e.g. the West African Economic and Monetary Union (WAEMU).
The Committee on Payments and Market Infrastructures (CPMI) is one of the advocates of the use of common data elements. Common identifiers such as the LEI (Legal Entity Identifier), the UPI (Unique Product Identifier), and the UTI (Unique Transaction Identifier) bring value both for interoperability and for analytics of payments.
These identifiers are all part of a common language that lets systems, service providers, supervisors, and other stakeholders better analyse data flows. They may also be considered when shaping regulatory frameworks – for banks, payment service providers, micropayments, remittances, and others – that will call for such data.
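Common identifiers also enable automated data quality checks. The LEI (ISO 17442), for instance, ends in two check digits computed with the ISO 7064 MOD 97-10 scheme (the same family used for IBANs), so a collection system can reject a mistyped entity identifier before the data ever reaches analytics. A minimal sketch – the 18-character prefix below is invented, not a registered entity:

```python
def _to_number(s: str) -> int:
    # ISO 7064 MOD 97-10: letters map to two-digit numbers (A=10 ... Z=35),
    # digits map to themselves; the result is read as one big integer.
    return int(''.join(str(int(ch, 36)) for ch in s))

def lei_check_digits(prefix18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    return f"{98 - _to_number(prefix18 + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """True if a 20-character LEI passes the MOD 97-10 check (value == 1)."""
    return (len(lei) == 20
            and lei.isalnum()
            and _to_number(lei) % 97 == 1)

# Hypothetical LEI built from an invented 18-character prefix:
prefix = "5299001ABCDEFGHI00"
lei = prefix + lei_check_digits(prefix)
print(is_valid_lei(lei))                        # True
print(is_valid_lei(lei.replace("A", "B", 1)))   # False: one typo fails the check
```

The same pattern – a standard identifier plus a cheap mathematical validity check – is what makes machine-to-machine reporting pipelines robust at scale.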
Taking a step back is as necessary as moving forward
Data standards are vital to ensure interoperability in payment systems. In the long term, they significantly reduce the cost of processing payments and allow regulators to supervise the market effectively at a lower cost. Even though new technologies can be tempting, we shouldn’t forget what is most crucial to their implementation – structured data.
We have seen this across the vast majority of projects we have participated in, such as developing a sustainability reporting concept and supporting the development of the European Single Access Point (ESAP). We have also developed our own cloud-based platform, ATOME Matter, to map complex regulations into precise and easily navigable data concepts.