Economies that embrace the adoption of open-data ecosystems for finance could see GDP gains of between 1 and 5 percent by 2030, with benefits flowing to consumers, financial institutions and authorities.
Going beyond greater convenience and easier interactions between financial sector incumbents, open data also helps to address fast-emerging challenges for the financial sector, such as environmental, social and governance (ESG) disclosure, anti-money laundering (AML) and counter-terrorist financing (CFT) measures.
Access, use and share: the trifecta of open data
Open data standards mark an inflection point for supervisory and regulatory authorities, helping them unleash the full potential of open-data ecosystems. By open data, we mean the ability to share financial data through a digital ecosystem in a manner that requires limited effort or manipulation. In an era defined by rapid technological advancement and a fast-evolving business and regulatory environment, access to good-quality and timely data for supervisors and policymakers is key to the sustainable growth of the economy.
Navigating a crowded landscape of data standards, including SDMX, DDI, ISO 20022 and XBRL/DPM, each designed for a different purpose, the big question is: how can they be made interoperable and effectively integrated within the data value chain? And to what extent could such interoperability go beyond greater convenience and easier exchange of information within government departments?
These and other topics were at the centre of discussions held recently at the 9th SDMX Global Conference under the theme ‘Empowering Data Communities’, co-organised by Bahrain’s Information and eGovernment Authority (iGA) and a group of international organisations including the United Nations (UN), the World Bank Group, the International Monetary Fund (IMF), the Organisation for Economic Co-operation and Development (OECD) and the Bank for International Settlements (BIS).
The struggle for simplicity: Learning from the nature of LEGO bricks
LEGO bricks are a fantastic tool for building, and, by extension, an analogy for expanding and experimenting with new things while playing: they are organised and predictable, yet at the same time open the door to limitless innovative possibilities.
What if we applied this approach to encompassing different standards and data definitions?
Applying a similar philosophy to the realm of standards means taking a step back and recognising that, at their core, many of them share numerous similarities, approaching the subject in much the same way, if not identically. By acknowledging these commonalities, we can embrace the standards comprehensively, extracting the maximum value from them to truly elevate data governance frameworks.
With this message, Michal Piechocki, Chairman of the Board at BR-AG, addressed the community of official-statistics compilers and users from national, regional and international agencies, academia and the private sector during a capacity-building session held jointly with Barend DeBeer, Senior Economist at the South African Reserve Bank, at the 9th SDMX Global Conference.
Numerous studies show that our brains are wired to be drawn to things that make sense and to shut down when presented with confusing information; processing it literally wastes calories. Nothing needs to be this complicated, a daily lesson that kids keep teaching us.
Observing kids playing with LEGO, we may learn a lot from the simple yet clear minds of young explorers and start to think outside the box. Interestingly, the concept of creation comes naturally to them: they do not classify or structure the bricks in advance but use them as needed, classifying intuitively at the moment of use. One of the most fascinating things about LEGO is its seemingly infinite variety of pieces, each with different markings, colours, sizes, shapes and purposes. Building with LEGO takes invention and a logical, structured form of creativity to produce different purpose-built structures, but the focus is always on finding the right fit for every piece.
Bringing the LEGO analogy to the management of supervisory and prudential statistics, payment statistics, financial market data and the other types of data that supervisory authorities handle, it is worth looking for similarities in the simple building blocks across standards and definitions.
At the same time, as authorities’ data sets expand exponentially, the need to harness and channel such vast insights in the right direction becomes more pronounced, and the role of artificial intelligence (AI) and machine learning (ML) in this process adds an extra layer of importance. It has become clearer than ever: as data expands and the infrastructure needs broadening, a new LEGO brick can simply be added. That is the LEGO-style simplicity that data standards can bring.
The South African Reserve Bank (SARB) offers a great example of an authority encompassing different standards in one organisation. While the DPM and XBRL standards are already adopted and well known to SARB’s teams, the journey towards SDMX in the Economic Statistics Department (ESD) has only just started, and the authority is gradually building capability in this area.
To add context: DPM is a standard (defined in ISO 5116) developed to enable structured data collection and reporting in the financial industry. It provides a common framework for defining data points, their relationships and associated metadata. It helps SARB ensure data consistency and accuracy, making it an integral part of regulatory reporting and data analysis.
You need a Castle to become a King
Veering away from the idea, rooted deeply even among IT specialists, that independence is efficient, supervisors are now on the road to building a harmonised, connected approach to digitising data and leveraging multiple data standards.
Wishful thinking about the efficient independence of different teams or data standards operating simultaneously quickly gets out of control once you realise that you may have to duplicate effort across standards, turning data sets into innumerable new building blocks with no real concept of an end tower in mind.
At this stage, not only are time and money wasted, but the likelihood of making the right data connections is also put at risk, given the complexity of each data standard and the many unique differences between standards that must be taken into account.
Our ATOME Platform was designed to serve as a common ‘castle’: a place where standards such as XBRL/DPM, SDMX and more find a common layer, and internal teams find a space to collaborate effectively. Such a powerful sync positions the data authority as the ‘king’, reigning over harmonised, interoperable and accessible data flexibly, on its own terms.
Learn more about our related Data Standardisation services and the approach we apply.
Discover more about ATOME solutions and see them in action.