Increasing Data Standards within ‘The New Normal’
Against the backdrop of the dynamic ‘New Normal’ landscape, there is an opportunity for banks to start investing in cleaning up and simplifying their data models. With a possible extension to BCBS239 on the horizon, Nick Weisfeld (Co-Head of Data Science UK, GFT) explains why the focus on data standards will increase and why banks should start this process sooner rather than later.
As regulators become more familiar with trade reporting regimes and with managing the growing volume of data now held inside trade repositories, banks’ internal data models could face greater scrutiny. Regulators themselves face the difficult challenge of quickly establishing how to analyse and interpret this information, as well as deciding how they intend to use it. There is a distinct possibility that once regulators begin to make sense of the data given to them, they may see things they don’t like and discover that the data models inside many banks are actually quite poor.
It is clear, however, that the data models inside investment banks have evolved in a very siloed way, with different businesses applying different standards. Data is scattered across systems and is sometimes duplicated. This partially explains why trade reporting has proved so costly for banks; it also makes clear that the required response is a massive overhaul of these systems.
The demands of the regulators
A key risk facing banks is that regulators may want to understand more about these internal models and processes, much as we have seen with the Basel Committee on Banking Supervision’s Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS239). The increase in the volume of data submitted to regulators will prove a challenge to process, as supervisors aim to identify systemic risk and provide greater transparency in the market. Achieving these goals will require a great deal of work, particularly in terms of the technology needed.
Consequently, the focus on data standards is certain to increase, driven both by regulators and by senior management inside organisations. Internal drivers will emphasise the need to have reports available for intra-day decision making around asset management, financing and liquidity. In the future, banks will have to manage their balance sheets and risk in a more controlled and stringent way.
Banks will also need to be able to make decisions around asset optimisation, liquidity and funding on an intra-day basis. In most cases their current data models cannot support these requirements. The question we need to ask is whether regulators will acknowledge that this is a huge challenge for banks and give them sufficient time to implement any new data standards that emerge.
Recognising the value of data
To meet the demands of this increased scrutiny, banks will need to improve their data models. To begin with, the value of data inside an organisation needs to be recognised and promoted, and data handlers need to be incentivised to value the data they produce or touch. A culture of producing and maintaining high quality data is rare in financial services organisations – something that will need to change.
Additionally, banks have to improve their data governance, and they should begin by establishing a ‘data organisation’ approach that is supported and empowered by senior managers. From an organisational point of view, there needs to be a structure that overcomes traditional boundaries and enables a consistent data operating model to take hold.
Finally, banks need to agree on a comprehensive data operating model, along with comprehensive data standards. The most important objective of an enterprise data operating model is to elevate the value of data within the enterprise. Data communication and incentives that promote positive cultural change within the organisation should be encouraged, whilst unstructured data manipulation should be discouraged.
Ultimately, regulators want to understand data model taxonomies and the processes for aggregating data. Are these internal processes and data supporting, or leading to, better data aggregation and risk reporting practices? If these models are not up to the standard expected, regulators will want to know why, and what steps and strategies firms intend to take to improve their models and processes.
This is therefore a crucial opportunity for banks to start investing in cleaning up and simplifying their data models. With a possible extension to BCBS239 in this area, the focus on data standards will only increase further, forcing banks to invest more resources in cleaning up their data models.
As a result, it is imperative that banks start this process sooner rather than later.
Find out more by reading our paper: Surviving in the New Normal.
A version of this blog post also appeared on Finextra.