Better Data Part II: Be effective
The importance of an effective data governance programme
The financial industry collects enormous volumes of data every day, yet progress in the area of data governance has been slow. We examine the tangible benefits of improving data management, showing that an enterprise-wide data strategy drives operational efficiencies, controls costs, meets regulatory commitments and allows firms to make data-driven business decisions. The pressure to implement effective data governance programmes is mounting as financial institutions struggle to cope with the volume, pace and concurrency of regulatory change.
The disruptive influence of regulation is forcing financial institutions to re-evaluate and change the way they do business. As margins shrink across the board, the new regulatory environment is forcing firms to assess which areas of their business to prioritise. At the same time as managing their business-wide data, firms must also ensure robust regulatory compliance. The case for taking on a lengthy (and often expensive) data management project can be difficult to make, given the lack of data quality metrics. As a result, many organisations have failed to take even the first steps in establishing an appropriate data governance programme.
Financial institutions have a number of critical questions and challenges ahead of them in terms of how they will succeed in this new era of continuous regulation. How do they get the support to change and implement a complete data culture within their organisation? How do they overcome the technological challenges of updating and replacing legacy technologies to aggregate and store data in a way that meets all of their business requirements? Despite initial efforts, there remains a long road ahead before we see the kind of data governance within firms that is truly needed. A strong and robust framework is crucial in preserving the integrity and consistency of an organisation's data, underlining how central data is to the functioning of an enterprise.
Key drivers in data management
Regulation remains one of the most prominent drivers of data management development across the financial industry. With the increasing number of regulations in place, banks are required to answer more and more onerous questions posed by national and regional regulators.
For example, BCBS 239 is one of the most significant regulations driving data management today. By complying with the 11 Principles that BCBS 239 addresses to banks, financial institutions can drive better aggregation and management of their data, resulting in improved governance. However, while such regulations have the potential to help firms make better data decisions, few are rushing to invest in a proper data governance programme, relying instead on 'tactical fixes' to achieve compliance. Unfortunately, without systematic data quality controls in place, firms may struggle to extract added value from this data beyond simply 'ticking the box' to meet their regulatory obligations.
This approach needs to change if banks are to think more strategically about regulation and data management. The benefits are not simply those of meeting regulatory obligations; despite the initial costs, better data management will provide a secure platform for firms to make improved business decisions, giving them an edge over their competitors.
Establishing effective data governance
The first step in taking on an enterprise-level change programme is finding the right project champions. Financial firms need to establish an effective data management organisation and assign accountabilities to specific individuals. A data governance overhaul is a huge undertaking, and those with strong influence, such as the Chief Data Officer, need to bring to life the 'vision' of the project's end state. The CDO must also consider the practical realities of what a new governance framework will do and what it will look like. The project plan must address issues that relate directly to the institution in question, as this will make it easier to gain support across the entire firm and drive the programme forward.
There is a great deal of complexity, duplication and redundancy inherent in the data held by financial institutions. For effective data governance to become a reality, it must be part of a coherent data strategy, asking questions such as: Where do we want to go with our overall strategy? What application topology do we want to achieve across the enterprise? Data is an enabler for this.
Rather than tackling the data challenge all at once, firms are more likely to succeed if progress is planned incrementally, starting with the areas that will have the most material impact. Many common elements recur across reports and applications and can be treated as 'building blocks': reference data, positions, transactions and so on. Firms need to identify a key material business area they want to tackle and then create a plan for progress (a minimal sketch of this approach follows the list below):
1. Incorporate or refine the business area in the organisation's 'Lingua Franca' (detailed further below)
2. Align other applications or 'consumers' of data to this main store
3. Expand on it accordingly for each data requirement
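To make these three steps more concrete, here is a minimal sketch of how they might be expressed in code: a canonical set of 'building block' terms, a mapping that aligns a consuming application to the main store, and a simple gap report that drives incremental expansion. All names in the sketch are hypothetical illustrations rather than a reference to any particular product or the approach of any specific firm.

```python
# Illustrative sketch only: all names here (CANONICAL_TERMS, ConsumerMapping,
# "risk_reporting", etc.) are hypothetical, not drawn from any vendor model
# or firm's implementation.
from dataclasses import dataclass, field

# Step 1: canonical 'building block' definitions held in the main store,
# expressed in the organisation's agreed Lingua Franca terms.
CANONICAL_TERMS = {
    "reference_data": ["book_id", "legal_entity", "counterparty"],
    "positions": ["book_id", "instrument_id", "quantity", "as_of_date"],
    "transactions": ["trade_id", "book_id", "instrument_id", "notional", "trade_date"],
}

@dataclass
class ConsumerMapping:
    """Step 2: how a downstream application ('consumer') maps its local
    field names onto the canonical terms of the main store."""
    consumer: str
    building_block: str
    local_to_canonical: dict = field(default_factory=dict)

    def unmapped_terms(self):
        """A simple gap report: canonical terms the consumer has not yet
        aligned to, which drives Step 3 (incremental expansion)."""
        mapped = set(self.local_to_canonical.values())
        return [t for t in CANONICAL_TERMS[self.building_block] if t not in mapped]

# Usage: a hypothetical risk-reporting application aligning to 'positions'.
risk_app = ConsumerMapping(
    consumer="risk_reporting",
    building_block="positions",
    local_to_canonical={"bookCode": "book_id", "instr": "instrument_id", "qty": "quantity"},
)
print(risk_app.unmapped_terms())  # -> ['as_of_date'], the next term to align
```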
Unfortunately, many organisations have yet to take the required first step of recognising that they need to establish a data governance organisation. They see the symptoms, such as data complexity and manually intensive data manipulation, but do not see the systematic solution – a true data warehousing strategy. With the right organisational structure in place and a well-planned set of steps built on efficient, reusable processes, firms can overcome these challenges and achieve their goals.
The initial data project in any firm will take at least six months, and potentially up to 12 months. Once the initial project is completed, subsequent projects should take only three or four months. The key to shortening project durations is to build the process incrementally so that it can be reused, and then expand outwards from the initial project.
The need for a ‘Lingua Franca’
Effective data management requires appropriate 'labelling', such that risk and financial data are classified and labelled using terminology and language that are commonly understood. A suitable common language or 'Lingua Franca' must be established against the following criteria (a brief illustrative sketch follows the list):
- Clear and consistent labelling of risk and financial measures – across P&L, P&L attributions, risk sensitivities and hedge metrics, plus numerous risk measures and metrics. Whether at individual trade and position levels, or intermediate aggregation points, measures and their properties (e.g. whether additive or not) must be consistent.
- Consistent and complete coordinate information and static data – reference data such as trading books, portfolios, accounts, counterparties and legal entities must be standardised and ascribed to risk and finance measures, and must be ‘time versioned’ to support historical views.
- Consistent conventions for the calculation of measures – including standardised sets of tenors, volatility coordinates for vega, correlations and other attributes, which must ‘line up’ in order to facilitate meaningful aggregation.
- Time and version tagging – of measures to ensure that the correct data sets are retrieved and aggregated. For instance, these might include official global or regional end-of-day, end-of-month or unofficial flash, etc.
- Metadata to support data quality monitoring and analysis – which can 'bubble up' through aggregations to highlight problems and/or incomplete aggregations, and enable the explanation of anomalies.
- Training – people have to be retrained to make this part of their role, or they will fall back into non-compliant behaviour. This means the infrastructure must be in place to monitor and manage such behaviour.
- Infrastructure and quality programmes lag behind – one major problem is that the tenure of senior governance staff at most firms is approximately three years, so the maturation of these programmes is disrupted by attrition and change. This corroborates what we have observed with clients and in our research survey results.
- Alignment with the firm's IT processes – we would advocate that, once the data governance process is up and running, it becomes part of the firm's standard IT processes.
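To illustrate how these criteria fit together, the sketch below models a single labelled measure carrying coordinate data, calculation conventions, time and version tags, and a quality flag that 'bubbles up' through aggregation. It is a minimal, hypothetical example; the field names are assumptions chosen for illustration rather than a prescribed schema.

```python
# Illustrative sketch only: classes and field names below are hypothetical
# examples of the criteria listed above, not a prescribed schema.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class MeasureRecord:
    # Clear, consistent labelling of the measure itself
    measure: str            # e.g. "pnl", "delta", "vega"
    additive: bool          # whether values may simply be summed when aggregating
    # Consistent coordinate / static data
    book: str
    legal_entity: str
    # Consistent calculation conventions
    tenor: str              # e.g. "1Y" on a standardised tenor grid
    # Time and version tagging
    as_of: date
    data_set: str           # e.g. "official_eod", "flash"
    value: float
    # Metadata supporting data quality monitoring
    quality_ok: bool = True

def aggregate(records):
    """Aggregate additive measures and let quality flags 'bubble up':
    the total is only marked clean if every contributing record is clean."""
    assert all(r.additive for r in records), "non-additive measures need their own rule"
    total = sum(r.value for r in records)
    clean = all(r.quality_ok for r in records)
    return total, clean

# Usage: two delta records from the same official end-of-day data set.
recs = [
    MeasureRecord("delta", True, "BOOK_A", "ENTITY_1", "1Y", date(2016, 3, 31), "official_eod", 120.0),
    MeasureRecord("delta", True, "BOOK_B", "ENTITY_1", "1Y", date(2016, 3, 31), "official_eod", -45.0, quality_ok=False),
]
print(aggregate(recs))  # -> (75.0, False): the quality problem surfaces at the aggregate level
```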
The role of the chief data officer and data stewards
The Chief Data Officer (CDO) is the key enabler of and partner for infrastructural change programmes. CDOs must be strong leaders who can create the vision, frame the initial definition of the organisation's new data policy and then lead the evolution and enforcement of that policy. CDOs, along with Data Stewards, must champion the data cause throughout their organisations and communicate the value of these programmes. As the change programme is implemented and executed, the role of the CDO will evolve from enforcer of data policy to enabler and supporter of business and functional change. This is best achieved by exerting the appropriate influence over practical delivery.
The challenges ahead
The newly implemented data governance framework must ultimately support compliance with multiple, concurrent regulations. A pragmatic vision must be defined, with measurable goals set to address this. Potential challenges such as political resistance and departmental boundaries will require Data Stewards to enforce policies and exert their own influence over their organisation's people and processes.
Enterprise-wide frameworks need to break down and remove the business silos found within financial institutions. Different business areas may have differing views and cultures, each with its own data models, processes and infrastructure; the main challenge is to accommodate these within a single data governance framework.
Conclusion
A strong and effective data governance programme can dramatically improve an organisation's decision making, but it requires the availability of high-quality data.
At the forefront of any data strategy is treating data as a prized 'organisational asset' and instilling a culture of understanding around it. Another key foundational component is implementing a lingua franca.
Taking the next step in data governance will require a better understanding of the data types the firm needs for day-to-day efficiency, reporting and regulatory compliance. The champion frameworks will be those that present a single data landscape to both internal and external end users.
In conclusion, the financial industry has been collecting vast amounts of data for many years, but has been slow to realise the value that exists in the data itself. With data-driven organisations such as Google and Amazon demonstrating the value of data, financial firms need to rise to the challenge of improving their data governance processes to provide a holistic view of the business. This will enable firms to create operational efficiencies, make better decisions and meet new regulatory commitments more easily.
Go to the first part of this series.
Find out more about the authors: Michael Engelmann, Ritesh Koli and Alan Morley.
GFT data governance downloads:
Data Governance Fact Sheet
Data Quality Infographic
Big Data White Paper