Why big data will be so valuable in the future
Jon Cooke, big data specialist at GFT’s Rule Financial, analyses the current state of big data in the financial services industry. As he looks for key areas where banks are developing big data technologies, he has identified two significant factors: regulatory requirements and more detailed information on customer behaviour.
In an ideal world, Big Data should be an enabler for financial services firms to gain greater insight into their business. It should give investment banks greater visibility of their trading activity, enabling them to act more quickly and deploy new strategies more effectively; whilst retail banks should be able to fully understand their customers’ behaviour and be better equipped to establish a more personal connection with them.
However, in reality, financial services firms are clearly at different levels of development when it comes to adopting Big Data technologies that will enable them to achieve these goals. In fact, across the industry fewer than 10% of banks have Big Data ‘use cases’ in operation. While some individual firms may use Big Data technologies to inform their business strategy, this can’t yet be said for the industry as a whole.
Those firms that do appreciate the commercial value and business benefits of Big Data are mainly focused on two key areas. Firstly, on meeting their regulatory requirements, such as the Volcker Rule obligations to accurately report and segregate proprietary trading from customer trading, or meeting the Basel Committee on Banking Supervision’s BCBS 239 requirements for risk data aggregation and reporting. Secondly, firms appreciate the value that better intelligence can have in improving their customer relationships. A better understanding and appreciation of customer behaviour can have a positive and quantifiable impact on sales and engender greater customer loyalty.
A significant advantage of Big Data technologies is that they can be adopted by a variety of business units, and different departments across large banks can tailor the right Big Data technologies to their business requirements. For example, a Head of Operations in a global organisation may need to measure KPIs across the whole enterprise in order to achieve cost savings. Big Data architectures are needed to cope with the sheer volume and diversity of enterprise transactional data – tens of billions of rows – across many different business lines.
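The kind of enterprise-wide KPI roll-up described above can be illustrated with a minimal sketch. The record fields, KPI names and figures below are all hypothetical; at real scale this logic would run on a distributed engine over a data lake rather than an in-memory list, but the aggregation shape is the same.

```python
from collections import defaultdict

# Hypothetical transactional records; in practice these would be
# tens of billions of rows read from a distributed store.
trades = [
    {"business_line": "equities", "region": "EMEA", "notional": 1_000_000, "settled": True},
    {"business_line": "equities", "region": "APAC", "notional": 250_000, "settled": False},
    {"business_line": "fx", "region": "EMEA", "notional": 500_000, "settled": True},
]

def kpis_by_business_line(records):
    """Roll up simple operational KPIs per business line."""
    totals = defaultdict(lambda: {"trade_count": 0, "gross_notional": 0, "settled": 0})
    for r in records:
        t = totals[r["business_line"]]
        t["trade_count"] += 1
        t["gross_notional"] += r["notional"]
        t["settled"] += int(r["settled"])
    # Derive a settlement-rate KPI from the raw counts
    for t in totals.values():
        t["settlement_rate"] = t["settled"] / t["trade_count"]
    return dict(totals)

kpis = kpis_by_business_line(trades)
```

The same grouped-aggregation pattern scales out naturally: a distributed engine partitions the records, computes partial totals per partition, and merges them, which is why KPI reporting is a natural early Big Data workload.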
We also see Big Data approaches dovetailing with traditional data management systems across the enterprise. Integration with services such as asset class trade repositories and reference data are integral to any data solution in banking.
Big Data needs to align with master data management, data lineage and data governance solutions, so that businesses have a consistent, accurate and complete view of their data estate across individual business lines as well as across the business as a whole. If the data stored in Big Data platforms differs from the ‘golden source’, or is stale, or doesn’t relate to the bank’s existing data, then it won’t be trusted and therefore won’t be used.
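The trust checks described above can be sketched as a simple reconciliation pass: compare each record in the analytics platform against the golden source, and flag anything that is missing, mismatched or stale. All field names, identifiers and thresholds here are hypothetical, and the fixed clock keeps the example deterministic; a production control would draw these from the bank’s governance tooling.

```python
from datetime import datetime, timedelta

def reconcile(platform_rows, golden_rows, max_age):
    """Flag analytics-platform records that diverge from the golden
    source (missing or mismatched) or are older than max_age (stale)."""
    golden = {r["id"]: r for r in golden_rows}
    now = datetime(2024, 1, 10)  # fixed 'now' for a deterministic example
    issues = []
    for row in platform_rows:
        src = golden.get(row["id"])
        if src is None:
            issues.append((row["id"], "missing from golden source"))
        elif row["value"] != src["value"]:
            issues.append((row["id"], "mismatch"))
        elif now - row["as_of"] > max_age:
            issues.append((row["id"], "stale"))
    return issues

# Hypothetical sample data: T2 disagrees with the golden source,
# and T3 has no golden-source counterpart at all.
platform = [
    {"id": "T1", "value": 100, "as_of": datetime(2024, 1, 9)},
    {"id": "T2", "value": 105, "as_of": datetime(2024, 1, 1)},
    {"id": "T3", "value": 200, "as_of": datetime(2024, 1, 9)},
]
golden = [
    {"id": "T1", "value": 100},
    {"id": "T2", "value": 110},
]
issues = reconcile(platform, golden, max_age=timedelta(days=2))
```

Records that pass all three checks (like T1 above) can be consumed with confidence; everything else is routed back to the data-governance process rather than used.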
Organisations are still struggling with traditional data challenges, especially in getting consistent, accurate and group-wide views of their data estate. They are challenged by the standardisation of data across front, middle and back office, as well as by implementing data policies and standards across the group, and enforcing group-wide data governance practices.
So, where are we with Big Data? While some firms are already taking on its grand and complex challenges, others still need to come to terms with its significance and grasp the opportunity that awaits them. It is certainly an opportunity worth seizing, but the industry still has a long way to go.
More information is available in the GFT Blue Paper: Big Data – Uncovering Hidden Business Value in the Financial Services Industry, and on GFT’s Big Data solutions pages.