So what is new with the cloud?
Eighteen months ago, we produced a report on the state of cloud adoption within global Financial Services. Given that we are at the start of a new year – and decade – I thought it would be interesting to revisit that report and see what has changed and what has remained the same. For one thing, cloud has moved up the agenda for many financial organisations, but not always for the right reasons.
In the report we said: “banks are currently only making limited use of private cloud, but are doing enough to explore and understand general cloud issues. Public cloud usage at the moment is generally very low amongst all financial institutions, although all banks claim to have strong growth plans over the next 5 years”.
In many ways, although cloud has moved up the agenda, there is still little large-scale migration occurring. A couple of household-name banks are moving to cloud, but production usage remains limited. Some of the tier 2 institutions have had more public success – we said in 2018: “tier 2 banks are more adaptable to such change than the tier 1s. This is due to a number of factors – tier 2 banks do not have the legacy and technology complexity of tier 1s; they are more nimble and aggressive, and some of their key (third-party) applications are already being ported to the cloud by software vendors”. This still seems to be the case today. But this success is not uniform across all tier 2 banks.
We said in 2018 that SaaS was the most widely used technology, followed by PaaS and IaaS. Our view was that this was largely driven by the early adopters amongst tier 2 institutions taking advantage of vendor migrations to the cloud. What has been very obvious over the last 18 months is the lack of progress on PaaS and IaaS. There have even been some prominent failures amongst institutions when it comes to building out PaaS and IaaS solutions. For organisations that started cloud migrations a year or two ago, the low-hanging fruit of ‘Lift & Shift’ has been harvested, and now the more difficult workloads need to be moved – and this is where the problems are being encountered.
Whilst many institutions initially considered migrating their existing workloads, we now see more greenfield projects starting, or the re-engineering of existing platforms – since this, of course, is where most people see the value of cloud: in new applications and new ways of working. It has also become very clear that moving or building large-scale production systems on the public cloud is an underestimated task, with many blockers and issues to be overcome. Not least, these projects have been underestimated in terms of their complexity, the number of technology options, the amount of foundational work required, and the scarcity of skilled, battle-tested staff to carry out the work.
Back in 2018 we observed that cost was not the main driver for moving to the cloud; it was the “flexibility, capacity and agility of cloud”. Since then, we have observed a more nuanced take on this – clients want to emulate the culture of the cloud companies in terms of innovation, productivity and staff retention. They believe that adopting native cloud technology will help them on their own cultural transformation journey.
However, as previously mentioned, the lack of progress on building secure, repeatable, low-touch PaaS and IaaS platforms is holding them back. We said previously: “The ability to rapidly add computing power for front office / risk requirements, and to scale it back when not required is seen as being particularly useful” – however this has proven extremely difficult for nearly everyone we speak to. Not just adding additional capacity to an existing application (in fact, that has not been too hard), but being able to rapidly provision application infrastructure when the business or clients require it has not been achieved. This is because many companies have not sufficiently automated their processes and still rely on manual provisioning and approvals for their cloud infrastructure.
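The difference between a manual approval queue and a low-touch platform can be illustrated with a toy sketch (every name and policy limit below is hypothetical, not drawn from any particular institution): when the approval policy is codified, a provisioning request can be validated and actioned automatically instead of waiting on a human sign-off.

```python
from dataclasses import dataclass

# Hypothetical policy limits – in practice these would live in the firm's
# infrastructure-as-code tooling or policy engine, not be hard-coded.
ALLOWED_REGIONS = {"eu-west-1", "eu-central-1"}
MAX_INSTANCES = 50

@dataclass
class ProvisionRequest:
    app: str
    region: str
    instances: int

def auto_approve(req: ProvisionRequest) -> bool:
    """Low-touch approval: codified policy checks replace a manual sign-off queue."""
    return req.region in ALLOWED_REGIONS and 0 < req.instances <= MAX_INSTANCES

def provision(req: ProvisionRequest) -> str:
    """Approve-and-provision in one automated step; only policy failures escalate."""
    if not auto_approve(req):
        return "rejected: fails policy – escalate to manual review"
    # A real pipeline would call the cloud provider's API here;
    # this sketch only records the intent.
    return f"provisioned {req.instances} instances for {req.app} in {req.region}"
```

The point of the sketch is that the policy check, not a person, is the gate: requests inside policy complete immediately, and only exceptions fall back to the manual process that most firms still rely on for everything.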
One other area that has seen some, but limited, progress is addressing regulatory concerns. Last time around we said: “Data protection, driven by the new GDPR regulation in Europe, and the ability to see where all data is stored is also deemed to be important”. In fact, cloud vendors have only recently begun to address this issue. The ability to geofence data is a feature that has arrived only quite recently, and work is still required to fully address the concerns raised 18 months ago.
Our prediction, back in 2018, was that: “usage of public cloud is set to expand rapidly as firms overcome their reticence and embrace the cloud benefits of scale, speed and efficiency, with the advantages outweighing any risks or concerns. Fully adopting cloud involves many different areas, from new, agile ways of working through to the actual native cloud technologies themselves”. This prediction turned out to be somewhat premature: many companies tried to accelerate their migrations, but became stuck in a quagmire of change, culture clashes and a lack of experienced staff. We did at least give one caveat to our prediction: “the challenge being to find suppliers (and staff) who have the breadth of new skills in the numbers required”. Many customers have attempted to build out the foundational cloud components required, but have become overwhelmed by the size of the task. Interestingly, this is not specific to any one industry – corporations across all sectors are struggling with the challenge, and in retrospect, perhaps it is not too hard to see why.
You can find out more about GFT’s cloud technologies expertise here.
In Part II of this blog I explore how you can overcome the challenges of cloud adoption, including solutions such as creating a ‘Datacenter as Code’ and how your firm can gain access to the most talented staff.