Cloud providers promise huge advantages in scalability and agility, with digitalisation and migration to the cloud vaunted as the ultimate keys to growth and efficiency.
Hyper-scale cloud infrastructure providers such as Amazon and Google are developing financial offerings that catapult the role of cloud far beyond plug-and-play Software as a Service, or massive market data processing and storage.
The global reach and capacity of the big tech houses hold undoubted appeal for organisations that wish to enter new trading venues around the world, and there has been a flurry of activity as the tech giants recruit financial-market-savvy executives to help with their positioning.
Meanwhile in the UK, the Governor of the Bank of England, Andrew Bailey, is enthusiastically promoting ‘buying’ rather than ‘building’, highlighting banks’ lack of IT talent, time and infrastructure expertise.
Nevertheless, there’s a strange tension in the financial sector as firms still puzzle over the question of ‘Building vs Buying’. Their necessarily risk-averse nature persists – and rightly so. The growing weight of regulation surrounding the financial sector means that all actors in the space need to tread very carefully. Could there still be a strong temptation towards proprietary development in such a security-sensitive and highly regulated industry?
The pain of ‘Build’…
On-prem infrastructure comprising locally managed data centres was historically the solution of choice for organisations needing a high level of regulatory compliance and security. This led to proprietary infrastructure builds requiring huge investment in real estate, hardware, connectivity, power, DevOps, monitoring, maintenance and support – as well as expert personnel.
Gordon McArthur, CEO of Beeks Group, the leading specialist cloud infrastructure provider in the Capital Markets sector, has dedicated the last decade of his career to creating technical solutions that support and address the core needs of Capital Market clients. “We know very well the pain that financial businesses have gone through in attempting their own infrastructure builds,” he says.
“So many banks and brokers have spent millions on platforms that never got off the ground.”
The end-to-end process of implementing a proprietary environment, whether on-prem or cloud-enabled, is full of pitfalls and challenges. As McArthur outlines: “The costs and risks involved are prohibitive. The time-to-money can be anything from 18 months to 3 years, depending on knowledge, resource capacity, and location accessibility. It’s not just the CAPEX of the infrastructure, but the on-going hosting and connectivity costs. Built connectivity is expensive, time-consuming and demands a lot of technical know-how which is difficult to source in-house. A lot goes into procuring, configuring, maintaining and monitoring the hardware and network infrastructure – and it certainly isn’t core business for financial organisations.”
… the relief of ‘Buy’
Outsourcing system and infrastructure builds has been a prominent option to relieve the pressure on IT departments in the financial sector.
Financial businesses cannot afford to stand still. Buying in DevOps, even while they continue to run their own data centres, means organisations can release some of their development responsibility while keeping their networks, operating systems, applications and data tightly under their own control.
But for every bank that has dug in with its own proprietary environment, there is a growing number of financial institutions outsourcing their infrastructure to third-party cloud hosting.
In the US the Financial Industry Regulatory Authority has moved all its technology into the cloud, while banks are saving hundreds of millions of dollars a year by closing their own data centres and renting co-located space, capability and capacity.
The trend to buy in more than DevOps is on the rise. Outsourced Managed Service Provision is increasingly an option in financial circles, as it is in other sectors of enterprise. However, from his conversations with banks and larger financial institutions, McArthur is aware of the biggest sticking point to financial firms going all-in with outsourcing. He says: “While big institutions love the flexibility and agility of cloud, they still bump their head on the issue of internal control.”
Public Cloud Challenges
While the reach, versatility and seemingly bottomless resource of public cloud is an attractive option, there are several risks and limitations that Capital Markets firms need to be aware of:
“Public cloud infrastructure cannot provide ultra-low latency,” McArthur asserts.
“Capital Markets demand micro and nanosecond timings and data multi-casting as standard essentials, but these are still lacking in the public cloud.”
Regarding control and security, the twin pillars of regulatory nirvana for the financial sector, McArthur says: “One of the major challenges of the public cloud environment is that it is fundamentally a massive, shared network where financial institutions have no control over security, data access and data sovereignty. These may not be so significant in the generic cloud, but they are show-stoppers in the financial sector. Banks and trading firms are very nervous about having their environment wholly controlled and managed under their public cloud provider’s protocols rather than the organisation’s own.”
Furthermore, from the point of view of performance analytics, McArthur couldn’t be any clearer: “It takes years to develop the breadth and depth of trade analytics that Capital Markets businesses require. Analytics like these are very expensive to buy in on their own, and you can’t really build them yourself. There are very few providers in the world who offer the sophisticated performance analytics needed. And you can’t get them from the public cloud.”
Data Location Certainty
Closely related to firms’ need for security and regulatory control is certainty about where their data is being held. It is not always possible in the public cloud to ‘point to’ the physical location where data resides. In addition, should data cross borders for reasons of cloud service performance optimisation, this could expose data owners to regulatory breaches and liabilities.
Beeks Group’s Chief Sales Officer Alan Samuel knows how important data location certainty is to Capital Markets firms when selecting cloud infrastructure provision. While the flexible optimisation of generic cloud is attractive, it may not deliver the certainty needed to comply with regulations.
“Increases in regulation have led to a huge burden on financial companies to keep a lot of data – up to 7 years of it in fact,” Samuel says.
“Some of the data is very static, but a lot of it will be dynamic and the client will want to access it – and this will only increase as AI and Machine Learning become prevalent. Therefore having certainty about where the data resides, and ease of access to it, is of critical importance to them.”
There are clearly many things to consider when deciding whether to build or buy – not least the technical capability of different cloud infrastructure platforms, and how ‘Capital-Market-friendly’ they are. It appears that the question of building vs buying is at best over-simplified, and at worst anachronistic. In the words of Andrew Bailey: “Why would you build it if you can buy it?”
Maybe the more pertinent question for Capital Markets to ask is: “Since Building is costly and risky, who do we Buy from?”
Beeks Group’s contribution to Capital Markets businesses
Beeks is experienced in matching the appropriate cloud service to its Capital Markets clients in financial hubs around the globe. Offering comprehensive Infrastructure as a Service private cloud access, 24/7 support and monitoring, along with flexible payment plans to readily scale business up or down, Beeks helps customers focus with confidence on their core competencies. Beeks’ latest offering, Proximity Cloud, replaces all shared infrastructure with a dedicated, client-owned environment that can be deployed wherever the organisation wishes. “Proximity Cloud goes one step further than even the standard private cloud implementations,” comments McArthur.
“It enables data to sit within a customer’s own physically held location, so we can absolutely point to where it is, under the customer’s specific physical and logical security protocols.” Reflecting on Beeks Group’s contribution to Capital Markets businesses since 2011, McArthur says:
“We’ve spent the last ten years optimising low latency environments. We do it quickly, more cheaply than building or outsourcing development, and we give customers greater flexibility.
Everything we do is built around supporting one sector of industry – Capital Markets – rather than trying to be all things to all people as the public cloud does. Our customers can pick from 250 venues around the world and achieve connectivity today. They benefit from ultra-low latency networks, optimised compute and performance analytics, all pre-integrated, fully supported 24/7, and ready to go from day one.”

Meanwhile Samuel adds: “We act like a utility. There’s no need for our clients to spend time and money on building facilities that we provide as standard. Instead of a multi-year commitment we also offer a month-to-month, pay-as-you-go subscription model, without compromising any of the security and performance guarantees expected in the financial space. This frees our clients’ highly paid, expert staff to add value to their company at the execution level.”