Investment in data and analytics has consistently increased since 2019. A Coalition Greenwich study that year highlighted that 45% of respondents planned to significantly increase their investments in data management technology. Fast forward three years, and it’s safe to say firms have not realised data’s full potential.
Paul Lambert, CEO of New Change FX (NCFX), has a few thoughts on why. “Having worked for both large fund managers and banks, I would say the biggest barrier to effective data management is legacy systems,” he says. “These companies are complex, and their development has usually been organic, resulting in a siloed technology stack. Many of the legacy systems have been designed for specific functions and without thought of them communicating with other systems, particularly systems outside the organisation.”
Lambert further notes that the idea that core systems must generate data that can be efficiently collected, cleaned, combined, and distributed is a new one. Given this state of things, how can institutions move forward, and adopt analytics-driven workflows?
Data management is a strategic consideration
“Data analytics holds huge promise for FX market participants – but before any of this potential can be realised, firms need to start with a solid foundation of properly normalised and enriched transaction and market data,” says Matthew Hodgson, CEO of Mosaic Smart Data.
“With it, firms can maximise the value of the data flowing through their organisation and transform it into a powerful source to enhance efficiencies and profitability. Without this key building block, analytics can only deliver partial results.”
FX stakeholders, especially market-facing firms, face quite a few challenges when trying to normalise their data. Guy Hopkins, founder and CEO of FairXchange, highlights the state of affairs. “Data and analytics have historically been the domain of a few,” he says. “Sales and Trading ask Strats to run custom reports or run analysis on specific queries, which is wildly inefficient and completely unscalable. Also, data sets are vast and storage is cumbersome on legacy infrastructure.”
Furthermore, each network or platform uses different protocols for recording information, leading to incomplete datasets. Firms cannot rely on a single data export and must spend time filling in the blanks in these outputs. Firms must also reformat these datasets to suit their systems. The bottom line is that firms must spend significant time executing these tasks when they might not have the expertise to do so.
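To illustrate the kind of reformatting involved, a minimal sketch follows. All field names and platform conventions here are hypothetical; the point is simply that each venue's export must be mapped into one common schema, with gaps flagged for later filling:

```python
# Hypothetical per-platform field mappings: each venue names and
# formats the same trade attributes differently.
FIELD_MAPS = {
    "venue_a": {"symbol": "ccy_pair", "qty": "amount", "ts": "trade_time"},
    "venue_b": {"symbol": "instrument", "qty": "notional", "ts": "timestamp"},
}

def normalise(record: dict, venue: str) -> dict:
    """Map one raw export row into a common schema, flagging gaps."""
    mapping = FIELD_MAPS[venue]
    out = {target: record.get(source) for target, source in mapping.items()}
    # Standardise currency-pair notation (e.g. "eur/usd" -> "EURUSD").
    if out["symbol"]:
        out["symbol"] = out["symbol"].replace("/", "").upper()
    # Record which fields are absent so the blanks can be filled later.
    out["missing"] = [k for k, v in out.items() if v is None]
    return out

# Venue B's export omits timestamps entirely in this illustration.
row = normalise({"instrument": "eur/usd", "notional": 5_000_000}, "venue_b")
```

Even this toy version shows where the time goes: every new venue means a new mapping, new notation quirks, and new blanks to reconcile.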
There’s also the bigger question to answer: What is this effort in aid of? What is the ROI of spending time arranging and classifying data like this? “The FX markets have operated on a data ‘haves and have-nots’ basis for a long time,” Lambert points out. “This applies to both buy- and sell-side firms, and many platforms. The impact for these data-poor participants is a weak understanding of where costs are coming from, worse trading outcomes, and lower P&L.”
Hodgson notes that many firms have already realised the need to shift from “big” to “smart” data. “The spot FX market has become increasingly fragmented across multiple electronic and voice trading channels, creating a unique heterogeneity challenge when it comes to data,” he says. “Firms also need to be able to extract insight from a growing number of unstructured data sources in addition to their internal client data, including newsfeeds and other macro sources.”
This move towards unstructured data has led firms to examine their ability to analyse data. Hopkins also points out that firms might not have a choice when it comes to leveraging data. “Data sits at the core of everything we do within capital markets,” he says.
“It is a new technological race in which no business can afford to be left behind. This electronic evolution has driven margins ever leaner and there isn’t any alternative other than to employ data-driven decision-making to drive the business forward.”
Leaner trade workflows
Institutions have always benefitted from boosting their efficiency. Whether extracting more performance from their sales and trade desks or automating more portions of trading operations, efficiency boosts bottom lines. Data-driven intelligence plays a key role in this performance enhancement.
Traders and salespeople equipped with the right data can focus on the right clients and offer the right recommendations at the right time. Hopkins says, “There is too much wasted effort expended with uncertain outcomes e.g. adding a new counterparty. People accept the associated costs as the “cost of doing business” – but it really doesn’t need to be that way.”
“At FairXchange we want to equip LPs and LCs with tools to seamlessly and dynamically analyse and optimise their business. Salespeople should not have to create monthly reviews with a static set of information printed out anymore: All this can be done online by sharing anonymized screens of performance.”
But what does data-driven intelligence look like in practice? Mosaic’s Hodgson paints a picture.
“Once a bank’s data has been aggregated and normalised, cutting-edge technologies such as complex statistical analysis, machine learning, and natural language processing can then be applied, for example, to automatically produce highly personalised research reports,” he says. “And the end product is far superior to the typical research team’s ‘buy’ or ‘sell’ recommendation, and is produced with far greater efficiency.”
Hodgson also points out that technology augments the role a human being plays in this workflow. “The pairing of human-computer systems for approaching tasks boosts the intuition and creativity of the human mind with the power, brute force, and precision of a machine,” he says. “Armed with a comprehensive overview of their clients’ trading activity, they can then make informed decisions and appropriate recommendations.”
NCFX’s Lambert uses another analogy to describe where the market is going, and the role data will play in it. “We believe trading will become like a modern airliner where data and systems fly the plane most of the time,” he says. “Once data is clean, accurate and analytics are timely, then you can move towards allowing the system to drive the trading. This is the groundwork that we are putting in place for our clients now.”
Clearly, the potential for leveraging data-driven insights in FX trade workflows is huge. This raises the question: Do existing data platforms deliver on this promise? Lambert believes there is still some way to go. “The focus of data platforms in foreign exchange until now has been to provide an intuitive and user-friendly experience,” he says, “instead of ensuring that the data is an accurate reflection of the costs being generated from foreign exchange activity.”
He further explains this focus was perhaps a result of data being used to fulfil regulatory requirements instead of driving pre-trade decisions. That picture is now changing, placing more emphasis on data accuracy. “The best systems must still be intuitive and user-friendly, but that is no longer the benchmark to measure against,” he notes. “Now it is about the accuracy of the calculations and the timeliness of their application.”
As Hopkins says, “FX is also not a particularly complex asset class, and therefore, whilst the volume of data is large, the capture is relatively straightforward. The key then turns to displaying your data: Clean user-friendly interfaces, configurable filters, highly customizable, interactive and visually pleasing, fast and dynamic. You don’t have to be tech-savvy to use data: the better platforms should bring the data to life for you.”
Hodgson notes that while intuitiveness is important in a platform, it must offer additional value. “We’re used to a service like Netflix doing the legwork for us and suggesting options we will want to watch from a database of thousands,” he says. “The same AI and machine learning technology can be applied in financial markets to analyse vast amounts of data to enable salespeople to make bespoke recommendations to their clients of an appropriate trade to make.”
Steps to take toward data normalisation
Given the current state of affairs regarding data operations at most institutions, normalisation might pose a daunting challenge. Where should an institution begin if it wants to leverage analytics-based insights from its data?
“Leaders must not be stubborn in the face of change and focus on establishing a culture of analytics – which means first hiring the right people and deploying the right technology,” Hodgson says. “This often requires a total cultural overhaul for a bank where fact-based decision-making supersedes intuition and opinion.”
“The good news is that the bulk of the information needed for this transformation is already within the enterprise,” he continues, “so major and costly data acquisition is not needed.”
Lambert advises a step-by-step approach that breaks down an institution’s needs and examines the current state of data before beginning the normalisation process. “In a perfect world, what data would we want to capture, store, and analyse, how many data feeds do we receive, what are we trying to achieve, where is each component of the data we have stored, and can data in different storage be brought together for analysis?”
Following this, he recommends examining how data is used to deliver analysis and who has access to it. “The goals of a data policy will vary enormously between buy- and sell-sides, but of course,” he says, “the tools used by both are rooted in strong abilities to consume, store, and analyse data. The data plan depends on the goals of the firm, but NCFX can assist in guiding the client to the correct questions – and answers.”
FairXchange’s Hopkins notes that data is not the domain of a few and that data democratisation should be a priority. He says questions like “What data do I need, and what questions (from sales/trading/strats) am I looking to answer? How do I present this data in a consumable and simple format?” are critical.
Once the initial foundation is in place, organisations can begin the normalisation process. Hodgson notes that enriching existing data sets is a vital step. “There is an important third and final step which is to enrich this normalised data set with additional fields of data that are missing from the transaction record,” he says.
“This is where external data sets can be employed to plug the gaps in a firm’s data. This includes using market data to enable market impact comparisons between the firm’s activity and the markets as a whole,” he continues, “but it can also include far more complex additions to the data set, such as introducing risk calculations onto the data record for cash or derivative trades.”
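A simple instance of this enrichment step is joining each trade to an independent benchmark mid-rate at the trade's timestamp, from which a slippage figure can be derived. The rate history, field names, and numbers below are illustrative assumptions, not any vendor's actual data model:

```python
import bisect

# Hypothetical independent mid-rate history for one pair: (epoch seconds, mid).
MID_HISTORY = [(0, 1.0700), (60, 1.0705), (120, 1.0710)]
MID_TIMES = [t for t, _ in MID_HISTORY]

def mid_at(ts: int) -> float:
    """Return the most recent independent mid at or before ts."""
    i = bisect.bisect_right(MID_TIMES, ts) - 1
    return MID_HISTORY[i][1]

def enrich(trade: dict) -> dict:
    """Attach the benchmark mid and a slippage figure in basis points."""
    mid = mid_at(trade["ts"])
    side = 1 if trade["side"] == "buy" else -1
    slippage_bps = side * (trade["price"] - mid) / mid * 10_000
    return {**trade, "benchmark_mid": mid, "slippage_bps": round(slippage_bps, 2)}

enriched = enrich({"ts": 90, "side": "buy", "price": 1.0707})
```

The same join pattern extends to the more complex additions Hodgson mentions, such as appending risk calculations to each record.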
Hopkins notes, “One of the most important elements is accurate timestamping. This is a real challenge for manually booked trades, particularly voice trades. Some clients use several platforms to service their clients and the same trading counterparty may be represented in many different ways across those platforms.”
“It is essential to bring these together into a unified taxonomy so customers can understand what is happening at the more granular level of the stream and at the higher level with the liquidity provider as a whole.”
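The taxonomy unification Hopkins describes amounts to resolving each platform's spelling of a counterparty to one canonical identifier, so activity can be rolled up per liquidity provider. A minimal sketch, with entirely made-up names:

```python
# Hypothetical alias table: the same counterparty booked under
# different names on different platforms.
ALIAS_MAP = {
    "acme capital llc": "ACME",
    "acme cap": "ACME",
    "acme-capital (ldn)": "ACME",
}

def canonical_counterparty(raw_name: str) -> str:
    """Resolve a platform-specific name to the unified taxonomy."""
    key = raw_name.strip().lower()
    return ALIAS_MAP.get(key, f"UNMAPPED:{raw_name.strip()}")

trades = [
    {"cpty": "Acme Capital LLC", "platform": "A"},
    {"cpty": "ACME CAP", "platform": "B"},
]
for t in trades:
    t["cpty_id"] = canonical_counterparty(t["cpty"])
# Both rows now roll up to the same liquidity provider, while
# unrecognised names are surfaced explicitly rather than dropped.
```

In practice the alias table itself is the hard-won asset, maintained as new platforms and booking conventions appear.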
Choosing the right service provider
Getting a grip on data is challenging for most institutions, and stakeholders typically discover they need specialist expertise to help them through the process. A few banks have tried transferring their equity market knowledge to FX when grappling with their data normalisation projects, but Lambert cautions that FX is a different world.
“FX markets have many features in common with equity markets, and many recent developments, particularly in analytics, come from equities,” he says. “This ignores some very important differences though, so a deep understanding of where an equity market calculation doesn’t help you in FX is crucial. Some key differences lie in fragmentation, credit, and swap prices.”
Hodgson points out that firms would rather have their quants analysing data than cleansing it, something a dedicated service provider can help with. “It has been estimated that quant researchers spend around 80% of their time engineering data to make it usable for quantitative analysis,” he says. “They need a platform with a fast time to market that empowers them to tip this balance, freeing up time for data science – not engineering and data cleansing.”
Like Lambert, he cautions against using generic platforms or those optimised for different markets. “While there are a number of generic analytics platforms on the market, these present a unique challenge for the nuanced world of financial markets,” he says. “They often require in-house resources and deep technical expertise – both to tailor the platform to the bank’s specific requirements and to keep it updated and relevant on an ongoing basis.”
He also points out that these platforms present a human resource risk. If the person responsible for running a platform leaves, the analytics arm can be rendered unusable. Lambert presents three priorities to evaluate in a potential service provider.
“First, look at how many data feeds it can capture. Next, check that the platform uses independent data instead of inaccurate or self-referencing data which risks biases getting embedded into your analysis. Lastly, check whether you’re looking at objective, normalised calculations that reflect costs and conditions in real time.”
“If you’re in the market for FX analytics, I think the questions you want to ask are who has built the platform and do they understand the nuances of the FX market such that the tool anticipates the questions/queries I want to run as a trading/salesperson,” says Hopkins. “Too much time can be lost explaining things you need and too much money is spent on custom builds vs. partnering with domain experts.”
Hodgson notes that evaluating a service provider’s success with other banks is a good way to estimate the value on offer. “There is, after all, comfort in knowing you are in good company,” he says. “I feel very proud to be able to say our product is now in daily use by leading investment banks, setting a new standard for the industry and delivering maximum ROI – both in terms of the trading outcome via increased hit rates and strengthened client relationships for salespeople.”
What market participants can look forward to
Technology is rapidly improving, and data platforms have kept pace with these changes.
Lambert notes that AI and ML are already present in the market but hindered by current processes – something he believes will quickly change. “The need for accurate calculations that consider current market conditions is crucial; the use of reliable benchmark data is crucial; the creation of results in real-time is crucial,” he states.
“If AI and ML are used with existing stale analysis and poor data, then execution systems will be biased and unreliable. The common practice at present of allowing parties outside the execution chain with a vested interest to see your individual trading data will become absurd in the context of AI-driven execution,” he continues. “Maximising data collection and its analysis will drive edges in execution across the market.”
Hopkins echoes these views and says, “There is a tendency to perceive AI and ML as silver bullets that offer many “answers” before people have even begun to formulate the questions they are seeking answers to. What these techniques can do very well is identify patterns in complex datasets that might not otherwise be visible to a human, and we do see several interesting areas of research to pursue in this regard.”
“It’s also worth bearing in mind,” he continues, “that the ‘explainability’ of AI/ML models is key – the ability for customers to understand what the model is doing and why. If models are too opaque, you lose the transparency that is so crucial, and you are back to relying on trust (i.e., trusting the model to be correct).”
Hodgson reckons the democratisation of analytics will continue, with lower-tier banks gaining access to these insights. “New technologies to support digitisation – such as AI, machine learning, and data analytics – are being delivered via the cloud and enabling tier 2 and 3 banks to compete with global tier 1 banks,” he says. “This democratisation will continue to reshape the financial services industry in 2023 and beyond.”
The future, in other words, is already here in part; innovative technology will continue to penetrate the market and may well become the norm over the next few years.