By Vivek Shankar

Putting data at the heart of FICC business strategy

January 2024 in Data Management

Fragmentation. Regulatory changes. Volatility. It’s safe to say the FX markets have posed significant challenges to market participants. Firms have long believed that harnessing data can drive greater efficiency, but putting that belief into practice has presented hurdles of its own.

“Building and running a data pipeline is complex and expensive, especially at scale and with the performance levels that a trading firm needs,” says Stephen Totten, Director of Quantitative Analysis at oneZero. “Firms may have systems in place to capture data, but to use that data in a consistent and repetitive process means that they must be able to deal with outages, backfill missing or erroneous data, normalise data from multiple sources, and flag and clean any outliers.” 
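
To make that plumbing concrete, the sketch below shows, in a few lines of Python, the kind of steps Totten lists: mapping a venue-specific quote feed onto a common schema, filling short gaps, and flagging outliers. The column names, resampling grid, and thresholds are illustrative assumptions, not any vendor’s implementation.

```python
# A minimal sketch, in pandas, of normalising a venue-specific quote feed onto
# a common schema, forward-filling short gaps, and flagging outliers.
# Column names, the 1-second grid, and thresholds are illustrative assumptions.
import pandas as pd

def normalise(quotes: pd.DataFrame, column_map: dict) -> pd.DataFrame:
    """Map a venue-specific schema onto a common one (timestamp, bid, ask)."""
    df = quotes.rename(columns=column_map)[["timestamp", "bid", "ask"]].copy()
    df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True)
    return df.set_index("timestamp").sort_index()

def clean(quotes: pd.DataFrame, max_gap: str = "5s", z_thresh: float = 5.0) -> pd.DataFrame:
    """Resample to a 1-second grid, forward-fill short gaps, flag mid-price outliers."""
    grid = quotes.resample("1s").last().ffill(limit=int(pd.Timedelta(max_gap).total_seconds()))
    mid = (grid["bid"] + grid["ask"]) / 2
    z = (mid - mid.rolling("5min").mean()) / mid.rolling("5min").std()
    grid["outlier"] = z.abs() > z_thresh  # NaN z-scores during warm-up stay unflagged
    return grid

# Usage: turn one venue's raw quotes into a cleaned, regular feed.
venue_a = pd.DataFrame({"ts": ["2024-01-02 09:00:00", "2024-01-02 09:00:03"],
                        "bid_px": [1.0931, 1.0932], "ask_px": [1.0933, 1.0934]})
feed_a = clean(normalise(venue_a, {"ts": "timestamp", "bid_px": "bid", "ask_px": "ask"}))
print(feed_a)
```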

Matthew Hodgson, Founder and CEO of Mosaic Smart Data, points to another issue. “Firms also need to extract insights from a growing number of unstructured data sources in addition to their internal client data, including newsfeeds and other macro sources.”

“Against this backdrop, many FX trading firms struggle to unlock the value in their data and transform it into actionable insights.” Is the data challenge a step too far for FX market participants? Well, not quite. While challenges exist, technology is evolving to offer novel solutions.

Data implementation challenges and timelines

Institutions draw data from several sources, and poor data organisation is creating silos that complicate analysis. “Firms trade with counterparties on multiple different channels: single-bank platforms, APIs, multi-dealer venues, and ECNs,” says Tim Cartledge, Chief Data Officer at Tradefeedr. “The result is that trading data is fragmented and overly complex. An independent trading data network provides a single, consistent view of trading data for all parties: liquidity providers, the buy-side, and trading platforms.”
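
A minimal sketch of what that consolidation can look like in practice appears below: trades reported by two hypothetical channels, each with its own field names and conventions, are folded into one consistent blotter. The schemas here are invented for illustration and are not Tradefeedr’s data model.

```python
# Folding trades reported on different channels into one consistent blotter.
# Field names and the canonical schema are illustrative assumptions.
import pandas as pd

CANONICAL = ["timestamp", "pair", "side", "notional", "price", "channel"]

def from_ecn(raw: pd.DataFrame) -> pd.DataFrame:
    """Map a hypothetical ECN drop copy onto the canonical schema."""
    out = raw.rename(columns={"exec_time": "timestamp", "symbol": "pair",
                              "qty": "notional", "px": "price"})
    out["side"] = out["buy_sell"].map({"B": "buy", "S": "sell"})
    out["channel"] = "ECN"
    return out[CANONICAL]

def from_single_bank(raw: pd.DataFrame) -> pd.DataFrame:
    """Map a hypothetical single-bank platform report onto the canonical schema."""
    out = raw.rename(columns={"trade_ts": "timestamp", "ccy_pair": "pair",
                              "amount": "notional", "rate": "price", "direction": "side"})
    out["channel"] = "single-bank"
    return out[CANONICAL]

ecn = pd.DataFrame({"exec_time": ["2024-01-02T09:00:00Z"], "symbol": ["EURUSD"],
                    "buy_sell": ["B"], "qty": [5e6], "px": [1.0932]})
sbp = pd.DataFrame({"trade_ts": ["2024-01-02T09:00:01Z"], "ccy_pair": ["EURUSD"],
                    "direction": ["sell"], "amount": [3e6], "rate": [1.0931]})
blotter = pd.concat([from_ecn(ecn), from_single_bank(sbp)], ignore_index=True)
print(blotter)  # one consistent view across both channels
```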

“Building and running a data pipeline is complex and expensive, especially at scale and with the performance levels that a trading firm needs,”

Stephen Totten

This network offers several benefits for all stakeholders, he explains. “LPs can demonstrate the quality of client pricing at review meetings and use analytics to optimise pricing engines and trading algos.”

“Buy-side traders can make data-driven trading decisions that cut costs and maximise opportunities, while also proving best execution.” Cartledge points to Tradefeedr’s Algo Forecasting Model as an example of data driving productivity.

Buy-side traders leverage analytics to choose the most appropriate algo for a trade, and LPs can refine algos by analysing their performance.

Totten also points to data’s benefits when monitoring client behaviour and using analysis conclusions to increase market share. “If a client is increasing activity in a currency, or changing their hedging policy,” he explains, “sales can engage with the customer to understand what’s changed and recommend a better solution.”

“Secondly, sales can feed that information to trading desks, helping them adapt pricing and risk management to service the new behaviour optimally. oneZero’s classification model for client flow management, for example, continuously monitors client trades and flags changes in trading profiles, helping firms proactively engage with their customers and rapidly react to changes.”
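
As a rough illustration of the kind of monitoring Totten describes, the snippet below compares a client’s recent flow profile with its historical baseline and raises a flag when the mix shifts materially. The feature and threshold are assumptions made for the example; this is not oneZero’s classification model.

```python
# Flagging a shift in a client's trading profile, using share of notional per
# currency pair as an illustrative feature and an assumed 15% threshold.
import pandas as pd

def profile(trades: pd.DataFrame) -> pd.Series:
    """Share of traded notional per currency pair for one client."""
    by_pair = trades.groupby("pair")["notional"].sum()
    return by_pair / by_pair.sum()

def flag_profile_change(history: pd.DataFrame, recent: pd.DataFrame,
                        threshold: float = 0.15) -> bool:
    """Flag if any pair's share of flow has moved by more than `threshold`."""
    base, now = profile(history), profile(recent)
    shift = now.subtract(base, fill_value=0).abs()
    return bool((shift > threshold).any())

history = pd.DataFrame({"pair": ["EURUSD"] * 8 + ["USDJPY"] * 2, "notional": [1e6] * 10})
recent = pd.DataFrame({"pair": ["EURUSD"] * 4 + ["USDJPY"] * 6, "notional": [1e6] * 10})
print(flag_profile_change(history, recent))  # True: USDJPY share jumped from 20% to 60%
```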


Banks are looking to unleash AI and machine learning to take their data analytics to the next level

Hodgson says a laser-like focus on productivity has been the key for firms that have navigated challenging market conditions this year. “Extracting insight from data is a critical step in enabling banks to do more with less,” he says. 

“To put this into context, following a recent pilot with Mosaic, a tier 1 FX bank reported a 20% increase in call volumes, 22% increase in call duration, 18% increase in enquiries – and 100% of its users wanted to move to production immediately because of the extent to which they were able to enhance their productivity.”

He’s quick to point out that such results need some foundational work. “It’s not quite as simple as flicking a switch and suddenly extracting actionable insight from your data.”

“Firms need to start with a solid foundation of properly aggregated, normalised, and enriched transaction and market data. From this solid foundation, advanced analytics and AI tools can then begin to deliver value, providing insights across the organisation.”

Cartledge explains that data normalisation used to be expensive, but data solutions providers now offer this as standard. “Also, there has been a view that firms need to hire their own data analysts to get value from the data,” he continues. 

“Again, this is not the case. Some of our larger clients build their analytics using our data API, but many more use our pre-built reports, such as TCA reports, to get value from the data quickly.”

Totten points out that conducting a normalisation exercise might cause trading firms more problems than it solves. “The FX markets have a wide range of trading venues and there can be considerable variation in protocols, as well as volume, depth, and quality of data, between different liquidity providers,” he says.

“There is also a risk that in normalising the data the firm loses some key information from feeds, so a deep understanding of different venues and their protocols is very important.” He recommends working with a market-neutral technology vendor with prior expertise.

Hodgson notes that data normalisation exercises tend to spiral out of control due to errors in the decision-making process. In turn, these errors increase costs and stretch implementation timelines unreasonably.

“Surprisingly, many of these decisions have historically been based on emotion,” he says. “Decision-makers choose what ‘feels right,’ or rely on past experiences working with unrelated vendors on a different requirement. Ideally, decisions would be based on the key criteria of cost, efficiency, productivity, and speed to market, delivered through the lens of specialised capital markets expertise.”

“Extracting insight from data is a critical step in enabling banks to do more with less,”

Matthew Hodgson

Totten and Cartledge agree with Hodgson that implementing a data solution in-house might lead to unreasonable timelines and costs. “As such, it’s very important for firms to think carefully about what they want to build in-house, where they may have significant IP, and what they want to buy to speed up their time to market,” Totten says.

Sifting through service providers

So what can trading firms and banks expect from the latest generation of data solutions providers? Hodgson lists a few benefits. “A bank can achieve a single, holistic view of FX transaction data across all its global locations, across all trading channels, and all asset classes, combined with the relevant market data,” he says.

This will help them unleash AI and machine learning to take their analytics to the next level, he adds. “Banks are now beginning to use AI to look at each client as their own segment, and hyper-personalise the insights and service they provide them,” Hodgson says.

“This is optimised with natural language generation technology, which delivers reports such as multi-asset morning briefings in a human tone of voice with easy-to-interpret analytics. The result for the bank is increased loyalty and a greater share of mind amongst clients.”

Cartledge says Tradefeedr customised its offering based on client needs. “Many new clients start with standard analytical reports which auto-populate, then progress to a slightly more advanced and bespoke toolkit where users can select different tools or widgets to build their specific reports,” he says. “Our service is also available as an API feed for clients to capture our data directly into their systems to analyse the data themselves.”

“oneZero has been working intensively in the data space for many years,” Totten says, “and we are now on our 4th-generation platform. Every iteration improves on the last and builds extensive feedback into the next.”

“Technology solution partners also need to be able to capture ever more data and devote more and more computing power to its analysis,” he adds, “which in practice now means access to significant cloud-based functionality.”

“oneZero’s Data Source is a next-generation cloud-based intelligence toolkit that does just that, by capturing and modelling quote and trade data for our clients – their Data Source DNA as we call it.”

All three point to experience as a critical factor when choosing a service provider to work with. “We partnered with the European Space Agency back in 2018 to explore how AI algorithms developed for use in space could be adapted and deployed in capital markets,” Hodgson says. 

“Another key point is that when you are using data analytics to prove Best Execution, independence matters.”

Tim Cartledge

“Fast forward to today, and this technology is in real-world deployment in live capital markets by a number of our customers who were able to benefit instantly from this cutting-edge technology that originated in a completely different field.”

Totten stresses oneZero’s vast expertise too. “Deep industry knowledge combined with our expert technology and operations teams means that we are delivering solutions that add significant value for our clients,” he says. “A growing proportion of our more than 170 staff have wide-ranging experience from brokerages, banks, exchanges, and prime brokerages.”

Cartledge says working with experienced service providers helps firms squeeze more out of their data and cites an example. “We would not have been able to launch algo forecasting without detailed knowledge of the FX market and specifically FX execution algos,” he says. “Another key point is that when you are using data analytics to prove Best Execution, independence matters.”

Where are data services headed next?

AI and ML have hogged headlines this year, and Totten, Cartledge, and Hodgson are excited about their potential. When asked about future developments, all three identify unlocking more AI use cases as the next frontier.

“We are already using ML in our algo forecasting model and its use will continue to grow,” Cartledge says. Totten concurs, adding that as AI models become more sophisticated, trading firms can expect substantial gains.

“One of our models analyses client flows and markouts, for example, across millions of trades a day,” Totten says. “This analysis enables a more scalable and efficient understanding of trading styles and preferences, with the ability to segment information to ensure that FX flows are priced and risk managed optimally.”
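
For readers unfamiliar with the term, a markout is simply the signed move in the mid-price a fixed horizon after a trade. The sketch below computes it with illustrative column names and a one-second horizon; it is a simplified example rather than oneZero’s production analysis.

```python
# A simplified markout calculation: the signed mid-price move a fixed horizon
# after each trade, scaled to pips for a 4-decimal pair. Columns are assumptions.
import pandas as pd

def markouts(trades: pd.DataFrame, mids: pd.Series, horizon: str = "1s") -> pd.Series:
    """Signed mid-price move `horizon` after each trade, in pips.

    trades needs 'time' and 'side' columns (side is +1 for a buy, -1 for a sell);
    mids is a mid-price series indexed by timestamp.
    """
    t0 = trades["time"]
    t1 = t0 + pd.Timedelta(horizon)
    mid_now = mids.reindex(t0, method="ffill").to_numpy()
    mid_later = mids.reindex(t1, method="ffill").to_numpy()
    return pd.Series(trades["side"].to_numpy() * (mid_later - mid_now) * 1e4,
                     index=trades.index)

mids = pd.Series([1.0930, 1.0932, 1.0931],
                 index=pd.to_datetime(["2024-01-02 09:00:00",
                                       "2024-01-02 09:00:01",
                                       "2024-01-02 09:00:02"]))
trades = pd.DataFrame({"time": pd.to_datetime(["2024-01-02 09:00:00"]),
                       "side": [1], "price": [1.0931]})
print(markouts(trades, mids))  # positive markout: the mid moved up after the buy
```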

But he’s quick to caution that greater AI sophistication brings firms back to a central issue: data volumes and management.

“The more sophisticated the model, the more data is needed to train these to fully capture all of the market dynamics,” Totten continues, “and this is a key area where almost everyone outside the very largest institutions could struggle.”

Hodgson says the markets will soon reach a stage where AI and ML-driven analytics will become table stakes. “Forward-thinking tech players continue to build on this innovation and shape the future of the industry for the years ahead,” he says.

“There is never a good or bad time to be innovative, but in the current environment, all innovation should be laser-focused on helping achieve the business’s KPIs,” he adds. “Technology platforms will be judged by their ability to impact a bank’s bottom line without breaking the bank to deploy in the first place, and their strength when it comes to supporting change management.”

Cartledge looks beyond AI and identifies another critical area service providers must develop. “Perhaps as important are data analytics services supporting cross-asset trading,” he says. “With this in mind, we are currently adding Futures and Equities to the Tradefeedr platform.”

“It sounds obvious, but the value of FX data analytics comes from a combination of how much of the market is captured,” he continues, “the quality of the data, and the tools provided. The technology and tools are important, but equally so is the network.”

While adopting data pipelines to boost trade efficiency is always challenging, firms have no choice but to tackle this complexity. Partnering with the right service provider is critical, of course. 

However, committing to building processes that extract the most from data is perhaps the most important element of all.