Vivek Shankar

Next generation FX analytics: Bringing transparency and more to the FX execution process

November 2023 in Trading Operations

Electronification has steadily increased in FX over the years, spawning new stakeholder needs. Proving commercial viability and increasing efficiency are two examples that have made data analysis a critical part of the FX execution workflow.

With more data at firms’ disposal than ever before, transaction cost analysis (TCA) is playing an important role in helping them fine-tune their strategies. Guy Hopkins, CEO and Founder of FairXchange, believes this need underlies recent developments in execution analytics.

“Regulatory drivers such as MiFID II have certainly increased adoption of data analysis,” he says. “However, there is a much deeper commercial need to understand the impact of technology on the execution process.”

Paul Lambert, Chief Executive Officer of New Change FX, adds that the shift towards algo execution has also had an effect. “The pandemic was key in driving adoption as traders were unable to use ‘old’ approaches to execution,” he says. “The realisation that passing an algo order is effectively the same as passing an ‘at best’ order to a bank has simultaneously driven the realisation that analysis is vital to ensure best execution. The risk that they might be making poor execution choices now poses a direct threat to buyside traders’ jobs. The effect of algos is to transfer market risk back to the client. It is therefore crucial that analysis supports an auditable decision-making process to manage that risk.”

Prepping for data analysis

Given the twin needs of justifying trade decisions and demonstrating their value to firms, data quality is playing a key role. Lambert lists a few of the datasets traders examine at a high level. “Traders need to understand all aspects of the behaviour of their LPs,” he says. “The historical approach of simply adding LPs via an EMS is no longer adequate because each LP and their effect on the pool needs to be understood.” “Similarly, algo execution details like LP prioritisation and selection, reject rates, market impact, etc, need to be understood at a granular level rather than on a generalised, post-trade basis,” he continues. “The most important aspect remains the recording of a decision-making process that justifies execution methodology using real-time data. Static TCA is of little use in that process.”

“It is important to differentiate between independent analysis and independent data”

Guy Hopkins

Meanwhile, Oleg Shevelenko, FX Product Manager, Bloomberg, echoes Lambert’s point about algos breeding new data analysis needs. “Automation and algorithmic trading require participants to have a much deeper understanding of their execution performance and how it can be incorporated into execution strategies that realise the most benefit,” he says. “Unbiased data is fundamental to the objective evaluation of trading performance,” he continues. “Liquidity providers are turning to third-party independent data providers to help them demonstrate the quality of their execution algorithms to their clients, while clients are also looking for independent data to benchmark their execution decisions and showcase their value to investors.”

James Knoop, FX Back Office Solution Specialist at ION, believes independent data is critical for banks. “Banks having their own independent data is critical so that they have leverage when interacting with their LPs,” he says. “Being able to highlight areas where LPs are struggling against their peers or being deliberately toxic with their flow can lead to a closer collaboration between bank and LP. This in turn improves their execution and profitability.” He offers an example. “Toxic flow from clients, or toxic actions by LPs such as long last look times or quote spamming, can impact the bank’s trade execution outcomes. Having access to data to identify these scenarios enables banks to make informed decisions about how to price clients and interact with their LPs.”
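
To make Knoop’s metrics concrete, here is a minimal sketch of how a desk might aggregate per-LP reject rates and last-look hold times. The record layout and names (OrderOutcome, hold_time_ms) are hypothetical illustrations, not ION’s data model.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class OrderOutcome:
    lp: str               # liquidity provider identifier
    hold_time_ms: float   # time between order submission and LP accept/reject
    rejected: bool

def lp_scorecard(outcomes: list[OrderOutcome]) -> dict[str, dict[str, float]]:
    """Aggregate per-LP reject rate and average last-look hold time."""
    grouped = defaultdict(list)
    for o in outcomes:
        grouped[o.lp].append(o)
    scorecard = {}
    for lp, rows in grouped.items():
        scorecard[lp] = {
            "reject_rate": sum(r.rejected for r in rows) / len(rows),
            "avg_hold_ms": sum(r.hold_time_ms for r in rows) / len(rows),
        }
    return scorecard

# unusually long hold times or high reject rates flag LPs worth a conversation
print(lp_scorecard([OrderOutcome("LP1", 12.0, False), OrderOutcome("LP1", 240.0, True)]))
```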

Lambert stresses the importance of an unbiased standard, or ruler, to ensure the data firms receive is independent and free of bias. “If the ruler isn’t independent, then you cannot trust your results to be objective,” he says. “For example, if you are using data from inside your trading ecosystem when LPs are skewing their prices, as they often will to reflect their trading view, your ruler is changing, and you will embed bias. For unbiased measurement, you need to use an exchange rate that your activity isn’t affecting.”


FX execution workflow resembles a conveyor belt where orders are moving through various stages

Phil Morris, CEO of Reactive Markets, is extremely familiar with these biases and the importance of ensuring high-quality independent data. When asked about the effects of independent data on the kind of analysis liquidity takers can perform, he offers a scenario. “A liquidity taker may have access to vast amounts of market data from their LPs on a specific platform,” he says. “This allows them to analyse the relative pricing and execution quality between these participants. It does not answer questions about how their LP pricing may differ on other trading platforms, or how a new LP may be able to change the shape of their liquidity pool. Adding independent data can open up insights into how their LPs or platform providers are performing relative to the wider market.”

“Similarly,” he continues, “an LP will only see its trades with a specific liquidity taker and will have no context about its relative performance or what improvements it needs to make to win more business. By accessing independent anonymised datasets, an LP can proactively optimise their pricing on a client-by-client basis resulting in better outcomes for both the client and the LP. At Reactive Markets, we offer complementary liquidity management and data services tools as a core part of our offering,” he says. “On client request, we capture and deliver their dataset to several specialist data and analytics companies where they can analyse this alongside their larger, independent datasets.”

Lambert explains how New Change FX assists its clients. “The NCFX mid-feed is designed to be the unbiased ruler because it is not part of a trading platform,” he says. “The possibilities for measurement expand enormously once you can triangulate your own available spread and midrate, your micro price, your available volume, and prevailing market conditions (volatility, update frequency, etc.). To measure your outcomes, you need to take the data from inside your system and measure it against the independent benchmark from outside the system. It is from this information that the trader can create an execution methodology for each trade.”
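
As an illustration of the ruler idea, the sketch below computes signed slippage against an independent midrate in basis points. The function and inputs are hypothetical and do not represent NCFX’s methodology.

```python
def slippage_bps(side: str, exec_price: float, independent_mid: float) -> float:
    """Signed slippage vs an independent midrate, in basis points.

    Positive values mean the fill was worse than the independent mid
    (paid above mid on a buy, sold below mid on a sell).
    """
    direction = 1.0 if side == "buy" else -1.0
    return direction * (exec_price - independent_mid) / independent_mid * 1e4

# e.g. buying EUR/USD at 1.08710 against an independent mid of 1.08700
# costs roughly +0.92 bps relative to the ruler
print(slippage_bps("buy", 1.08710, 1.08700))
```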

“The historical approach of simply adding LPs via an EMS is no longer adequate because each LP and their effect on the pool needs to be understood.”

Paul Lambert

FairXchange’s Hopkins acknowledges the importance of independent data sources but stresses that analysis is just as critical. “It is important to differentiate between independent analysis and independent data,” he says. “Independent data analytics firms like FairXchange provide objective, neutral analysis of trading firms’ data. This gives firms valuable insights into their trading business that they might otherwise miss, and it also removes the perception of ‘marking your own homework’. Indeed, certain types of analysis only make sense when based on a particular firm’s data – their unique liquidity, for example,” he states.

So when does incorporating a benchmark or ruler, as Lambert described, make sense? “There are times when it is helpful to combine this with independent data sources,” Hopkins responds, “particularly when the analysis is intended for third parties, such as clients, counterparties, or regulators. This gives a useful degree of standardisation, and can remove the potential for a given firm’s trading activity to leave a signature on the reference data against which trades are measured. It is also helpful for creating a level playing field, using objective data that counterparties can agree on.”

“Being able to highlight areas where LPs are struggling against their peers or being deliberately toxic with their flow can lead to a closer collaboration between bank and LP.”

James Knoop

Combining data with analytics tools for more transparency

“Data alone is quite useless without analytics,” Bloomberg’s Shevelenko says, “as analytics are the vehicle that makes sense of your data and uncovers meaningful conclusions. Therefore, data and analytics go hand in hand, helping to constantly reshape the execution process, making it clearer and more transparent.”

Hopkins believes data standardisation is a critical first step firms must invest time in. “Standardise the data into a format that permits firms to compare across different venues, trading platforms, and counterparties,” he says. “The analytics then need to be powerful enough to allow firms to quickly identify areas of interest, in real time if possible. This now extends to using AI to detect anomalies or issues that require attention.”
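
A minimal sketch of the kind of venue-agnostic record Hopkins describes might look like the following; the schema and field names are illustrative assumptions, not a FairXchange format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class NormalisedTrade:
    """Venue-agnostic trade record so fills from different platforms compare like-for-like."""
    timestamp: datetime   # normalised to UTC
    venue: str
    counterparty: str
    symbol: str           # e.g. "EURUSD", one naming convention across venues
    side: str             # "buy" or "sell"
    quantity: float       # base-currency amount
    price: float
    protocol: str         # "ESP", "RFQ", or "Algo"

# one venue's raw fill mapped into the common shape (values illustrative)
trade = NormalisedTrade(
    timestamp=datetime(2023, 11, 1, 14, 30, tzinfo=timezone.utc),
    venue="VenueA", counterparty="LP1", symbol="EURUSD",
    side="buy", quantity=5_000_000, price=1.0871, protocol="ESP",
)
```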

He also points out that these analytics must be accessible to as broad a constituency of users as possible. “Data and data analysis are now part of everyone’s daily working life; it is no longer the province of a small number of highly qualified specialists,” he notes.

Lambert counters that firms must look earlier in the cycle and focus on sourcing enough data. “Almost no buyside users can consume and analyse their available liquidity,” he says. “They are shooting in the dark without the data to understand their feeds and act accordingly. Most analysis available today is based on ‘dead’ data, which offers little insight into live execution problems.”

“Data and analytics go hand in hand helping to constantly reshape the execution process, making it clearer and more transparent.”

Oleg Shevelenko

He explains that using such data with algos doesn’t make sense. “The use of a historic database of algo executions to guess which algo to use now is like driving from A to B using last week’s traffic conditions, when roadworks may have moved and the weather may have changed. This challenge is what led New Change FX to build its Data Processing as a Service (DPaaS) offering. NCFX, under an ordinary platform agreement, can take feeds from banks and process them to produce live analytics for its clients. Clients can create their live midrate, see how a trade or algo execution with a bank affects the prices at every other institution, and triangulate their midrate with the independent midrate from NCFX and their own micro-price.” Lambert adds that while these feeds are built on live data, clients can store their data feeds with New Change FX and run historical analyses on them.
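
The micro-price Lambert references is commonly computed as a size-weighted mid that leans toward the side with less resting size. The sketch below assumes that common definition, which may differ from NCFX’s exact formula.

```python
def micro_price(bid: float, ask: float, bid_size: float, ask_size: float) -> float:
    """Size-weighted mid: weights each quote by the opposite side's size,
    so the price leans toward the thinner side of the book."""
    return (bid * ask_size + ask * bid_size) / (bid_size + ask_size)

# with 3x more size resting on the bid, the micro-price sits above the simple mid
print(micro_price(bid=1.08690, ask=1.08710, bid_size=15_000_000, ask_size=5_000_000))
```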

TCA, LPA, and moving data from post-trade to pre-trade

Lambert’s comments highlight the effort solutions providers are undertaking to move data from a post-trade environment to a pre-trade one. As Lambert puts it, “It is no longer adequate to base trading choices purely on historical data. Nor is it acceptable to take an LP’s word for what their products can do. A complete record of execution decisions requires that all available pricing is available, analysed, and understood. There are lessons in yesterday’s outcomes, but without live feedback and the contextualisation of current and dynamically changing conditions, we are trying to optimise with only half of the picture.”

Shevelenko notes that this move is critical, and that the breadth of post-trade datasets impacts signal reliability. “The link between post-trade data and pre-trade analytics for decision-making is very important,” he says. “Insights into trading parameters such as the choice of liquidity providers for a given instrument, the optimal number of liquidity providers in an RFQ, or the choice of algorithmic trading strategy are now available. More extensive analysis of post-trade data can also reveal the impact of updating algo parameters while the order is in flight versus leaving the default ones for the duration of an algo.” He adds that, given this link, platforms offering peer analytics are extremely valuable.

FairXchange’s Hopkins notes that while the shift in data is promising, the implications for real-time decision-making driven by human traders are uncertain. “With the rapid progress of AI,” he says, “humans might step back from the decision process at the actual point of execution, and instead transition to becoming real-time risk managers, with oversight over a suite of automated execution tools that are responding seamlessly to changes in the market.”

“The big market makers have been doing this for many years,” he adds, “so it would not be a surprise to see this more widely adopted on the buy-side. This has important ramifications for how analytics develop.”

“TCA in combination with LPA gives a much more holistic view on the trade performance of counterparties.”

John McGrath

Analytics are central to TCA, but the industry has recently witnessed a lot of talk about a move to LPA, or Liquidity Provision Analytics. How valid is this chatter, and is TCA evolving faster than expected? John McGrath, Chief Revenue Officer of BidFX, thinks otherwise. “BidFX coined the phrase ‘LPA’ when we started to develop our Liquidity Provision Analytics feature to give clients the ability to start evaluating their counterparty selection based on the wealth of data we could provide, whether that be average LP spreads, skews, top of book (TOB), and market impact,” he says. “I wouldn’t say people are moving away from TCA. More that in combination with LPA, it gives a much more holistic view on the trade performance of counterparties. Clients now want to be able to affect their LP selection in flight based on a feedback loop on their LPA.”

Lambert agrees with McGrath’s views. “At present, there is still plenty of analysis to be done in the TCA space,” he says. “The key to getting the best outcome is to use all the relevant information, and we believe that does indeed mean using live analytics around LPA, but it is still important to understand how trading choices have performed historically and how the method of execution was affected by the conditions in the market.”

Meanwhile, Paul Liew, Head of Liquidity Management at TradAir, an ION company, notes that quantitative metrics are giving traders a clear view of the execution impact of their decisions. “It’s difficult to put a monetary cost on rejected orders as there is no guarantee that another LP would have filled them, especially in a volatile environment,” he says. “However, reject ratios, response times, and spread analysis can be easily measured. It has traditionally been the role of the Liquidity Manager to look at these numbers in a holistic way and make some sense of them.”

“A low fill ratio doesn’t immediately mean that an LP is problematic if orders arrive at the last minute, just before the quote is refreshed,” he adds. “TCA will still be an important tool as PnL impact is more easily calculated.”
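
Liew’s caveat can be made concrete by bucketing fill ratios by the age of the quote at order arrival, so rejections on stale quotes are not misread as a problematic LP. A minimal sketch with hypothetical inputs:

```python
def fill_ratio_by_quote_age(fills, bucket_ms=50):
    """Group order outcomes by quote age at arrival.

    `fills` is an iterable of (quote_age_ms, filled) pairs; returns the
    fill ratio per quote-age bucket so stale-quote rejects stand out.
    """
    buckets = {}
    for quote_age_ms, filled in fills:
        key = int(quote_age_ms // bucket_ms) * bucket_ms
        total, done = buckets.get(key, (0, 0))
        buckets[key] = (total + 1, done + bool(filled))
    return {k: done / total for k, (total, done) in sorted(buckets.items())}

# orders arriving just before the quote refresh (older buckets) will often
# show lower fill ratios without the LP being at fault
print(fill_ratio_by_quote_age([(10, True), (12, True), (95, False), (98, True)]))
```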

Shevelenko thinks the utility of LPA is high enough for firms to demand it as a default feature of execution platforms, something Bloomberg is acting on. “Recently, Bloomberg released a new suite of FX pricing quality tools that allow price takers to investigate how often a counterparty priced and won the trade, was runner-up with the ‘Best Alternative’ price, or placed somewhere in the pack,” he says. “Price takers can also measure how often a counterparty declined to price, failed to pick up the request, or rejected a request to deal. Using the same analytical toolkit, price makers can quickly identify when clients traded away, or where opportunities to price are being missed and why, such as issues with internal counterparty setup, enablement issues, or internal credit rejects.”
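
In the spirit of the pricing-quality metrics Shevelenko describes (not Bloomberg’s actual toolkit or API), a minimal sketch tallying RFQ outcomes per counterparty might look like this; the outcome labels are illustrative.

```python
from collections import Counter

def rfq_outcome_summary(events):
    """Tally RFQ outcomes per counterparty.

    `events` is an iterable of (counterparty, outcome) pairs, where outcome
    is one of: "won", "best_alternative", "in_pack", "declined",
    "no_pickup", "rejected".
    """
    summary = {}
    for counterparty, outcome in events:
        summary.setdefault(counterparty, Counter())[outcome] += 1
    return summary

events = [("LP1", "won"), ("LP2", "best_alternative"), ("LP3", "declined"),
          ("LP1", "won"), ("LP2", "rejected")]
for lp, counts in rfq_outcome_summary(events).items():
    print(lp, dict(counts))
```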

Removing opacity in execution cost chains

FX execution cost chains have plenty of hidden costs within them, especially when evaluating the impact of counterparty liquidity and market volatility. Stakeholders have long been interested in leveraging execution data to dig deeper into them.

“By accessing independent anonymised datasets, an LP can proactively optimise their pricing on a client-by-client basis resulting in better outcomes for both the client and the LP.”

Phil Morris

McGrath believes the pieces are in place for firms to break down this opacity. “Cost chains are complex, but with advanced data collation and a deep understanding of customer workflow, we can now start to address these underlying components across the whole trade process,” he says.

He explains that BidFX developed a “Best Value” suite to allow the buyside to factor in costs in real time. “This has now been developed to allow clients to start planning how they factor counterparty selection into the trade process via a feedback loop.”

Shevelenko echoes these views. “FX execution workflow resembles a conveyor belt where orders are moving through various stages such as creation, validation, compliance checks, eligible counterparty assignments, netting, and optimization,” he says. “Aggregated analysis over a representative timeframe could suggest various actionable enhancements to the workflow and trading process.”
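
Shevelenko’s conveyor belt can be pictured as a sequence of stage functions, each stamping the order as it passes, so that aggregated analysis can later attribute time and cost per stage. A purely illustrative sketch:

```python
from typing import Callable

# each stage takes an order dict and returns the (possibly modified) order
Stage = Callable[[dict], dict]

def run_pipeline(order: dict, stages: list[tuple[str, Stage]]) -> dict:
    """Pass an order through each stage in turn, recording its path."""
    for name, stage in stages:
        order = stage(order)
        order.setdefault("audit_trail", []).append(name)
    return order

# the stages Shevelenko lists, as no-op placeholders
pipeline = [
    ("creation", lambda o: o),
    ("validation", lambda o: o),
    ("compliance_checks", lambda o: o),
    ("counterparty_assignment", lambda o: o),
    ("netting", lambda o: o),
    ("optimisation", lambda o: o),
]
print(run_pipeline({"symbol": "EURUSD", "qty": 1_000_000}, pipeline)["audit_trail"])
```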

Hopkins offers a few examples of the questions execution analytics can now answer. “What is the economic impact of a valued counterparty terminating a relationship?” he says. “How much does it cost to establish a relationship that is capable of filling the gap? Which liquidity providers should a firm be trading with? These are questions that firms have been wrestling with since trading began, but only now are technologies emerging that can start to help firms answer them on a systematic basis.”

New Change FX’s Lambert cautions that examining the context behind the data is critical. “Without the ability to capture, analyse, and store live price data direct from the source, and without knowing what the EMS does, there is only a limited set of conclusions that a trader can reach,” he says. “Only the most obvious and egregious costs can be spotted, and the refinement that exists in the detail is lost. We do not see others putting in place the necessary foundations to make unbiased, informed, and timely trading decisions.”

A role for AI and ML

Can AI and ML play a role in bringing context and simplifying analysis here? Lambert says yes, but with a few caveats. “Their success will rest on the quality of the data that is available to them,” he says. “We know that with a model it is always a case of garbage in, garbage out, and that is true of AI and ML too.”

Hopkins agrees and adds more nuance. “It is important to recognise that AI is not a silver bullet,” he says. “The point of execution analysis is to provide insight and transparency. Deploying black-box algorithms that no one understands to the analysis process introduces an ‘explainability’ problem. This moves the opacity from the trading process to the analysis process, which doesn’t solve anything.”

The role of the Liquidity Manager can be automated to a certain extent by AI

However, he concedes that AI and ML have highly exciting use cases. “It is an area that FairXchange has invested in significantly,” he says. “Guided by appropriate experience and domain expertise, and based on robust, standardised data and high-performance infrastructure, there is a huge opportunity in this area.” He cites an example. “One area that is getting a lot of focus currently is AI-driven alerting, informing users about things happening in their trading business that they might not otherwise be aware of. With the huge amount of data available, individuals cannot check all the potential factors that might impact their business. AI will play an essential role in pointing them to the issues that need attention.”
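
A simple statistical stand-in for the alerting Hopkins describes (not FairXchange’s method) flags days where a monitored metric, such as an LP’s average spread, deviates sharply from its trailing history:

```python
import statistics

def spread_alerts(daily_spreads: list[float], window: int = 20, threshold: float = 3.0):
    """Flag days where the observed spread sits more than `threshold`
    standard deviations from the trailing-window mean."""
    alerts = []
    for i in range(window, len(daily_spreads)):
        history = daily_spreads[i - window:i]
        mu = statistics.fmean(history)
        sigma = statistics.stdev(history)
        if sigma > 0 and abs(daily_spreads[i] - mu) / sigma > threshold:
            alerts.append((i, daily_spreads[i]))
    return alerts

# a spread spike on the final day is surfaced for review
series = [0.80, 0.82] * 10 + [2.4]
print(spread_alerts(series))
```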

BidFX’s McGrath talks about a few initiatives. “We already have a team working on developing this product at BidFX and, although the industry is still in the early phase of rolling out these features, there could be some real efficiency and quantifiable benefits for the sellside and buyside in how MIS, data, and analytics are accessed and acted upon,” he says.

ION’s Liew adds, “The role of the Liquidity Manager can be automated to a certain extent by AI, such as finding optimum combinations of Market Takers, Liquidity Providers, and Currency Pairs quickly. This can be a tedious manual process.”

Collaboration and the evolution of FX analytics

While solutions providers are innovating, they’re hampered by the sheer number of touchpoints in the execution workflow. Reactive Markets’ Morris lays out the issue. “Market participants often have many market-facing execution touch points within a given workflow,” he says, “whether an execution protocol (ESP, RFQ, or Algo) or the platform being used. Consolidating that into a single, normalised view of the world can be complex, with specific benchmarks and analytical requirements from each individual counterparty.”

“It’s difficult to put a monetary cost on rejected orders as there is no guarantee that another LP would have filled them, especially in a volatile environment.”

Paul Liew

Collaboration and integration between platforms is the best way to clear this roadblock, Morris says. “By collaborating and having connectivity to the leading analytics and TCA providers in the market, clients have the flexibility and opportunity to automate much of their data and algo analytics, while we complement the workflow by providing leading execution performance and connectivity to liquidity providers. This superset of independent data from a variety of sources allows clients to make informed, data-driven decisions, something we encourage and actively facilitate by providing our clients access to it.”

Hopkins adds to this view. “Without collaboration through data, firms are solely restricted to their own view of the world,” he says. “FairXchange was conceived purely to facilitate collaborative dialogue between trading counterparties. It is even the inspiration for our name – the fair exchange of data between trading firms to arrive at the optimal mutual outcome.”

Lambert thinks our daily conditioning by the data readily available on smartphones makes it obvious that FX analytics providers will face similar demands. “The most popular ecosystem will be the one that provides its users with the best experience by offering the most choice and the highest quality of applications to manage the FX workflow from end to end,” he says. “That means bringing together all required components. Each part of the workflow demands different attributes, whether it’s technology, liquidity provision, independence, or analytical power, and we believe that no single provider can be all things to all.”

Moving forward, Shevelenko thinks the benefits of integration and collaboration are too obvious to ignore. “As resource constraints and cost pressures continue to present challenges for the industry, the platforms offering front-to-back execution and analytics services are likely to continue to gain traction with clients,” he says. He believes these conditions will create a virtuous circle where platforms and LPs come to rely on each other, pointing to how the execution analytics space will evolve. “Advances in technology are going to continue to drive innovation, including in data and analytics,” he says. “Platforms are going to further rely on liquidity providers to share aggregate data sets to power analytics. Liquidity providers are likely to leverage platforms for independent and objective evaluation of their liquidity and algorithms.”

Ultimately, every trader aims to reduce the market impact of their trading. Execution analytics are evolving quickly, and with more AI use cases emerging, traders no longer have to fly blind.