There are both macro and market structure trends putting buy-side firms under pressure to better understand their execution performance and whether they are doing everything possible to achieve the best outcomes, says James Singleton, chairman and CEO at Cürex, the US-based provider of FX execution analytics.
“The obvious macro trend is the market volatility that has suddenly shocked many FX traders, many of whom have never dealt with these market swings in their careers. This volatility has put pressure on traders to try to understand the market’s condition before they execute their trades. Now execution timing and speed matter. Should they use an algo? If so, what variety? Or should they simply execute via risk transfer? All of a sudden, making the wrong decision costs real money,” says Singleton.
When it comes to market structure trends, the increasing use of algos, when selected as the trading strategy, introduces another level of concern for the buy side if they care about performance and best execution outcomes, says Singleton. “Every bank offers its own family of algo strategies. The first obvious consideration is which algo best matches the buy side trader’s trading objective. Does the trader understand how the algo’s intelligence actually works? If the answer to that question is ‘yes’, the next set of questions is critically important. Does the buy side trader know how the bank algo interacts with both the internal and external liquidity pools chosen by the bank? If the trade is facing internal pools, what different bank parties are in that pool – opposite buy side interest only; their spot desk; their prop desk; or all three and which receives priority? Does the buy side trader know when they should get filled and if that price is the same price that the bank gets filled internally? When the trade is filled from internal liquidity, is the bank capturing spread in addition to the algo fee it is earning? And most importantly, the FX trader should ask about the timing of the fills and whether his or her orders are even placed on external venues that he or she has chosen, especially if those venues have prices that could be used to execute the algo order more quickly and at a better rate. If the buy side cares about performance and best execution in this new, volatile and riskier environment, they need to ask a lot more questions and have access to more granular data,” says Singleton.
“Analytics providers have created tools to help the buy side FX trader make better execution choices” – James Singleton
Not only does the data need to be granular, it also needs to be independent, says Singleton. “The first reason is that each bank chooses its own benchmarks to measure the performance of their executions. As we’ve said before, that’s like asking your barber if he gave you a good haircut. To assess trading results, there should be an independent benchmark that cannot be manipulated. At Cürex, we established such a benchmark with FTSE Russell many years ago. Obviously such an independent and transparent benchmark will show when the barber did not do such a good job. But it also shows when he or she did. Unfortunately, important market players have decided that using their own benchmarks is more useful for fairly obvious reasons. The second reason that the trade data given back to the buy side is not of much use is that the sample size is small and limited. To get real insights and deliver value to the buy side, they need to see peer trading outcomes in the currencies they executed. They need to see comparative bank algo performance as well. And they need that all set out against the precise market conditions that existed before they began to execute their trading strategies. At Cürex, our Cipher platform does all of this, and our growing data pool of anonymized, buy side trading results is allowing us to provide insights to both the buy side and to the bank algo providers that allow them to make better trading decisions and understand how to maximize their performance, respectively,” says Singleton.
The objective of bringing more transparency to FX has been given considerable lip service in the last decade, says Singleton. “Not all parties in the FX market have been on the same page when it comes to that objective. Of course, there is much more data available now because the market place has advanced remarkably to a nearly fully electronic state. Analytics providers have created tools to help the buy side FX trader make better execution choices. But to be effective and more transparent, the granular data that is used for these advanced analytics has to be real time and supported by the simultaneous access to market conditions for all currency pairs, bank algo platforms and the variety of algo strategies. This type of streaming analytics provides immediate to-dos for the buy side that will improve outcomes.”
There is a new generation of real-time, actionable pre and post-trade analytical toolsets but they belong to a very small community of providers, says Singleton. “For these tools to have real value, they need to access a very large pool of market and trade data. That need is challenged by the market’s structure and confidentiality roadblocks which are the legacy of the FX marketplace. As we all know, analytic tools are only as good as the data they rely on. And for these tools to be impactful, they need to be accessible in real time. FX traders should be able to change strategies on the fly. Or move to different counterparties based on performance that is available immediately post-trade. The return of volatility has and will continue to put increased pressure on the buy side FX trader to look deeper into the market environment before they execute their trading needs. That trader needs independent, transparent and reliable tools to make better trading decisions,” he states.
The jury is still out on the role of new technology like artificial intelligence and machine learning in improving FX execution analysis still further. But we are seeing a number of collaborative industry partnerships forming in a bid to meet the same objective.
“At Cürex, we have built our analytics platform by taking our clients’ trade data, analyzing it and then showing them how much money they made or lost as a result of their trading choices. It is that simple and that compelling. Once the clients see the value, they of course want more – and more. Their interest drives cooperative relationships with their counterparties since the banks are naturally interested in our analysis as well,” says Singleton.
“Bank algo providers want to understand how they perform compared to their competitors when it comes to our buy side clients. This exchange of information creates natural collaboration that allows access to more data. We recently announced a collaboration with FlexTrade where they will provide access to our Cipher platform to their customers through their EMS. Hopefully this collaboration will help us grow our data pool by adding customers through this leveraged distribution model.”
Regulation continues to be the main driver for the use of execution analytics, according to Yangling Li, head quant at BestX, a UK-based provider of TCA and best execution data that was formed in 2016 and then acquired by State Street in 2018. “Regulation such as MiFID II has naturally driven the demand for our services,” says Li. “There is a need to prove execution performance.”
However, the best execution requirements that came from MiFID have triggered a general need for asset managers to demonstrate their value to end investors. “The outflows from active to passive investment funds have put asset managers under pressure to prove their performance and justify their fees, and a penny saved is a penny earned. So you need post-trade analytics. If you cannot measure it, you cannot improve it. It naturally drives the need for TCA and other execution analytics. Once you understand the drivers, the different counterparties and algos, then you understand the reasons behind the performance.”
“Real-time data is the link between pre and post trade. It creates an information loop around execution that we can then apply to trading strategy” – Yangling Li
One development in execution analytics has been the growth of peer analysis where buy-side firms are able to exploit not just their own trading data but that of their peers. “At BestX we have an anonymised and aggregated community data pool which buy-side clients can opt into,” says Li. “Everybody contributes their own data and that drives user experience. Post-trade is only your data. There is also market momentum that drives decisions. It forms a complete cycle.”
Traditionally, FX is an OTC market, which means that the two parties in a bilateral trade only get a partial insight, says Li. “For example, in terms of algo selection, it can be difficult to compare because banks use different metrics. What we have done with the opt-in pool is develop a tool with which firms can evaluate the best-performing algos for different benchmarks based on peers’ data.”
It is the power of common data and uniform actions, says Li. “You benefit from your own data and market knowledge. Bank A and Bank B could not communicate with each other. We use our expected cost models to measure alternatives. We take raw algo data and price-stamps, and then calculate the benchmarks ourselves. The raw input is the same and the calculations are done in-house. The question we want to answer is the final total cost to the buy-side.”
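Li’s point about taking raw fills and price stamps and recomputing the benchmark in-house can be sketched roughly as follows. This is an illustrative calculation only, not BestX’s actual model: the `Fill` shape, the arrival-price benchmark and the numbers are assumptions, and a real system would source the mid from independent, timestamped market data.

```python
from dataclasses import dataclass

@dataclass
class Fill:
    qty: float    # filled amount in base currency
    price: float  # fill rate

def total_cost_bps(fills, arrival_mid, side="buy"):
    """Recompute the benchmark in-house: total cost to the buy side,
    in basis points, versus an independent arrival-price mid."""
    sign = 1 if side == "buy" else -1
    notional = sum(f.qty for f in fills)
    vwap = sum(f.qty * f.price for f in fills) / notional
    return sign * (vwap - arrival_mid) / arrival_mid * 1e4

# Two fills from a (hypothetical) bank algo, measured against the same
# independent arrival mid regardless of which bank produced them
fills = [Fill(5e6, 1.08732), Fill(5e6, 1.08741)]
print(round(total_cost_bps(fills, arrival_mid=1.08725), 3))  # ~1.058 bps
```

Because every algo’s fills are marked against the same externally sourced mid, results from different banks become directly comparable, which is the point of doing the calculation in-house rather than accepting each bank’s own metrics.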
Again, it comes back to the maxim of ‘if you cannot measure it, you cannot manage it’. Li takes the example of custody banks before timestamps were in common use. The emergence of WAP algos based on daily timestamp data, plus additional pressure from regulators, means that most custodians now use accurate timestamps. “We found this enabled firms to accurately compare spreads, and that naturally led to a lowering of the spread and more communication and engagement between the sell-side and the buy-side,” says Li.
There are two main factors to consider in execution analysis, says Li. The first is related to market factors such as timing, volatility and conditions. For example, if you execute a trade at a time when liquidity is low, then you will likely get poor performance. An expected cost model will capture all of this information. The second side of the equation is related more to the business – the choice of counterparty and whether that is an ECN, a single bank or a panel of banks.
Ultimately, says Li, it is about looking beyond transaction cost and more to execution analysis in order to get a quantitative view of your trading.
BestX is currently in the process of revamping its pre-trade analytics, says Li. This involves the use of real-time market regime calculations around momentum and algo performance, as well as other metrics. “We want to tie the post-trade with the pre-trade,” says Li. “Real-time data is the link between pre and post trade. It creates an information loop around execution that we can then apply to trading strategy.”
Liquidity provision analytics
One notable development in the use of execution analytics has been the greater use of liquidity provision analytics (LPA). It has been described in some quarters as a shift away from the use of TCA to LPA.
“The market has gone through a few phases when it comes to execution analysis – an emphasis on fill-rates and the cost of rejection used to suffice. Now much more is required” – Guy Hopkins
According to Guy Hopkins, the CEO and founder of FairXchange, LPA and TCA are adjacent. “They both involve looking at trades and pricing but they have fundamentally different objectives. TCA is seen as more regulatory-driven where LPA has more commercial motives. TCA has its origins in MiFID compliance. It is there to give asset managers an answer to investors’ questions over best execution,” says Hopkins.
Trends are forcing FX traders to get a deeper understanding of the FX process, says Hopkins. “Prior to 2008, liquidity was primarily a volume game. Liquidity providers would aim to get as much business as possible. It was quantity over quality. That fundamentally changed after 2008 and the financial crisis.
“Banks started to assess their clients much more. People would look at their internalisation data much more to get a better understanding of the quality of the flow – what worked and what didn’t. Liquidity providers started to offer this data to clients but too often those clients found themselves with all this data from their LPs in different formats that was impossible to normalise and to effectively compare,” says Hopkins.
The lack of standardisation and the need to provide some commonality for all of this liquidity data has been one of the foremost trends in the FX market, and was enough to inspire Hopkins to establish FairXchange. “We aim to pre-arm traders ahead of any discussions with their liquidity providers,” says Hopkins.
“In many cases the firms having the conversation with the liquidity providers are not the originators of the trade, they are just intermediaries. So this service is important to create a sustainable trading relationship right across the liquidity chain. Liquidity management teams need to identify opportunities for liquidity providers to better monetise the flow they compete for, which will ultimately benefit the end clients. Active management is needed,” says Hopkins.
The ambition for Hopkins and FairXchange is to forge a common language about liquidity, performance and impact, and a common set of metrics, so that the discussions between liquidity providers and takers are not contentious. It is also a change in the balance of power, says Hopkins, given that every liquidity provider wants their clients to look at this data.
“This makes for a more sustainable trading relationship,” says Hopkins. “It is a high volume, small margin business and firms should be able to do something about it before they get the phone call switching them off. There are considerable costs and work involved in establishing a new trading relationship: KYC compliance, ISDA agreements, API connections. If you then lose that relationship or if it’s terminated, that’s money lost on both sides, so it is really important to try to make it work.”
Previously, when it came to assessing liquidity providers, market share was everything; however, the conversations have since moved on to cover many more factors, such as rejection rates, last look, spread and market impact. “It gets quite complex quite quickly, so what we do is try and make this as easy as possible,” says Hopkins.
Execution analysis also used to be very quantitative, and engagement with the data would be challenging for anyone who wasn’t a quant; however, this has fundamentally changed, says Hopkins. “The world is more data-literate now, including sales teams and traders. The data still needs to be presented in a way that lay people can understand, and with no black boxes, so that it is transparent in terms of how the data is generated.”
So what factors may affect the quality of liquidity? In Hopkins’ view, the key to execution analysis is to allow people to understand what their business should look like, then ask questions as to what makes the flow look the way it does. “It might involve pushing data they aren’t aware of – meaningful anomalies or changes in activity specific to one entity. It is pointing people to look for changes in behaviour,” says Hopkins.
“The market has gone through a few phases when it comes to execution analysis – an emphasis on market share and fill-rates used to suffice. Now much more is required, such as participation rates and what affects them,” says Hopkins.
“You have to remove the areas of opacity so that you are eliminating the guesswork and making for a more nuanced discussion. It is a dialogue now and that is the big change. Relationships are no longer adversarial. FX is not a zero-sum game. Liquidity providers and liquidity consumers can both win, so it is about fostering a constructive dialogue,” says Hopkins.
While global regulatory frameworks such as the FX Code of Conduct have required firms to demonstrate best execution and transparent trading practices, it is the recent advancements in trading technology, automation and algorithmic trading that have expanded execution methods far beyond request-for-quote, meaning participants need to understand their respective execution performance and apply it accordingly, says Oleg Shevelenko, FXGO Product Manager at Bloomberg.
“Clear trade benchmarking and analysis is helping remove a lot of subjectivity from the execution process and turn the art of trading into the science of calibrating the inputs to fit the expected trading outcome” – Oleg Shevelenko
“In addition to more rigid requirements for trading, regulations helped bring the concept of market and trading data transparency into the forefront of innovation for the traditional buy-side. Besides traditional real-time pricing offered by the major platform providers, the two other very valuable data sets include market participant owned historical trading data and peer data often included in TCA packages. Those data sets are designed to not only help evaluate and benchmark past trading performance and assess the true cost of trading but also become a valuable input into subsequent pre-trade decision making when executing similar trades,” says Shevelenko.
“The aforementioned regulatory developments and advancements in trading technology have brought a vast amount of data and created a new market challenge on how to analyse those ever-expanding data sets in a meaningful way and how to enhance the execution process by embedding the data analysis,” says Shevelenko.
“It is fair to say that due to the fragmented nature of the FX market and the lack of commonly accepted and accessible data sets and benchmarks, we are still at the beginning of the journey. However, while the market is going through the data and transparency evolution, we can already observe the emergence of a clear feedback loop consisting of data-driven pre-trade analysis, collection of trade data points during execution, evaluation of trading performance and finally feeding this information back into the pre-trade process,” says Shevelenko.
“Applications of such a continuous data collection and evaluation cycle include more meaningful selection of an appropriate execution method, from a simple risk transfer trade to a complex algorithmic order, as well as enhancements to liquidity overall through transparent and objective discussions with liquidity providers based on the evidence offered by data and analytics,” he says.
There are a variety of inputs in the execution process which can significantly affect the outcome, says Shevelenko. “Those inputs range from market conditions such as liquidity and volatility to order parameters such as size and target duration all the way to the selected execution strategy and its parameters. Clear trade benchmarking and analysis is helping remove a lot of subjectivity from the execution process and turn the art of trading into the science of calibrating the inputs to fit the expected trading outcome.”
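The relationship Shevelenko describes between market conditions, order parameters and the expected outcome is often captured, in stylised form, as half the quoted spread plus a square-root market-impact term that grows with participation. The sketch below is a generic illustration of that idea, not Bloomberg’s methodology; `k` is a hypothetical calibration constant and the inputs are made up.

```python
import math

def expected_cost_bps(size, adv, spread_bps, daily_vol_bps, k=0.7):
    """Stylised pre-trade cost estimate: half the quoted spread plus a
    square-root market-impact term scaling with participation in daily
    volume. k is an illustrative calibration constant, not a published
    figure; real models are fitted to historical trade data."""
    return spread_bps / 2 + k * daily_vol_bps * math.sqrt(size / adv)

# A 50m order in a pair trading 5bn/day, 1bp spread, 60bp daily volatility
print(round(expected_cost_bps(50e6, 5e9, 1.0, 60.0), 2))  # 4.7 bps
```

Calibrating the inputs (size, duration, volatility, strategy) against such a model is what turns the expected outcome into something that can be checked against the realised cost post-trade.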
“FX workflows are very involved and complex, covering not only the execution cycle but also order creation, netting and optimization. As such, the execution process is often spread over a period of time, combining both the risk of potentially missed trading opportunities and the benefits of internal netting. One of the primary goals of platform providers offering such workflow tools is to collect all the relevant timestamps and market data, and overlay them with trader activity to present clear and digestible impact and scenario analysis of all decision points. An aggregated view over a defined timeframe could suggest various actionable enhancements to the workflow and trading process,” says Shevelenko.
Is there a role for new technology like artificial intelligence (AI) and machine learning (ML) in the execution analysis process? “Statistical analysis has been and remains one of the common methods of rationalizing historical data in FX,” says Shevelenko. “As AI and ML are the modern extensions of statistical analysis, their use in post-trade evaluation will continue to expand, especially as the underlying data sets continue to grow. As trading and post-trade methodology needs to be explainable and verifiable, it is likely that traditional statistical analysis and modelling will continue to be the primary tool, with AI and ML offering the required data analysis support,” he concludes.
One market trend forcing FX trading firms to get a much deeper understanding of their execution performance is the focus on pre-trade analytics, which, according to Chris Matsko, Head of FX Trading Services at FactSet, have taken centre stage when it comes to data interpretation. “The next generation of FX analytics is effectively moving data from the post-trade world into pre-trade and even in-flight, real-time trade monitoring. Post-trade analytics, or LPA, are now feeding curated liquidity pools for future executions across both RFQ and streaming execution disciplines. We’ve also seen a great deal of interest in monitoring algo slippage against a number of benchmarks in real time, i.e. in-flight TCA. The algos can be LP-based or proprietary, but the key to this scenario is being able to potentially pause or cancel the algo should market dynamics begin to degrade its performance against the client’s benchmarks,” says Matsko.
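The in-flight TCA scenario Matsko describes (watching an algo’s running slippage against a benchmark and pausing it when performance degrades) could look roughly like the check below. It is a simplified sketch, not FactSet’s implementation; the fill format, the arrival-price benchmark and the tolerance are all assumptions.

```python
def in_flight_check(fills, arrival_mid, limit_bps, side="buy"):
    """Running slippage of an algo's fills so far against the
    arrival-price benchmark; signal a pause when the client's
    tolerance is breached."""
    sign = 1 if side == "buy" else -1
    qty = sum(q for q, _ in fills)
    if qty == 0:
        return "continue"  # nothing filled yet, nothing to measure
    vwap = sum(q * p for q, p in fills) / qty
    slippage_bps = sign * (vwap - arrival_mid) / arrival_mid * 1e4
    return "pause" if slippage_bps > limit_bps else "continue"

# (qty, price) fills so far; arrival mid was 1.0905, tolerance 5 bps
fills = [(2e6, 1.0910), (3e6, 1.0921)]
print(in_flight_check(fills, arrival_mid=1.0905, limit_bps=5.0))  # pause
```

In a live setting the same check would run on every fill event streamed from the EMS, so the trader can intervene while the order is still working rather than discovering the slippage in a post-trade report.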
He also thinks there is a role for AI and ML in the next generation of FX execution analytics. “As the TCA feedback loop closes, it’s easy to visualize how AI/ML will play a role in how data is analyzed and actioned. The possibilities are endless: think about how much data flows to the buy side daily (we see approximately 36GB compressed); the opportunity for AI/ML to digest and analyze this data in real time could have a huge impact on execution quality,” says Matsko.
“The next generation of FX analytics is effectively moving data from the post-trade world into pre-trade and even in-flight, real-time trade monitoring” – Chris Matsko
“It is about having the machine understand, via a real-time API, which LPs are skewed in the market and then create a curated liquidity pool based on those skews plus any historical outcome data around similar orders. That’s just one simple example and doesn’t require much ‘deep learning’, but it’s baby steps. Looking ahead, it seems almost an imperative to have some interoperability with firms specializing in AI/ML applications for fintech,” says Matsko.
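Matsko’s curated-pool example, combining each LP’s current skew with historical outcome data on similar orders, might be sketched as a toy ranking like the one below. All names, fields, weights and figures here are hypothetical, invented purely to illustrate the mechanism.

```python
def curate_pool(order_side, lps, top_n=2):
    """Rank LPs for a curated pool: prefer those whose current skew
    favours the order's direction, weighted by historical outcomes.
    Each lp is (name, skew, hist_score): skew > 0 means the LP is
    keen to buy, skew < 0 keen to sell; hist_score in [0, 1] comes
    from past results on similar orders."""
    # A buyer benefits from LPs skewed to sell, and vice versa
    direction = -1.0 if order_side == "buy" else 1.0
    ranked = sorted(lps, key=lambda lp: direction * lp[1] + lp[2], reverse=True)
    return [name for name, _, _ in ranked[:top_n]]

# Hypothetical snapshot of three LPs' skews and historical scores
lps = [("LP_A", -0.4, 0.8), ("LP_B", 0.3, 0.9), ("LP_C", -0.6, 0.5)]
print(curate_pool("buy", lps))  # ['LP_A', 'LP_C']
```

The historical score is where the closed TCA feedback loop comes in: outcomes from earlier, similar orders feed the next pre-trade curation decision, refreshed via the kind of real-time API Matsko mentions.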