David Hastings, Director, Whispers FX Solutions

The role of next generation technology in helping FX trading firms to get more from their data

With every passing day a greater volume of data is captured globally. The value of this growing source of intelligence varies from industry to industry, but within financial markets the worth of good quality price data is far above rubies. The challenge for financial market practitioners has changed little over the last forty years. First, data must be captured, cleaned and stored. Second, it must be analysed in order to discover patterns which can be profitably exploited.

Today these processes are often automated: computers capture and analyse data using techniques referred to as machine learning. These programs are designed to discover patterns and continuously adjust their rule-based algorithms to exploit those patterns profitably. In scientific terms, however, financial market data is often described as ‘noisy’. The stability of these price patterns waxes and wanes, and the computers constantly adjust their parameters in an attempt to maximise returns, not always successfully.
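
By way of illustration, the sketch below shows, in Python, the kind of continuous parameter adjustment described above. Everything in it – the candidate lookbacks, the re-fitting window, the moving-average rule itself – is a hypothetical assumption for illustration, not a description of any production system:

```python
import numpy as np

def adaptive_ma_signal(prices, lookbacks=(5, 10, 20, 40), window=250):
    """Re-fit a moving-average lookback on each trailing window and use the
    best in-sample parameter for the next period's position. On noisy data
    the 'best' lookback drifts constantly -- the instability described above."""
    prices = np.asarray(prices, dtype=float)
    n_returns = len(prices) - 1
    signals = np.zeros(n_returns)        # signals[t] = position held over prices[t] -> prices[t+1]
    for t in range(window, n_returns):
        past = prices[t - window:t + 1]  # only data available at time t; no look-ahead
        best_lb, best_pnl = lookbacks[0], -np.inf
        for lb in lookbacks:
            ma = np.convolve(past, np.ones(lb) / lb, mode="valid")
            pos = np.sign(past[lb - 1:-1] - ma[:-1])  # long above its MA, short below
            pnl = np.sum(pos * np.diff(past[lb - 1:]) / past[lb - 1:-1])
            if pnl > best_pnl:
                best_lb, best_pnl = lb, pnl
        signals[t] = np.sign(prices[t] - past[-best_lb:].mean())
    return signals

# Hypothetical usage on a simulated noisy price series:
prices = 100 * np.exp(np.cumsum(0.001 * np.random.randn(2000)))
positions = adaptive_ma_signal(prices)
```

The point of the sketch is the failure mode as much as the mechanism: on genuinely noisy data the in-sample ‘best’ parameter changes from window to window, which is precisely why such systems do not always succeed.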
Three factors have changed the nature of the data-challenge for financial trading firms over the past few decades:

Firstly, there has been a vast increase in the volume of data. As more companies automate their trading processes, transaction volumes increase and order sizes diminish. With less liquidity available at a given price, trading frequency must rise in order to execute the same overall value.

Secondly, there has been an enormous expansion in the range of data available. Computing power, which continues to grow at the pace generally referred to as Moore’s Law, allows more data to be analysed more quickly. This accelerates the adoption of faster trading strategies, leading to a further reduction in order size and a higher frequency of trading. It also means that other sources of data which might have an impact on changes in price can be captured. The most obvious source of additional data is the depth of the order book – the constant ebb and flow of bids below and offers above the current market price. Further inputs to trading models may be derived from correlated markets. For example, in foreign exchange, when trading Spot Euro/Dollar, changes in the forward curve for Euro/Dollar or changes in Dollar/Yen may provide useful inputs into trading models. But there has also been a proliferation of other machine-readable inputs. For many years it has been possible to track the prices of bonds, stocks and commodities and the content of economic releases; now we can also machine-read media feeds, and keyword searches have given way to natural language processing techniques. All these new techniques are simply solutions to the age-old problem of deriving signal from noise.
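
To make the order-book input concrete, here is a minimal sketch of one widely used depth-derived feature, book imbalance. The price levels and sizes below are invented for illustration:

```python
def book_imbalance(bids, asks, levels=5):
    """(bid size - ask size) / (bid size + ask size) over the top `levels`
    of the book: values near +1 suggest buying pressure, values near -1
    selling pressure. A purely illustrative feature, not a trading rule."""
    bid_size = sum(size for _price, size in bids[:levels])
    ask_size = sum(size for _price, size in asks[:levels])
    return (bid_size - ask_size) / (bid_size + ask_size)

# Hypothetical Spot Euro/Dollar snapshot: (price, size in millions)
bids = [(1.0850, 5), (1.0849, 8), (1.0848, 12)]
asks = [(1.0851, 3), (1.0852, 6), (1.0853, 10)]
print(book_imbalance(bids, asks))  # ~0.14: a modest excess of resting bids
```

A feature like this, recomputed on every book update, is exactly the sort of additional model input the paragraph above describes.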

Thirdly, there is the changing approach to risk management. In the days of voice-broking, trades were often input into risk management systems long after they had been transacted. In the brave new electronic world, risk can be monitored in real time. Risk-management tools, nonetheless, remain predominantly backward-looking. When historic volatility is low, margin requirements are reduced and leverage is permitted to expand. When an unexpected event occurs, risk management responds slowly. Part of the reason for this slow response is the lack of interconnection between risk management across different markets and asset classes. A dramatic move in the price of Crude Oil may impact the US Dollar on one occasion but not on another. With greater computing power, it is easier to model changes in the correlation between related (and even unrelated) instruments. The next generation of risk management systems will be able to monitor the relationships between a much wider array of instruments in real time, and to monitor changes in the stability of the correlations between those instruments. We may not yet be able to identify the Butterfly Effect in advance, but our risk management systems are starting to become substantially more proactive.
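
As a sketch of what monitoring correlation stability might involve, the fragment below flags points where a short-window correlation between two return series has drifted away from its longer-run level. The window lengths and threshold are hypothetical placeholders; production systems would be considerably more sophisticated:

```python
import numpy as np

def rolling_corr(x, y, window):
    """Rolling Pearson correlation between two return series."""
    out = np.full(len(x), np.nan)
    for t in range(window, len(x) + 1):
        out[t - 1] = np.corrcoef(x[t - window:t], y[t - window:t])[0, 1]
    return out

def correlation_break_flags(x, y, short=20, long=120, threshold=0.5):
    """Flag observations where the short-window correlation has moved far
    from its long-window level -- a crude proxy for correlation instability."""
    return np.abs(rolling_corr(x, y, short) - rolling_corr(x, y, long)) > threshold

# Hypothetical usage: daily returns of Crude Oil and a US Dollar index
# flags = correlation_break_flags(oil_returns, usd_returns)
```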

Next Gen Algorithms

Looking ahead, the Next Gen technology already being deployed is based on machine learning. Most research relating to AI – Artificial Intelligence – trains its models on more predictable data-sets, such as facial images for recognition systems or topographic data to improve the reliability of sensors in driverless cars.

For financial markets, the harnessing of new machine learning techniques is most evident in the development of automated trading algorithms. These programs constantly adjust parameters in an attempt to optimize trading strategies to changing market environments. The challenge, as always, is to ensure that they do not back-fit too tightly. Teaching machines to identify regime change while still maintaining sufficient robustness, and simultaneously minimizing slippage on execution, is a tall order. Nonetheless, in economic terms, the cost-benefit of algorithmic execution has long since eclipsed the capabilities of human dealing desks.
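
One standard defence against over-tight back-fitting is walk-forward validation: parameters are chosen on one slice of history and scored only on the slice that follows. The harness below is a generic sketch; the window sizes and the fit/score callables are placeholders for whatever strategy is being tested:

```python
import numpy as np

def walk_forward_splits(n, train=500, test=100):
    """Yield (train, test) index pairs that roll forward through time, so
    every parameter choice is scored only on data it has never seen."""
    start = 0
    while start + train + test <= n:
        yield (np.arange(start, start + train),
               np.arange(start + train, start + train + test))
        start += test

def walk_forward_score(returns, candidate_params, fit, score):
    """Pick the best parameter on each training slice (via `fit`), then
    evaluate it out-of-sample on the following test slice (via `score`)."""
    oos = []
    for tr, te in walk_forward_splits(len(returns)):
        best = max(candidate_params, key=lambda p: fit(returns[tr], p))
        oos.append(score(returns[te], best))
    return np.mean(oos)

# Hypothetical usage with a toy momentum rule; the 'parameter' is a lookback.
rets = 0.01 * np.random.randn(2000)
pnl = lambda r, lb: np.sum(np.sign(np.convolve(r, np.ones(lb), "valid"))[:-1] * r[lb:])
print(walk_forward_score(rets, [5, 20, 60], fit=pnl, score=pnl))
```

A large gap between in-sample and walk-forward results is the classic symptom of a model fitted to noise rather than signal.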

Next Gen Liquidity Management

Another area where Next Gen technology can aid trading firms is in the analysis of liquidity quality. We recently discussed liquidity management with an independent FX manager who claimed to be receiving prices from 54 Liquidity Providers (LPs). Managing latency and assessing fill ratios and rejection rates was a full-time role performed by two of his traders. Their days are numbered: the automation of these processes will allow users to analyse liquidity quality in real time. Rule-based algorithms can then be designed to switch between the available panel of LPs, depending upon the client’s assessment of their performance.
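
A rule-based selection of that kind might look something like the sketch below. The statistics, weights and panel are all hypothetical; in practice the client’s own execution data would drive the scores:

```python
from dataclasses import dataclass

@dataclass
class LPStats:
    name: str
    fill_ratio: float      # filled orders / orders sent
    reject_ratio: float    # rejected orders / orders sent
    avg_latency_ms: float  # average time to fill or reject

def lp_score(s, w_fill=1.0, w_reject=2.0, w_latency=0.01):
    """Composite quality score: reward fills, penalise rejects and latency.
    The weights encode the client's own assessment of what matters."""
    return (w_fill * s.fill_ratio
            - w_reject * s.reject_ratio
            - w_latency * s.avg_latency_ms)

def rank_panel(panel, top_n=10):
    """Keep only the best-scoring LPs on the active panel."""
    return sorted(panel, key=lp_score, reverse=True)[:top_n]

# Hypothetical two-LP panel: LP-A fills more but is slower than LP-B.
panel = [LPStats("LP-A", 0.97, 0.02, 12.0), LPStats("LP-B", 0.88, 0.09, 4.0)]
print([lp.name for lp in rank_panel(panel, top_n=1)])  # ['LP-A']
```

Recomputed continuously, scores like these let the routing logic drop an LP whose rejection rate deteriorates without a trader having to notice first.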

Next Gen Risk Management

As buy-side institutions have moved away from single-bank liquidity in favour of Electronic Communication Networks (ECNs), the challenge of managing credit lines has become more onerous. Currently, many buy-side institutions do not have the capability to access Prime Brokerage services but, as regulation pushes them to manage their portfolios on a real-time basis, they will adopt technologies that allow them to undertake margin-based trading.

At that point, intraday liquidity will also become more pertinent. Forex markets are becoming more fragmented as Non-Bank LPs gain market share. At the same time, algorithmic trading accounts for an increasingly large percentage of daily volume. For risk managers the challenge is no longer simply to know their open portfolio exposure at the close of business, but to estimate the length of time it might take to liquidate that portfolio, especially during a period of high volatility and low liquidity. Analysing the depth of the order book will become an essential part of the real-time risk assessment process.
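
As a sketch of that estimate, the fragment below converts observed depth per interval into a rough liquidation horizon, assuming the firm is willing to take only a fixed share of visible liquidity in each interval. The participation rate and the depth figures are hypothetical:

```python
def liquidation_horizon(position, depth_per_interval, participation=0.25):
    """Rough time-to-liquidate: the number of intervals needed to unwind
    `position` while taking at most `participation` of the tradable depth
    observed in each interval."""
    remaining, intervals = abs(position), 0
    for depth in depth_per_interval:
        remaining -= participation * depth
        intervals += 1
        if remaining <= 0:
            return intervals
    return None  # cannot be unwound within the observed window

# Hypothetical 50m position against depth seen in six one-minute intervals:
print(liquidation_horizon(50e6, [40e6, 40e6, 30e6, 30e6, 30e6, 30e6]))  # 6
```

The same calculation run against a stressed order book – depth shrinking interval by interval – returns either a much longer horizon or none at all, which is exactly the information a real-time risk process needs.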

Next Gen Trading Strategies

For many years there have been multi-asset portfolio managers allocating capital across multiple asset classes. Often foreign exchange exposure is regarded as a residual, simply to be hedged. This has spawned a substantial Foreign Exchange Overlay industry, aiming to manage currency risk and generate alpha in the process. Other firms, especially certain quantitative hedge funds, have incorporated all assets, including currencies, into an integrated portfolio. The allocation of capital to systematic strategies continues to increase, and the allocation to multi-asset portfolios is also on the rise. The increasing volume and variety of data favours the continued growth of systematic strategies. Machine learning methods are ideally suited to identifying patterns in data and to managing the changing relationships between these data sources. Next Gen technologies will enable us to embrace complexity to a degree which was impossible just a few years ago.

Conclusion

Next Gen technologies such as machine learning and artificial intelligence are merely a continuation of developments which have been evident in financial markets for several decades. What has changed is the volume, variety and complexity of the data sources which are available to be mined for meaning.

Managing capital is about identifying opportunity, managing risk and designing processes which are sufficiently robust to contend with uncertainty. Technology allows us to scale our human capabilities. By reducing cost and increasing our capacity to manage capital, technology enables us to meet the needs of the next generation of investors.

I would like to thank Colin Lloyd of Velador for his contribution to the article.