We are in a new era of data-centricity when it comes to financial trading, so what steps can FX market participants take to ensure they are investing in the right people and technologies to meet their data management requirements?
Market data is a very high-cost item for most institutions, and that cost is exacerbated by the fragmentation of FX liquidity. Market data management means managing, consolidating and coalescing order books across providers to achieve true price transparency. Accurate and reliable price analytics and measures of available liquidity for trade models and TCA are only possible with an aggregate market view.
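The coalescing idea can be illustrated with a minimal sketch: merge per-provider bid books by summing the size quoted at each price level, then sort best-first. The provider names and quotes below are hypothetical, and a production aggregator would also handle asks, tiering, and last-look semantics.

```python
from collections import defaultdict

def coalesce_books(books):
    """Merge per-provider bid books into one aggregate view.

    books: {provider: [(price, size), ...]} -- bids, in any order.
    Returns a list of (price, total_size), best (highest) price first.
    """
    depth = defaultdict(float)
    for quotes in books.values():
        for price, size in quotes:
            depth[price] += size          # sum liquidity at each price level
    return sorted(depth.items(), key=lambda level: -level[0])

# Hypothetical EUR/USD bids from two liquidity providers
books = {
    "LP1": [(1.0852, 1_000_000), (1.0851, 2_000_000)],
    "LP2": [(1.0852, 500_000)],
}
print(coalesce_books(books))
# → [(1.0852, 1500000.0), (1.0851, 2000000.0)]
```

The top level now shows the combined 1.5M available at 1.0852 across both providers, which is the "aggregate market view" a single feed cannot provide.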
As firms look to invest in new technologies and the human capital to optimize that technology, the right blend of build vs. buy is a key to success. There is a huge risk of getting distracted from the primary goal of capital management by software development: maintaining interfaces with market data and trading vendor feeds, wrestling with historical data storage, weighing cloud vs. on-premise deployments. It's an endless list. Success lies in the optimal blend of leveraging cloud infrastructure and commercial software with all the 'ilities': data management platforms that offer a core set of 'must have' functional capabilities, namely scalability, reliability/availability, maintainability/configurability/administration, security/entitlements, personalization/usability and, of course, performance. Everyone needs these 'ilities'. Then leverage your human capital to build only what you truly need.
How much of a threat is "information overload" to both sell-side and buy-side FX trading firms, and what issues need to be considered to help them develop more effective enterprise-wide data strategies?
The incredible growth of the FX market is undeniable. Yet the salient fact is that all that data is messy. The financial practitioner's worst fear is managing scale: spending more time on capture, storage and the infrastructure to process it all than on actually analyzing the data. The fragmentation of FX markets obscures true visibility into order book dynamics and liquidity. While precisely coalescing that fragmentation adds to the information glut, it also improves confidence in the results of backtesting trade models.
What technical challenges does the unique structure of the FX market create in terms of data capture and analysis, and what solutions are being developed to overcome these?
Precision of time synchronization is always a challenge when markets are disparate and separated by geographic distance. The accuracy of TCA / BestEx analysis depends on both precision and synchronization.
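Why synchronization matters can be shown with a minimal, hypothetical sketch: before merging events from two venues into one stream for TCA, each venue's timestamps must be corrected by its measured clock offset, otherwise the event ordering itself can be wrong. The venue names, timestamps and offsets below are invented for illustration.

```python
def normalize(events, clock_offsets_us):
    """Correct venue timestamps by measured clock offsets (microseconds),
    then merge into one time-ordered stream -- a prerequisite for
    TCA / BestEx comparisons across geographically separated venues."""
    corrected = [(ts - clock_offsets_us.get(venue, 0), venue, px)
                 for ts, venue, px in events]
    return sorted(corrected)

# Hypothetical microsecond timestamps from two venues
events = [
    (1_700_000_000_250, "LDN", 1.0851),
    (1_700_000_000_100, "NYC", 1.0852),
]
offsets = {"LDN": 200}   # LDN clock measured 200 µs fast vs. reference
print(normalize(events, offsets))
```

After correction the LDN tick actually precedes the NYC tick, reversing the order the raw timestamps suggested; an execution-quality comparison made against the raw stream would have used the wrong prevailing price.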
More active alpha-seeking firms use data as fuel for their algo and HFT strategies. How important is the Cloud becoming for high-performance quantitative FX trading?
As more firms move from discretionary to systematic trading, sophisticated algos produce a more competitive environment and diminishing returns. That is the incentive to fine-tune the algos through backtesting across deeper history. The storage and computational power required for that can only be achieved in a cost-effective manner on elastic Cloud infrastructure. Large-scale backtesting demands hundreds of CPU cores, but only for short bursts. The elastic (scale up/scale down) capability of the Cloud is well suited to this need.
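The burst pattern can be sketched in miniature: fan a parameter grid out across a pool of workers, collect the scores, and keep the best combination. Here a toy objective stands in for a real backtest, and a process pool stands in for cloud cores spun up for the burst and released afterwards; both substitutions are assumptions for illustration.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def backtest(params):
    """Stand-in for a real backtest: score one parameter combination.
    The quadratic toy objective peaks at fast=3, slow=20."""
    fast, slow = params
    return params, -(fast - 3) ** 2 - (slow - 20) ** 2

# Hypothetical moving-average parameter grid to sweep
grid = list(product(range(1, 6), range(10, 31, 5)))

if __name__ == "__main__":
    # On an elastic cluster each worker would be a cloud core,
    # provisioned for the sweep and released when it completes.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(backtest, grid))
    best = max(results, key=lambda r: r[1])
    print("best params:", best[0])
```

Because each parameter combination is independent, the sweep parallelizes almost perfectly, which is exactly the workload shape that rewards scale-up/scale-down capacity over owned hardware.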
In what ways are leading providers of live and historical market data setting new standards by delivering FX data with enhanced attributes for use in trading models, algorithms, and analytics?
Historical market data and elastic Cloud as a managed service are available from a number of vendors today. Market history includes tick-by-tick, end-of-day and consolidated depth-of-book data for Spot and Derivative markets. These same vendors enhance that Data-as-a-Service offering with unique hosted solutions for BestEx, TCA, Surveillance and backtesting. On-boarding new clients requires minimal effort, as the old-school (on-premise) deployment headaches are completely avoided.
TWO USE CASES ILLUSTRATING SOME OF THE CHALLENGES OF MANAGING DATA
Enterprise Data Warehousing
With an ever-expanding set of data and analytics making it ever harder to find alpha, quants demand accurate time series data and benefit from being relieved of data management responsibilities. Enterprise data warehousing is a valuable well of efficiency that companies can draw from in such times. Big Data is nothing new to the financial services industry, as the markets produce over 50 TB of data per day. The ability to collect and aggregate the data once for all use cases is a daunting challenge, as each application requires different fields and frequencies of the data.
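The collect-once, serve-many idea can be sketched simply: keep one canonical tick store and let each consumer derive its own frequency from it, rather than capturing the data again per application. The OHLC bar builder below is a minimal illustration under that assumption; a real warehouse would also carry venue, size and condition-code fields.

```python
def to_bars(ticks, bar_seconds):
    """Aggregate (epoch_seconds, price) ticks into OHLC bars of
    bar_seconds width. One canonical tick store can feed every
    application; each consumer just picks its own bar frequency."""
    bars = {}
    for ts, px in sorted(ticks):
        bucket = ts - ts % bar_seconds        # bar start time
        if bucket not in bars:
            bars[bucket] = {"open": px, "high": px, "low": px, "close": px}
        else:
            b = bars[bucket]
            b["high"] = max(b["high"], px)
            b["low"] = min(b["low"], px)
            b["close"] = px
    return bars

# Hypothetical ticks: (seconds, price)
ticks = [(0, 1.10), (30, 1.12), (70, 1.11), (90, 1.09)]
print(to_bars(ticks, 60))   # one-minute bars from the same tick store
```

Calling `to_bars(ticks, 3600)` against the same store would yield hourly bars for a different consumer, with no second capture pipeline.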
Quants apply an empirically tested, rules-based approach to exploit perceived market inefficiencies arising from human behavior, geo-political events and market structure. With tighter spreads, thinner margins and lower risk appetite, quantitative traders are exploring more cross-asset trading models and cross-asset hedging. Consequently, the quest for new and revised models is never-ending. The side effect is increasing demand for deep data over longer time periods across a multiplicity of markets: equities, futures, options and, of course, cross-border currencies. This data is the fuel feeding automation technology, quants' research and strategy modeling tools. That technology plays a critical role in the trade lifecycle, and its fast-paced evolution goes hand in hand with innovations in trading.
Data accuracy is vital to determining outcomes; asset prices cannot be inaccurate or missing. That means dealing with the vagaries of multiple data sources, mapping ticker symbols across a global universe, tying indices to their constituents, handling tick-level granularity, ingesting cancellations and corrections, inserting corporate action price and symbol changes, and detecting gaps in history. Any and all of these factors are vital to the science of quantitative trade modeling. With over five billion options contracts traded in 2014, the reliability of the resulting analytics, such as implied volatility, delta and gamma for option strategies, depends on underlying data accuracy and reliability. Big Data is about linking disparate data sets under some common thread to tease out intelligible answers that drive the creation of smarter trading models.
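One of the checks listed above, detecting gaps in history, lends itself to a minimal sketch: scan consecutive tick timestamps and flag silences longer than a threshold as candidate data holes. The threshold and the sample feed below are hypothetical; real gap detection would also account for market hours and holidays.

```python
def find_gaps(timestamps, max_silence):
    """Flag suspicious silences in a tick history: any interval between
    consecutive ticks longer than max_silence may indicate missing data
    rather than a genuinely quiet market."""
    ts = sorted(timestamps)
    return [(a, b) for a, b in zip(ts, ts[1:]) if b - a > max_silence]

# Hypothetical tick timestamps in seconds from an active feed
ticks = [0, 1, 2, 3, 60, 61, 62]
print(find_gaps(ticks, max_silence=10))
# → [(3, 60)]
```

The flagged interval would then be cross-checked against other sources before the history is trusted for backtesting; a model trained over an undetected hole silently learns from a market that never existed.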