The term “quantamental” is relatively new and typically denotes an approach to analyzing markets and designing trading strategies that takes various fundamental factors into consideration alongside purely quantitative methods of price time-series analysis. Some researchers use the term to denote the direct incorporation of quantitative analysis of fundamental data into a strategy's decision-making logic; others use it to emphasize that the strategy logic is based on their understanding of market mechanics and principles rather than on statistics and abstract mathematical models alone. It is important to note that the quantamental approach today is used predominantly by the buy side at the strategy design level, as opposed to the execution level, where, according to the latest survey by J.P. Morgan, over 80% of institutional traders still prefer click-to-trade to execution algos. Nevertheless, as we will see later, quantamental analysis may eventually play a key role in the efficiency of execution algos, becoming an important part of the overall trading process.
Use of data
First, let’s discuss the use, in quantitative models, of data traditionally considered fundamental by most traders, such as economic indicators. Typically such data is used to make long-term trading or investment decisions; however, some fundamental factors potentially allow for very short-term use because of the sharp increase in volatility that typically follows an important economic news release. Nevertheless, fundamental data of this kind is considered the hardest domain to process with quantitative models. Indeed, if we look only at various interest rates as a factor for analyzing potential impact on the currency markets, which seems quite obvious, we very quickly run into the problem of performing a huge amount of calculations in order to find potential correlations between the fundamental factor in question and changes in prices.
For example, the Bank of England alone publishes more than 20 base interest rates. Let’s assume that we’d like to find a model which exploits relationships between changes in these interest rates and changes in spot, forward, or any other tradable contract prices. Furthermore, let’s assume that for such a model we don’t even use the absolute values of these factors, but only the fact of a change in any of them: therefore, we record -1 for a change down, 0 for an unchanged value, and +1 for a change up. A brute-force search over all possible combinations of these factors yields a figure of 3^20, or roughly 3.5 bln, combinations, for each of which we should perform a test and decide whether such a combination of interest rate changes makes any sense for our model. So, even if we have a very powerful computer capable of running one test per millisecond, the analysis of only one set of factors would take about 3.5 mln seconds, or roughly 40 days of continuous calculation.
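The arithmetic behind this estimate is easy to verify. The sketch below simply reproduces the numbers from the example above (20 factors, three states each, one test per millisecond):

```python
# 20 interest-rate factors, each encoded as -1 (down), 0 (unchanged)
# or +1 (up), give 3**20 combinations to test by brute force.
n_factors = 20
states_per_factor = 3

combinations = states_per_factor ** n_factors
print(combinations)  # 3486784401 -- roughly 3.5 billion

# At one test per millisecond, the full scan takes about 40 days.
seconds = combinations / 1000
days = seconds / (24 * 60 * 60)
print(round(days, 1))  # 40.4
```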
Needless to say, researchers prefer to use workarounds such as genetic algorithms and machine learning to find potential dependencies between economic data and price movements, but these methods have their own shortcomings: the former may easily miss the most important relationships and focus only on minor ones, while the latter may find relationships where none exist in reality, so the output of either method will still require some post-processing.
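To illustrate why a genetic search is attractive here, the minimal sketch below (not the method used in our research) evolves a population over the same ternary factor space while evaluating only a tiny fraction of the 3^20 combinations. The fitness function is a toy placeholder standing in for a backtest score:

```python
import random

random.seed(0)
N = 20  # number of interest-rate factors

def fitness(chromosome):
    # Toy stand-in for a backtest score: reward matching a hidden
    # "true" pattern of rate changes (maximum score is N = 20).
    target = [1, -1] * (N // 2)
    return sum(1 for a, b in zip(chromosome, target) if a == b)

def mutate(c, rate=0.1):
    # Randomly flip each gene to one of the three states with prob. `rate`.
    return [random.choice([-1, 0, 1]) if random.random() < rate else g for g in c]

def crossover(a, b):
    # Single-point crossover of two parent chromosomes.
    cut = random.randrange(1, N)
    return a[:cut] + b[cut:]

# Evolve 50 random chromosomes for 100 generations with elitism.
pop = [[random.choice([-1, 0, 1]) for _ in range(N)] for _ in range(50)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(40)]

best = max(pop, key=fitness)
print(fitness(best))  # close to the maximum after ~100 generations
```

Only about 4,000 of the 3.5 billion combinations are ever evaluated, which is precisely why such a search can also skip over the most important relationships entirely.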
While directly using quantized economic data in trading strategies may be problematic, our research shows that fundamental factors of a different kind can successfully serve as a foundation for quite a variety of trading strategies.
These factors mostly relate to market structure and to the behavioral patterns of key market participants, on both the buy and sell side. In most cases these factors cannot be represented in numeric form directly, but they may serve as a starting point for identifying a market inefficiency that could potentially be exploited systematically. Most often, the best result is achieved by using a fundamental understanding of market regimes to throttle a purely quantitative model. To better understand these opportunities, let’s consider a couple of simple examples. Fig. 1 shows a theoretical performance chart of a simple mean-reversion strategy trading the euro versus the US dollar on the hourly timeframe. Although the original theoretical performance may look attractive (blue line in the chart), as soon as we place the model into a slightly more realistic environment, subtracting possible commissions and emulating execution issues, the performance degrades substantially (red line in the chart).
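To make the cost-degradation effect concrete, here is a toy backtest on synthetic data. The rule, the data-generating process, and the cost figure are all illustrative assumptions, not the strategy behind Fig. 1:

```python
import random

# Toy mean-reversion backtest on synthetic "hourly" data.  All numbers
# here are illustrative assumptions, not the parameters behind Fig. 1.
random.seed(42)
n = 5000
noise = [random.gauss(0, 1e-3) for _ in range(n + 1)]
# Negatively autocorrelated returns: a mildly mean-reverting market.
returns = [noise[i] - 0.15 * noise[i - 1] for i in range(1, n + 1)]

cost_per_trade = 1e-4  # assumed all-in round-trip cost, in return units

gross = net = 0.0
for i in range(1, n):
    position = -1 if returns[i - 1] > 0 else 1  # fade the previous bar
    pnl = position * returns[i]
    gross += pnl
    net += pnl - cost_per_trade

print(f"gross PnL: {gross:.4f}, net PnL: {net:.4f}")
```

Even though the synthetic data has a genuine mean-reverting tendency, a modest per-trade cost consumes a large share of the gross result, which is the same kind of degradation the red line in Fig. 1 illustrates.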
Let’s now add some fundamental outlook to our model. Our qualitative analysis of various sources, such as research papers, surveys, and white papers, leads us to understand that the activity of the market participants responsible for the generally mean-reverting price behavior of the FX market is not uniformly distributed across the nearly 24 hours this market is open; moreover, there are a number of time slots in which mean-reverting activity is the most prominent compared to other times of day. We then perform quantitative research on these time slots using traditional optimization methods and select only those in which the mean-reversion tendency appears strongest. If we then add a number of rules to the strategy’s entry logic that restrict certain entries according to this time-slot analysis, we may significantly improve the performance of the strategy, even if it is put in the same unfriendly testing environment as in the previous case (see Fig. 2). This way we not only improve overall performance but also reduce exposure risk, as positions are now held only for relatively brief periods of time.
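A time-slot filter of this kind reduces to a simple gate in the entry logic. In the sketch below, the set of allowed hours is hypothetical; in practice it would come from the in-sample time-slot study described above:

```python
# Hypothetical "mean-reverting" time slots (UTC hours) selected by an
# in-sample study; the actual hours would come from the optimization.
ALLOWED_HOURS = {7, 8, 14, 15, 16}

def entry_allowed(hour_utc, signal):
    """Gate a raw mean-reversion entry signal by the time-slot analysis."""
    if hour_utc not in ALLOWED_HOURS:
        return 0       # stand aside outside the selected slots
    return signal      # otherwise pass the raw signal through

# Usage:
print(entry_allowed(8, -1))   # -1: short entry allowed in an active slot
print(entry_allowed(3, -1))   # 0: entry suppressed at 03:00 UTC
```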
Another type of fundamental information often disregarded by many market participants is the impact of changes in market structure and internal market processes on the performance of various buy-side trading models. These changes are difficult to quantify due to the lack of reliable metrics or benchmarks, and thus they are typically not used in strategy logic. Nevertheless, a proper understanding of these processes may help at least avoid periods when the market regime is not favourable for a particular trading strategy, or at best improve the strategy’s performance.
For example, the most recent tendency in buy-side activity in most major FX currencies is the extensive use of execution algos, which in theory should achieve better execution than traditional voice trading or click-to-trade while reducing the footprint in the market, thereby protecting the buy side both from making considerable market impact and from predatory activity of various kinds. Unfortunately, the use of execution algos is a double-edged sword: understanding how these algos work, with their randomization of entry time and size, gives us an idea of the potential liquidity issues that lead to erratic price behavior and, consequently, to problems with intraday trading strategies in general. To better understand why and how the increased use of execution algos may disrupt existing short-term strategies, let’s consider the probability of the price moving in the same direction (for momentum) or in the opposite direction (for mean reversion), year over year since 2007 (see Fig. 3).
When this probability differs significantly from 50%, the market regime can be considered suitable for alpha-generating strategies, since in both cases it indicates the presence of a certain inefficiency in the market. However, when the probability of further price movement in either direction reverts to 50%, the market regime becomes the most inconvenient one for most trading models, as it means the market is nearly efficient. This can be seen very clearly in the performance chart of our model strategy: when the market becomes efficient, the equity curve goes sideways (these historical periods are marked in red in the chart). We can see that the increased use of execution algos has already brought this probability to the 50% efficiency level, similar to what we observed in 2009 and 2013, and we can expect a significant decrease in the reported performance of algorithmic FX funds, as we saw in 2013.
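The metric itself is straightforward to compute from a price series. The sketch below is our reconstruction of the idea behind Fig. 3, not necessarily the exact definition used there: the share of consecutive price moves that continue in the same direction. Values far above 50% suggest a momentum inefficiency, far below 50% a mean-reversion inefficiency, and values near 50% a nearly efficient market:

```python
def continuation_probability(prices):
    """Probability that a price move continues the direction of the
    previous move.  Zero moves are skipped; 0.5 is returned when no
    usable pairs of moves exist."""
    moves = [b - a for a, b in zip(prices, prices[1:])]
    pairs = [(m1, m2) for m1, m2 in zip(moves, moves[1:]) if m1 != 0 and m2 != 0]
    if not pairs:
        return 0.5
    same = sum(1 for m1, m2 in pairs if (m1 > 0) == (m2 > 0))
    return same / len(pairs)

# Trending series: every move continues the previous one.
print(continuation_probability([1, 2, 3, 4, 5]))   # 1.0
# Oscillating series: every move reverses the previous one.
print(continuation_probability([1, 2, 1, 2, 1]))   # 0.0
```

Computed year by year, a metric like this lets one flag red "sideways" regimes such as those in Fig. 3 systematically rather than by eye.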
We can now see how fundamental factors such as changes in market structure affect the performance of buy-side strategies, and we understand the reasons behind it. This information can be used not only for forecasting performance, but also for proactive portfolio management. For example, over the last 3-6 months we might prefer currencies with lower execution-algo impact, such as the Australian dollar, to those where this impact has been most prominent, such as the euro. Fig. 4 illustrates the relative performance of the same simple mean-reverting strategy in both currencies over the latest 17 months, and it is clear that the Aussie has lately outperformed the euro thanks to its less “random” intraday price behavior.
Of course, all the examples provided above are somewhat simplified, in the sense that we should always take into consideration a number of metrics based on various fundamental phenomena. Nevertheless, we can see the key advantages of the quantamental approach, especially compared to traditional fundamental market analysis.
First of all, with this approach actual trading decisions need not be based on any particular fundamental data such as economic indicators. Second, when used alongside traditional quantitative models, fundamental factors may seriously improve their performance in changing market regimes, which makes quantamental models especially attractive in modern, low-volatility markets. And finally, the quantamental approach eliminates the need for discretionary guessing, since each of the fundamental factors considered causes a certain process in the market that can ultimately be measured; using quantitative metrics allows for developing strategies that are less prone to human discretion, dramatically reducing long-term systemic risks.