Dan Barnes

Big Data - FX meets the challenge of a growing information tsunami

FX market firms are seeing a growth in trading volumes that is leaving their trading infrastructures struggling. Dan Barnes explores how, as legacy technology struggles to cope, new ‘big data’ systems offer faster, more reliable processing power.

First Published: e-Forex Magazine 48 / FOCUS / July, 2012

The scale of the increase in data processing facing capital markets businesses is hard to comprehend. Research firm Celent estimates that average daily trading volume in the foreign exchange market was around US$4 trillion last year, a 20% increase over the past three years, preceded by even greater growth of 72% before the onset of the financial crisis in 2008.

The electronification of trading is equally striking. High-frequency trading is expected to account for around 28% of FX volumes this year, compared with 5% in 2004, and the big banks now run 92% of their FX transactions through e-trading systems. As a result, firms are having to invest in their processing capacity to stay on top of the market.

The term ‘big data’ has been coined to describe the capture and processing of data sets too large to be held in a conventional relational database, often in real time. Originally used by consumer internet firms such as Google and Facebook, which have to capture and process data often...continued
