Data, Data Everywhere
In the beginning there was light…now we have data; in fact we use the light to carry our data. If the light's too bright you buy a pair of shades; if there's too much data, well…you just ignore some of it. Is there such a thing as too much data? Perhaps there isn't, but there's definitely such a thing as more data than you can analyse and process.
For algorithmic trading in foreign exchange to take the next step forward it will need to bring all data streams together and make them accessible to the trading engines.
Market data: contributed, executable, dealt, fixing, historic, tick; transaction data, post-trade data, value at risk (VaR) data; news and economic indicator data. Price data in foreign exchange is simple (after all, it's just two numbers), but the proliferation of trading venues and execution models has fragmented liquidity. The challenge, therefore, is to re-aggregate that liquidity and execute trades more efficiently than your competitors.
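To make the re-aggregation idea concrete, here is a minimal sketch of consolidating quotes from several venues into a single best bid/offer. The venue names and prices are hypothetical, and a real aggregator would also weigh size, credit, and staleness:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str   # hypothetical venue name, for illustration only
    bid: float
    ask: float

def consolidate(quotes):
    """Re-aggregate fragmented liquidity into a single best bid/offer.

    Best bid is the highest bid across venues; best ask is the lowest ask.
    Returns the quotes carrying each side of the consolidated price.
    """
    best_bid = max(quotes, key=lambda q: q.bid)
    best_ask = min(quotes, key=lambda q: q.ask)
    return best_bid, best_ask

# EUR/USD quotes from three hypothetical venues
quotes = [
    Quote("VenueA", 1.0921, 1.0924),
    Quote("VenueB", 1.0922, 1.0925),
    Quote("VenueC", 1.0920, 1.0923),
]
bid, ask = consolidate(quotes)
print(f"Best bid {bid.bid} ({bid.venue}), best ask {ask.ask} ({ask.venue})")
# Best bid 1.0922 (VenueB), best ask 1.0923 (VenueC)
```

Note that the consolidated spread (1.0922/1.0923) is tighter than any single venue's, which is precisely the prize for whoever re-aggregates fastest.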
Warp 2, Scotty
It's about the APIs, of course. You'd love to have all your rates, orders, done deals, and trade confirmations available through the same API; it's really the only way to get all this information into your programme's formulae in anything consistently approaching "real-time". Because the data will arrive from various sources via discrete feeds, you will also need a market data distribution backbone that adds minimal latency, accommodates all asset classes, and is unfazed by spikes in update rates.
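The backbone described above is, at heart, a publish/subscribe fan-out. The toy sketch below shows the shape of it, with hypothetical topic names; a production backbone would run across processes and add conflation or batching to ride out update-rate spikes:

```python
from collections import defaultdict
from typing import Any, Callable

class MarketDataBus:
    """Toy single-process market data backbone.

    Feeds publish updates by topic (e.g. "FX.EURUSD", "orders",
    "confirmations" -- names are illustrative), and trading engines
    subscribe by topic, so rates, orders, and confirmations all
    arrive through one interface.
    """

    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, update: Any) -> None:
        # Synchronous fan-out: no queues or copies, so minimal added latency.
        for handler in self._subs[topic]:
            handler(update)

bus = MarketDataBus()
ticks = []
bus.subscribe("FX.EURUSD", ticks.append)
bus.publish("FX.EURUSD", {"bid": 1.0921, "ask": 1.0924})
print(ticks)  # [{'bid': 1.0921, 'ask': 1.0924}]
```

The design choice worth noting is the synchronous dispatch: it keeps latency to a function call, at the cost of a slow subscriber stalling the feed, which is why real backbones insert per-subscriber conflation.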
For the algorithmic trading boxes, programmed to act as snipers, speed is everything, to the point where co-locating your hardware with the host is a sought-after advantage. Speed matters. It matters a lot.