Are there new tools available to help identify the cause and effect of specific trading techniques and suggest new combinations of algorithms to adopt? Similarly, are there tools available to help back-test and fine-tune new execution strategies prior to deployment? Or, better still, are there metrics that can be used to measure the success of specific algorithms, or to determine whether traders are using an optimal combination of FX algorithms?
A clear sign that FX algorithms are becoming more popular is the fact that the major sell-side banks (Credit Suisse, Deutsche Bank and Citi among others) are continuing to develop and market new execution algorithms for their buy-side clients. Deutsche Bank has just released its AutobahnFX Algo, which is designed to fine-tune its FX execution service by offering a greater choice of strategies. The three algorithms on offer are named Limit Order+, Slicer and Limit Order Slicer.
The first strategy, Limit Order+, is aimed at clients who want to execute their trades at a specific price and are looking primarily for price improvement, while the Slicer algorithm is designed to minimise market impact for traders placing large orders by splitting those orders and drip-feeding them to the market. The final strategy, Limit Order Slicer, combines this split-and-drip-feed capability with the additional feature of leaving orders to capitalise on favourable market movements.
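To make the slicing idea concrete, the following is a minimal Python sketch of a generic split-and-drip-feed loop. It illustrates the general technique only, not Deutsche Bank's implementation; the clip size, the randomised pause and the send_child callback are all assumptions.

```python
import random
import time

def slice_order(total_qty, clip_size, interval_s, send_child):
    """Split a large parent order into child clips and drip-feed them.

    A generic illustration of order slicing, not Deutsche Bank's
    Slicer. `send_child` is a hypothetical callback that submits one
    child order to the market.
    """
    remaining = total_qty
    while remaining > 0:
        qty = min(clip_size, remaining)
        send_child(qty)
        remaining -= qty
        # Randomising the pause makes the drip-feed harder to detect.
        time.sleep(interval_s * random.uniform(0.5, 1.5))

# Example: work a 50m parent order in 5m clips (a 1-second interval
# here only to keep the demonstration quick).
slice_order(50_000_000, 5_000_000, 1,
            lambda qty: print(f"child order: buy {qty:,}"))
```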
"Clients can now decide in advance exactly how large orders will be executed: in what size, what frequency and at what spreads," says Ian O'Flaherty, global head of FX e-commerce at Deutsche Bank. "They can see, second by second, how their order is progressing and can stop, pause or amend trades at the click of a button if market moves become unfavourable during execution."
Meanwhile Citi has just launched Ripple, the second algorithm in its CitiFX Intelligent Orders product suite, which the bank has described as the "second generation" of algo services for the FX market, following its initial algorithm, SilentPartner, released earlier in 2009. According to James Dalton, director of FX e-commerce at Citi, Ripple works orders against the liquidity available through the bank's electronic trading network. Fill speeds and the prices achieved are optimised according to current market conditions in order to minimise the risk of slippage.
In terms of whether customers favour customised or off-the-shelf algos, Dalton thinks it is important for them to know exactly what they are looking to achieve in their trading. "What is important to consider here is not just the differences between the strategies; you need to begin with an understanding of the underlying motivations for a trade to be executed in the first place. In FX there are many different players with a wide variety of business drivers and different ideas of what the right strategy may be."
Any decision should take into account current market conditions (liquidity, volatility), upcoming structural events (fixings or option expiries) and the amount of risk the trader is prepared to carry while an execution is in process, says Dalton. And increasingly this decision is leaning towards customised algorithms rather than commoditised, off-the-shelf products. "Our experience has shown us that customised strategies are going to do a better job than, say, a vanilla time-slice or pegging solution because they are far more adaptive to market conditions at the time of trading. I would take this a step further and state that off-the-shelf solutions are more likely to be mechanical than intuitive and therefore require a greater degree of oversight by the trader when in use."
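The distinction Dalton draws can be seen even in a toy example. The sketch below contrasts a mechanical, fixed-size clip with one that adapts its size to prevailing volatility; the scaling rule and the figures are illustrative assumptions, not Citi's logic.

```python
def adaptive_clip(base_clip, current_vol, normal_vol):
    """Toy illustration of adaptive versus mechanical slicing.

    A mechanical time-slice would always send `base_clip`; an adaptive
    strategy shrinks child orders when volatility is elevated and grows
    them when conditions are calm. The linear scaling rule here is an
    assumption made purely for illustration.
    """
    ratio = current_vol / normal_vol
    # Trade smaller clips into turbulent markets, larger into quiet ones.
    return max(int(base_clip / ratio), 1)

print(adaptive_clip(5_000_000, current_vol=2.0, normal_vol=1.0))  # 2,500,000
print(adaptive_clip(5_000_000, current_vol=0.5, normal_vol=1.0))  # 10,000,000
```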
In terms of how FX tools can help to identify the best combinations of algorithms to adopt, market experience is vital, says Dalton. "The market has experienced sufficient distortions in recent years to become well aware of the fact that what works on one day may have to be discarded the next. What we've learnt can be applied at both a macro and micro-structure level. While retrospective analysis and back-testing are mandatory steps in the development cycle, there is no substitute for the 'live markets', where road testing must be completed on algorithms before handing them over to clients. In this business you need to be able to monitor the progress of any order in a real-time environment. The tools available to us are extremely nimble and subtle adjustments to logic can now be deployed within hours as opposed to days or weeks."
Dalton also envisages that as advanced FX trading strategies such as statistical arbitrage continue to grow, there will be an increasing amount of focus on the techniques required to 'cloak' an execution on behalf of a client. "There are various ways to approach the problem. While order routing and placement strategy is important, the introduction of dark pools of liquidity and internalisation models will ensure that the sophistication of methods applied increases rapidly. Aside from the techniques that evolve, this problem will also have an impact on the structure of the interbank market. Flow will be attracted towards exchanges and broker venues that can offer good liquidity as well as dealing protocol that protects against undesirable behaviour."

Kim Bang, president and chief executive of Bloomberg Tradebook, agrees with Dalton's view that the choice of customised versus off-the-shelf algorithms depends very much on how they are going to be deployed. "The first question relates to the intended use. If it is for trade idea implementation, then there are a number of very good broker-provided algos at your disposal," says Bang.
"In choosing a good implementation algo, you want to find a provider with access to high quality market data, an extensive network of liquidity venues and market participants. You would want the necessary trading tools at your disposal to source liquidity across multiple venues with maximum spread capture and minimal market impact. On the other hand, if the algo is intended to be an alpha-generating trading strategy designed to buy low and sell high, then you need to invest in proprietary research and development."
There are firms who can provide market data and back-testing for complex event processing (CEP), says Bang, and he describes how Bloomberg's Tradebook product provides access to Bloomberg's market data over an API, with a number of base trading modules for currency pairs, multi-asset class trading, news releases and economic announcements, and 'if-then' trading strategies. These trading applications can be customised and automated, leveraging Tradebook's suite of execution, DMA and algo strategies.
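Tradebook's actual modules are proprietary, but a hypothetical 'if-then' rule of the kind Bang describes might look like the following sketch, in which an economic release triggers a spot order. The Event structure, the US_NFP name, the threshold and the place_order callback are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str    # e.g. "econ_release", "price_tick"
    name: str
    value: float

def if_then_strategy(event, place_order):
    """Hypothetical 'if-then' rule: a condition on an incoming market
    event triggers a spot transaction. Not Tradebook's API; purely a
    sketch of the pattern described above.
    """
    if event.kind == "econ_release" and event.name == "US_NFP":
        # If non-farm payrolls surprise to the upside, buy USD/JPY spot.
        if event.value > 200_000:
            place_order("USDJPY", "BUY", 1_000_000)

if_then_strategy(Event("econ_release", "US_NFP", 250_000),
                 lambda sym, side, qty: print(side, f"{qty:,}", sym))
```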
Bang believes that Tradebook's approach will indeed help clients to identify the cause and effect of specific trading techniques and suggest new combinations of algorithms to adopt. "We are fully integrated into the Bloomberg Professional service that has many of those technical and quantitative analytics to assist traders in identifying profitable trading opportunities. For example, traders can set alerts on FX options activity or a Tom Demark market timing indicator and trigger a spot transaction."
As for the impact on the future development and implementation of FX algorithms that will result from the evolution of more advanced FX trading strategies, Bang emphasises that any venue wanting to attract order flow from high frequency traders will have to build in certain measures to protect themselves from arbitrage and to ensure short-term cash flow continues. "Many dealers and ECNs are interested in attracting trading volumes and growing market share," says Bang.
"High frequency traders have the potential to trade significant volumes but tend to be liquidity takers and pose a challenge to traditional dealers. As a result banks are building quote price protection, client profiling and reverse algo engineering in order to minimise adverse results and to anticipate short-term money flows. If you are a small lot trader, this environment is fine for trading $1-5 million. If you are an institutional buy-side money manager or hedge fund trader dealing in amounts greater than $10-25 million per order, you are best served in a liquidity venue that caters to institutional money managers."
Traders' decision-making when choosing their algorithms would be helped by evidence of both a rigorous back-testing process prior to deploying the algos and a series of proven benchmarks with which to measure the algos' effectiveness once they are in play.
Bang highlights Bloomberg's years of experience in developing algos for institutional customers trading equities across multiple liquidity venues, experience shared by most sell-side banks and brokers developing commercial algorithms. And he envisages this experience being employed to bring the same benefits to the FX market and to those traders seeking conflict-free, anonymous, high-quality algorithmic execution. "We have recently introduced multi-asset class, complex event processing and a library of core trading applications that can be customised," he says.
As with other tools used to improve execution, such as smart order routers, the biggest challenge facing users is measuring performance and finding the right tools with which to carry out this assessment. In the equities world, many buy-side heads of dealing have been using a combination of transaction cost analysis (TCA), supplied by the likes of ITG, and feedback from the brokers supplying the algorithms in question. Not only can the brokers supply the statistics for each order, they can also offer feedback on how each order was executed and whether, for example, it should have been traded more aggressively. Yet with the average trading desk using 12 brokers and receiving six algorithms from each one, it can be an arduous task to whittle down over 70 algorithms into a workable short-list of fewer than 12 without resorting to some form of trial and error.
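One way to whittle the list down without pure trial and error is to aggregate per-order TCA statistics by algorithm and rank the results. The sketch below assumes hypothetical per-order slippage records; the algo names and figures are invented.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-order TCA records: (algo name, slippage in basis
# points versus the arrival price; lower is better).
fills = [
    ("broker_A/vwap",   1.8), ("broker_A/vwap",   2.4),
    ("broker_B/slicer", 0.9), ("broker_B/slicer", 1.1),
    ("broker_C/peg",    3.0), ("broker_C/peg",    2.6),
]

by_algo = defaultdict(list)
for algo, slippage_bps in fills:
    by_algo[algo].append(slippage_bps)

# Rank algorithms by average slippage to build the short-list.
for algo in sorted(by_algo, key=lambda a: mean(by_algo[a])):
    print(f"{algo}: {mean(by_algo[algo]):.2f} bps avg "
          f"over {len(by_algo[algo])} orders")
```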
According to Kevin Houstoun, chairman of Rapid Addition, a software company that produces tools for monitoring trading performance, the trial-and-error element of using algorithms could be reduced by some kind of utility that pulls together aggregated execution data, so that firms can assess their execution in comparison with everyone else rather than in isolation.
"They need to be able to simulate lots of trades to see if they were just unlucky when they traded," says Houstoun. "The ideal position would be to see what happened when other firms used the same kind of algorithm in similar circumstances." The assessment of algos is often made harder by the huge variability in performance of infrastructure, says Houstoun. Closer examination of the figures quoted by algo providers can uncover a huge distance between the fastest and slowest execution speeds for the same algorithm. "They should not be asking how fast the algo is but how consistent it is."
Agency broker ITG provides a TCA service for buy-side dealers that enables them to measure their complete transaction costs for all trades in absolute terms against a range of benchmarks. It also runs a peer group measurement service so that managers can see how their trades measure up against a selection of similar managers. More recently ITG has worked on developing a service that isolates the performance of specific algorithms from within these measures of total transaction costs, says Rob Boardman, ITG's head of electronic trading. "This is something that we have worked on: ensuring that asset managers can provide us with richer data in two areas in particular. The first is execution style and, if an algo was used, what the parameters were. The second is timing: what was the price of the asset when the manager made a decision, when the order was given to the dealing desk, when the order was sent to the broker and when the order was executed? Getting richer and richer time-stamping data allows us to better protect the alpha that might be in their orders and measure how much alpha might be leaked."
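The richer time-stamping Boardman describes allows slippage to be decomposed stage by stage, showing where alpha leaks between decision and fill. A minimal sketch, using hypothetical prices prevailing at each stage of a single buy order, might look like this:

```python
def shortfall_decomposition(*, decision, to_desk, to_broker,
                            executed, side="BUY"):
    """Decompose slippage across the timestamps Boardman describes.

    `decision`, `to_desk`, `to_broker` and `executed` are the prices
    prevailing at each stage (hypothetical data). For a buy, any rise
    between stages is cost, i.e. alpha leaked before execution.
    """
    sign = 1 if side == "BUY" else -1
    return {
        "decision_to_desk": sign * (to_desk - decision),
        "desk_to_broker":   sign * (to_broker - to_desk),
        "broker_to_fill":   sign * (executed - to_broker),
        "total_shortfall":  sign * (executed - decision),
    }

# Example: EUR/USD prices at each stage of a hypothetical buy order.
print(shortfall_decomposition(decision=1.4500, to_desk=1.4502,
                              to_broker=1.4503, executed=1.4506))
```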
Despite the fact that ITG has been running TCA services for over 20 years, it is only in the last few years that buy-side managers have taken more notice of their execution costs, particularly in the equity markets, where regulations like MiFID have stressed the need for firms to have a best execution policy. "Those firms that trade a lot have always understood the benefit of TCA but there have also been quite a few sceptics that have questioned the accuracy of TCA," says Boardman. "We are slowly winning them over and now most people accept that TCA has value. It is not perfect but it can always improve by capturing more data of better quality. It doesn't help that there is a lack of consolidated data and the markets are more fragmented than ever, but we are trying to improve this and we welcome the debate."
While these services are more geared towards the equity markets, it is clear that many of the same principles apply to the FX market, and in time the debate that Boardman refers to will be just as relevant in FX as it is right now in equities. As Citi's Dalton states, the same level of forensic attention paid to equity trading costs will eventually be applied to the FX market, so it is up to the providers of FX algos to be prepared for this eventuality and to offer the necessary tools.
"Each client measures their execution performance in different ways, tailoring strategies to ensure that we help them achieve or beat their benchmarks is an important objective," says Dalton. "The microscope that firms apply to their equity transaction costs, has not in all cases been applied to their FX as well. What we offer is a systematic approach to cost reduction, regardless of how you brand or market an execution strategy, clients will only persist with it if it delivers hard dollar savings. We provide clients a clear framework for measuring transaction costs if they do not already have one in place, it is all part of bringing them one step closer to the market and making it easier for them to fulfil the fiduciary responsibilities they have to their customers."
The next step is finding a way to take the tools used to measure a single algorithm's performance against a single firm's benchmark and employ them to judge that algorithm in the context of the wider market, its other participants and their algorithms. "Comparing the success of algorithms against each other requires the application of a consistent set of benchmarks," says Dalton.
"The best way to do this is to use identical performance metrics against a sufficiently large portfolio of transactions to ensure the outcomes are meaningful. Historically many organisations have been dependent on a time stamped reference rate. This is fine as long as you do not attempt to execute more than the market can digest at that point in time. As the adoption of execution algorithms increases in FX we are going to see a shift towards more advanced benchmarks that take into account liquidity, dealt volume and price ranges."