The shift to automated FX trading has been huge. Electronic dealing grew to 66% of all currency transactions in 2013, more than triple the roughly one-fifth of the market it accounted for in 2001, according to Aite Group. The Boston-based consultancy forecasts that share will rise to 76% of total volume within five years and account for around 81% of spot trading – the buying and selling of currency for immediate delivery.
Such rapidly evolving markets mean that how a company manages its data can be the key to its survival. TABB Group believes finding new ways to store and manage big data will continue to be a key imperative for the future of FX trading. “Vital information is often inaccessible, non-existent or overlooked, resulting in missed opportunities to uncover hidden patterns, relationships and dependencies,” said the financial markets strategic consultancy.
Regulation drives shift to electronic trading
The shift has partly been driven by regulation. Stricter trade reporting rules, anti-evasion authority, business conduct standards and Basel III capital requirements are all prompting FX market players to shift from swaps and forwards into futures. Regulators are seeking to push more trades onto electronic platforms and more transparent regulated exchanges by making some transactions more expensive for banks.
Kevin McPartland, head of Greenwich Associates’ market structure and technology advisory service, estimates that a 5% move out of OTC FX derivatives into futures would cause FX futures volume to grow by over 50%. Higher trading and clearing costs for NDFs have already cut the proportion of investment firms using the products to 35% in 2012, down from over half in 2010.
The longer-term move onto electronic platforms has also been accelerated by investigations into the rigging of interbank lending rates in the $5.3 trillion-a-day currency market. Growing client demand for greater market transparency in transactions and pricing charges has prompted many banks to cut staff and rethink their trading strategies.
“The market is undergoing a data evolution with respect to firms ensuring that they have a detailed understanding of the nature of the data that they are consuming and onward distributing internally and externally,” said Emmanuel Doe, President of the Trading Solutions Group at Interactive Data.
“This is being driven by market regulation, particularly in the US and Europe, whereby sell-side market participants and registered swap dealers are required to display and retain larger quantities of data for extensive periods of time, as well as provide transaction reporting.”
Banks, asset managers and investors are also seeking new ways to preserve their margins in difficult markets. Cubillas Ding, a research director at Celent’s securities and investments practice in London, estimates industry players may have to reduce costs by 20-30% in the coming years to meet regulations and please increasingly demanding clients. He predicts much of this will focus on technology infrastructure, meaning many back-office functions will be outsourced to shared service providers in order to boost scale and operational efficiencies.
“In the coming years, we believe that the battleground for much of the industry will be cost-effectiveness, particularly around infrastructure,” Ding wrote in Celent’s Risk Management Outlook for 2014. “With heightened uncertainty around structural and regulatory change, there will indeed be additional cost impact and complexity. Opportunities and threats in the financial services industry are now inextricably tied to risk practices and technology infrastructure.”
The lure of new technologies
But it’s not only regulatory and cost pressures enticing market players to make the most of their data. They are also being attracted by the more advanced quantitative trading strategies now available to them. FX traders are now employing much of the advanced technology originally developed for other asset classes as regulation and cost concerns drive clients to apply best practice across different trading desks.
Peter Eggleston, head of quantitative solutions and innovations for Morgan Stanley’s fixed income e-commerce platform Matrix, said the bank is leveraging its existing expertise and infrastructure in equities to improve the functionality of its FX products. That’s been particularly effective in its algorithmic trading product suite, which has tripled its client base in the past year by attracting new players such as global treasury centres of corporates and discretionary macro hedge funds.
Morgan Stanley is not alone. While global FX markets are still fragmented compared to other asset classes, the growth of global platforms has driven a sharp rise in computer-generated trading. According to the Bank for International Settlements, algorithmic trading at EBS increased from 28% to 68% of volume between 2007 and 2013, and that trend is continuing to accelerate.
The need for speed
The need for speed is also pushing FX market players to operate more like their peers in other asset classes, such as equities. Low-latency trading has moved from executing a transaction within several seconds to milliseconds, then microseconds, and now nanoseconds. Nowadays, even a millisecond improvement in network speed offers a competitive advantage for financial institutions.
“A more competitive trading environment driven by the accelerating use of algorithms, containing trading costs, unsettled regulatory policy and rapidly advancing technology has impacted the bottom line. This has pushed quantitative trading firms to other asset classes – most notably FX,” said Louis Lovas, Director of Solutions at OneMarketData.
“Quants and portfolio managers look to capitalize on the market’s resulting gyrations, exploiting inefficiencies from human behaviour to market structure. The key enablers are effective data management; low latency is simply the ante to play the game. This has now migrated to the FX markets increasing the demand for deep data over longer time periods.”
This trend is apparent in New York, Tokyo and London, and increasingly in the Asian financial centres of Singapore and Hong Kong, which have developed a critical mass of FX market players. That has not only created a large enough pool of liquidity to attract more low-latency and algorithmic traders, but also ramped up competition to the point where the slightest competitive advantage can make all the difference.
“While external forces such as regulation differ significantly, the currency markets are largely following the same trail blazed by the equities markets when it comes to co-location and latency,” said David Taylor, Vice President of Product Management at Exegy.
“Like other asset classes, FX market participants are converging in major regional sites. Convergence in a manageable number of sites makes co-location feasible. With co-location of a critical mass of liquidity providers and takers, latency immediately emerges as a differentiator among competitors.”
Fragmentation breeds new infrastructure
But the globally distributed nature of the FX market across the US, Europe and Asia has made it difficult for traders to alleviate latency caused by physical distance. Attempting to trade even a very liquid currency pair requires collecting data from numerous locations scattered in different time zones to ensure optimal execution. Depending on the location of the trader and the matching engine, latency can vary widely from less than 15 milliseconds for a local transaction to close to 300 milliseconds for cross-border transactions.
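The physical floor under those latency numbers is set by the speed of light in optical fibre. A back-of-the-envelope sketch (the fibre refraction factor and the New York–London distance are illustrative approximations, not measured figures):

```python
# Light travels through optical fibre at roughly c / 1.5, i.e. about
# 200 km per millisecond. Any observed latency sits above this floor.
FIBRE_KM_PER_MS = 200.0

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical lower bound on round-trip time over a fibre link."""
    return 2 * distance_km / FIBRE_KM_PER_MS

# Great-circle New York-London distance is roughly 5,570 km, giving a
# round-trip floor of about 56 ms before any exchange processing at all.
print(min_rtt_ms(5570))
```

Observed cross-border figures of close to 300 milliseconds therefore reflect routing, processing and queuing overhead far above the raw distance penalty.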
In response, vendors have started to develop new market data infrastructure in a bid to capitalise on the growing need for speed. Companies such as MarketFactory have launched new ultra-fast feed handlers specifically for the FX market that facilitate data flow between and within companies in fractions of a second. The growth of these products has sparked investor interest: Fixnetix, which provides ultra-low-latency hosted solutions for FX and is 25% owned by NYSE Euronext, got a vote of confidence when it raised $15 million in a combination of non-dilutive loan stock from shareholders and new facilities from Barclays in September.
Still, Dan Hubscher, Director of Marketing at Object Trading, argues that speed alone is no longer enough for traders to beat the competition. “While having fast-performing trading technology may have been a key differentiator, this is now changing,” he said. “Low latency systems are necessary but not sufficient for competing in the markets. Firms need to worry less about speed and more about having a solid trading strategy in place.”
Yet one of the most significant challenges customers face when integrating such new technology is uncovering issues in the legacy systems it replaces, said Exegy’s Taylor. “We occasionally are called upon to assist customers in resolving functional issues caused by the accumulation of patches and workarounds over the years,” he said.
Exegy’s most popular product is its hardware-accelerated Ticker Plant, which consumes real-time market data feeds, uses them to perform value-added computations such as book building, user-defined composite views and basket calculations, and distributes subscribed-to content to multiple applications. The Exegy Market Data System is similar but scales up for thousands of users. Exegy has also just launched an embedded software product, the Exegy Trading Application Platform, for smaller, geographically distributed use cases.
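The “book building” computation mentioned above can be illustrated with a minimal sketch: maintain the latest quote from each venue and derive a composite best bid/offer across them. This is a conceptual toy, not Exegy’s implementation; the venue names and prices are invented for illustration.

```python
# Toy composite order book: track the top-of-book quote per venue and
# compute the best bid (highest) and best offer (lowest) across venues.
class CompositeBook:
    def __init__(self):
        self.quotes = {}  # venue name -> (bid, ask)

    def on_quote(self, venue: str, bid: float, ask: float) -> None:
        """Apply the latest quote update from one liquidity venue."""
        self.quotes[venue] = (bid, ask)

    def best_bid_offer(self):
        """Composite BBO: best bid is the max bid, best offer the min ask."""
        best_bid = max(q[0] for q in self.quotes.values())
        best_ask = min(q[1] for q in self.quotes.values())
        return best_bid, best_ask

book = CompositeBook()
book.on_quote("VenueA", 1.3652, 1.3654)   # illustrative EUR/USD quotes
book.on_quote("VenueB", 1.3651, 1.3653)
print(book.best_bid_offer())  # (1.3652, 1.3653)
```

A production ticker plant performs this at full market-data rates in hardware, but the underlying aggregation logic is the same.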
OneMarketData’s Lovas, whose flagship OneTick product uses a time-series tick database, a complex event processing layer and an analytics engine to extract value from large data sets, said another challenge for clients is dealing with the vagaries of market data.
“A financial practitioner’s worst fear is spending more time processing and scrubbing data than analyzing it,” he said. “This includes normalizing multiple FX data sources, tying indices to their constituents, ingesting cancellations and corrections and inserting corporate action price and symbol changes in equities and options markets.”
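The cancellation-and-correction scrubbing Lovas describes can be sketched simply: replay the raw message stream and apply cancel and correct messages to the trades they reference before any analysis runs. The message format here is hypothetical, invented for illustration.

```python
# Toy tick-scrubbing pass: trades are recorded, cancels remove a prior
# trade, and corrections overwrite a prior trade's price.
ticks = [
    {"id": 1, "type": "trade",   "price": 1.3650},
    {"id": 2, "type": "trade",   "price": 13.650},   # fat-finger print
    {"id": 3, "type": "cancel",  "ref": 2},           # venue cancels it
    {"id": 4, "type": "trade",   "price": 1.3648},
    {"id": 5, "type": "correct", "ref": 4, "price": 1.3649},
]

def scrub(stream):
    """Return the cleaned trade prices after cancels and corrections."""
    trades = {}
    for msg in stream:
        if msg["type"] == "trade":
            trades[msg["id"]] = msg["price"]
        elif msg["type"] == "cancel":
            trades.pop(msg["ref"], None)
        elif msg["type"] == "correct":
            trades[msg["ref"]] = msg["price"]
    return list(trades.values())

print(scrub(ticks))  # [1.365, 1.3649]
```

Real scrubbing also handles out-of-order delivery, symbology mapping and corporate actions, but the replay-and-patch structure is the same.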
Searching the cloud for answers
One way market players are seeking to optimize their data management is with cloud computing. Definitions of the cloud vary, but one generally accepted description from the National Institute of Standards and Technology calls it “a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.”
Much of the functionality provided by cloud computing is similar to older systems, only more reliable and efficient. Being abstracted, it can also evolve as new technologies emerge, making it not only agile but scalable and extremely cost-efficient. It means that investment in technology migrates from capital expenditure – in which IT systems and architecture are bought upfront at significant cost – to operational expenditure, similar to a pay-as-you-go model. Some researchers estimate that moving to the cloud can strip out more than a fifth of IT costs.
“Outsourcing connectivity makes sense because it offers banks faster, more cost-efficient access to the marketplace,” said Hubscher. “It also allows staff more time to focus on customer-centric innovation that creates competitive differentiation. Simplifying the infrastructure, using vendor-supplied and vendor-maintained systems, can allow a sell-side’s IT team to focus on innovation.”
These benefits have made cloud computing one of the fastest-growing sectors of the global technology market, with public cloud services estimated to have grown 18.5% in 2013 to $131 billion, according to technology research firm Gartner. Infrastructure as a service, including cloud compute, storage and print services, remains the fastest-growing segment of the market, expanding by 47.3% in 2013 to $9 billion.
The growth of cloud computing is also self-reinforcing, according to TABB Group. “The more cloud-based applications and managed services are accessed by a centralized user community with numerous devices, the more data will be created and backed up in the cloud,” the consultancy said in a recent report.
Yet few FX traders have made the leap into cloud computing. A recent survey by OneMarketData found that 72.7% of respondents are not yet leveraging public cloud services to support trading due to concerns about security, control, flexibility, performance and latency. Four out of 10 cited a lack of readily available integration to data sources as a barrier to adopting cloud technology. And only around a quarter thought that the cloud would be suitable to be used for trades and executions.
“Fear of the cloud is still prevalent in capital markets based on three primary reasons - perceptions on security, cost concerns on whether it is really cheaper and performance implications of virtualized CPU sharing,” said OneMarketData’s Lovas.
“However, these fears are slowly subsiding as the benefits of cloud begin to show through. The days of cloud-as-a-fad are over. It’s a game-changer creating a major paradigm shift in business initiatives with its vast computational power, storage and a wide variety of application solutions at a lower cost structure.”
Cloud computing may start to take off
Lovas believes 2014 could be the year when cloud computing really takes off as companies seek to make new technology savings through hosted services. More than three-quarters of respondents to his survey, which drew on investment banks, asset managers, prop trading firms and academia, said they are thinking about starting to use cloud technology this year, primarily for data-intensive back-office processes. Top among them are data storage – deemed the most suitable for the deployment of cloud technology by 78.2% of those asked – large-scale trade model back testing, quantitative research, and transaction cost and post-trade analysis.
Interactive Data is already using the cloud to provide customers with bulk access to years of tick data, as well as on-demand tick and summary information. “Large extensive data sets are needed for each of these activities, therefore enabling access through an ‘on-demand’ utility model through cloud is a sensible approach,” said Doe. “These content sets, packaged together with our ability to provide small, medium and large scale hosting deployments across our global co-location data centers, provide our customers with compelling economies of scale.”
The growth of social media as a tool for FX data dissemination is also expected to drive take-up of cloud technologies. Market data, news content and social media content – all of which are playing an increasingly important role in FX trading – were also deemed suitable for storage and deployment in the cloud by more than two-thirds of those surveyed by OneMarketData. That’s particularly important given that the use of data from companies like Twitter and Facebook is expected to grow exponentially in the coming years.
TABB Group, in its February report, Capital Markets Data Storage: Strategic Imperatives for a New Computing Era, said some institutions are also migrating services such as human resources, accounting, archiving and disaster recovery to the cloud. That’s partly because the cloud offers a particularly good option as a standby virtual server environment for archiving and disaster recovery. It is also particularly effective for mid-tier institutions, which are seeking to cut costs by bypassing private cloud investment altogether.
“Rather than incur the costs of building and maintaining infrastructure, paying software maintenance fees and supporting IT and compliance staffs, smaller firms are moving directly to cloud and managed service providers in order to compete more effectively with their larger counterparts,” the report said.
The proliferation of algorithmic and low-latency traders has also ramped up the pressure to cut the turnaround time for new strategies. On average, the entire process from idea generation to implementation can take anywhere from 10 to 28 weeks. Given that many short-term strategies remain effective for only three to four months, rapid construction and implementation of new alpha models is becoming more urgent.
Jared Broad, founder of cloud-based FX data analysis library QuantConnect, argues that only cloud computing is powerful enough to perform the complex data analysis required by low-latency traders today quickly enough. His own firm gives clients access to a powerful cloud-based programming library where they can access big FX data – every price movement in FX markets since 2007 – to develop and test their own programs quickly and at a reduced cost.
“There is so much data being created in the market today you need more than one computer to deal with it,“ said Broad. “Tight regulations on data security, and fears of trading strategy theft, stop larger institutions from adapting to use the cloud. Some even still use the same Excel spreadsheets and terminals they had in the 80s. Only forward-thinking quantitative hedge funds have similar levels of technology infrastructure.”
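The model back-testing that firms like Broad’s parallelise in the cloud can be sketched in miniature: compute fast and slow moving averages over a price series and read off crossover signals. The prices and window lengths below are invented for illustration; a real back-test would run this over years of tick data.

```python
# Minimal moving-average crossover sketch: go long when the fast SMA
# is above the slow SMA, short otherwise.
def sma(prices, n):
    """Simple moving average: one value per full n-tick window."""
    return [sum(prices[i:i + n]) / n for i in range(len(prices) - n + 1)]

prices = [1.30, 1.31, 1.33, 1.32, 1.31, 1.29, 1.30, 1.32]  # illustrative
fast, slow = sma(prices, 2), sma(prices, 4)
# the slow series starts two windows later, so drop the first two fast values
signals = ["long" if f > s else "short" for f, s in zip(fast[2:], slow)]
print(signals)  # ['long', 'short', 'short', 'short', 'long']
```

The appeal of the cloud here is embarrassingly parallel scale: thousands of parameter combinations like `(2, 4)` can be tested simultaneously on rented machines rather than sequentially on a desktop.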
Expanding OTC data coverage
The shift towards electronic trading is also opening up huge opportunities – and creating new challenges – for providers of consolidated feeds of OTC FX market data. Participants in the OTC markets, faced with increasing regulatory demands for enhanced transparency and market prices, can no longer rely on sourcing free data from brokers’ trading desks. Many are also seeking to avoid the cost and maintenance of direct-feed infrastructure, the complexity of managing multiple feeds, the constant changes from feed providers, and the difficulty of matching the various symbologies.
“Given the significant growth of direct market data feed usage, one might assume that consolidated feeds are dying, especially given the decline of terminals as primary trading vehicles,” says Sang Lee, Managing Partner at Aite Group and co-author of a report on the subject. “While direct feeds certainly serve a certain segment of traders, consolidated feeds are generally better for those moving into new geographies and asset classes.”
Taylor said Exegy’s technology generates real-time consolidated feeds of OTC FX market data that offer clients access to highly customized data sets. Clients can subscribe to real-time updates for specific instruments from specific markets – including a particular liquidity pool within a venue. The company’s API allows applications to source data from over 200 real-time market data sources, including ECN and bank currency feeds.
Interactive Data has also developed new, more flexible commercial models and expanded its OTC data coverage. In January, the company signed an agreement with BGC Partners to provide its globally available OTC asset class data via its Consolidated Feed and through the 7ticks network. The company also offers co-location services that were once relevant only for ultra-low-latency exchange-traded data, as well as access to post-trade data mandated by new regulations, which opens the market up for new and innovative value-added analytics, according to Doe.
Providers of live and historical data are also catering to growth in low-latency trading in FX markets by delivering data with enhanced attributes for use in trading models, algorithms and analytics. Lovas said OneMarketData’s flagship OneTick product includes over 130 built-in analytical functions and a visual modeling tool to easily assemble the semantic logic for the algorithms behind quantitative research, trade modeling and transaction cost management. “This includes order book consolidation and spread analytics, volume patterns, volatility and regression – just to name a few,” he said.
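Two of the analytics Lovas names, spread analytics and volatility, can be computed by hand from a quote series, as a minimal sketch of what such built-in functions do. The quotes below are invented for illustration, and this sample-standard-deviation estimator is just one common choice of realised-volatility measure.

```python
import math

# Illustrative EUR/USD (bid, ask) quotes
quotes = [(1.3650, 1.3652), (1.3651, 1.3653), (1.3649, 1.3652), (1.3653, 1.3655)]

mids = [(bid + ask) / 2 for bid, ask in quotes]        # mid-prices
spreads = [ask - bid for bid, ask in quotes]           # quoted spreads
avg_spread = sum(spreads) / len(spreads)

# Realised volatility as the sample standard deviation of log returns
returns = [math.log(mids[i] / mids[i - 1]) for i in range(1, len(mids))]
mean = sum(returns) / len(returns)
vol = math.sqrt(sum((r - mean) ** 2 for r in returns) / (len(returns) - 1))

print(f"avg spread {avg_spread:.6f}, realised vol {vol:.6f}")
```

Commercial analytics engines apply the same arithmetic at scale, consolidated across venues and windowed over billions of ticks.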
Next-generation disruptive technologies
Other disruptive technologies are also driving change in FX trading. The evolution of Complex Event Processing (CEP) technology is playing a disruptive role across the FX space and other asset classes. In its infancy, first-generation CEP focused largely on areas like algorithmic trading. Today, a second-generation implementation phase is increasingly seeing CEP extended into fields such as data mining, order routing, market surveillance and compliance.
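At its core, CEP means running standing rules over an event stream and firing when a pattern appears. A toy illustration, with invented event fields and an illustrative threshold, is a surveillance-style rule that flags any price move larger than a limit inside a rolling window:

```python
from collections import deque

def spike_detector(events, window=3, threshold=0.0005):
    """Flag events where price moved more than `threshold` across the
    last `window` ticks -- a toy CEP-style pattern rule."""
    recent, alerts = deque(maxlen=window), []
    for ev in events:
        recent.append(ev)
        if len(recent) == window:
            move = recent[-1]["price"] - recent[0]["price"]
            if abs(move) > threshold:
                alerts.append(ev["time"])  # record when the rule fired
    return alerts

ticks = [{"time": t, "price": p} for t, p in
         [(1, 1.3650), (2, 1.3651), (3, 1.3650), (4, 1.3659), (5, 1.3660)]]
print(spike_detector(ticks))  # [4, 5]
```

Production CEP engines express such rules declaratively and evaluate thousands of them concurrently, but the windowed pattern-matching idea is the same.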
Automated storage tiering is another method being used to improve the management of big data. It automates the critical processes of classifying data at the application layer and moving it to the optimum storage tier in order to maximize utilization. Given that around 60% of most organizations’ data is user files, of which four-fifths is seldom used 30 days after its creation, this both improves performance and cuts costs. It also supports high-frequency trading applications, which require sub-second response times and are bandwidth-intensive.
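The classification step at the heart of tiering can be sketched as a simple policy: bucket data by time since last access and assign each bucket to a tier. The tier names, thresholds and file names below are illustrative, not any vendor’s defaults.

```python
# Toy tiering policy: hot data on flash, warm data on disk, cold in archive.
def assign_tier(days_since_access: int) -> str:
    if days_since_access <= 1:
        return "flash"    # hot: low-latency solid-state storage
    if days_since_access <= 30:
        return "disk"     # warm: conventional spinning disk
    return "archive"      # cold: cheap object or tape storage

files = {"ticks_today.bin": 0, "eurusd_jan.csv": 12, "audit_2011.log": 900}
print({name: assign_tier(age) for name, age in files.items()})
```

Real tiering systems apply such policies continuously and migrate blocks transparently, which is what makes them attractive where most data goes cold within a month.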
“Auto-tiering is a must have for solid-state drives, highly virtualized and cloud-based storage environments,” said TABB Group in its February report.
Some providers are also offering technology to cater to the needs of an increasingly mobile trading community, who want to be able to access low-latency market data across tablets, the web and mobile devices. While these products can’t give direct access to the critical hardware needed for ultra-fast trading strategies, they can offer access to a “thin front end,” as Lovas describes it, to such systems. QuantConnect’s Broad said he is hoping to launch an application for tablets next month that would give clients access to his programming library of big FX data.
Taylor said Exegy’s Ticker Plant and Market Data Systems are already servicing a broad spectrum of applications, including desktops and databases that drive web and mobile applications. “The Total Cost of Ownership advantages of our hardware-accelerated appliances make them attractive solutions for a large enterprise with a diverse set of applications,” he said.
But Interactive Data’s Doe believes the new world of FX trading doesn’t need any more disruption – all the technology is already out there. “The reality is that there isn’t any new technology needed at all; FX is just becoming more automated,” he said.
“The technology required for distribution in low latency form–co-location hosting services, electronic matching engines, consolidated data across more transparent markets, CEP technologies, high speed, reliable WAN distribution, etc.–has existed for many years as driven by exchange-sourced data. FX is simply ‘coming of age’ in a low latency sense as it becomes more electronic.”