The past few decades have seen what can only be described as a revolution in data. A generation ago, businesses from every industry rushed to develop an online presence. More recently, building upon the exponential growth of the internet, companies from every sector of the global economy have invested heavily in understanding and effectively leveraging data to their advantage. This explosion in data collection and analysis has been aided in no small part by the twin revolutions in mobile and social, as well as the emergence of global advertising powerhouses such as Google and Facebook.
These have been the first firms in history to accrue such rich datasets of user behaviour at scale and have pushed the boundaries of what they can be used for.
Your marketing team is currently engaged in making the most of the traffic and conversion data that your various online endeavours are generating. However, your brokerage also has access to a massive supply of market and trading data that, when sliced the right way (both in real time and after the fact), can be a rich source of actionable insights, optimisations and potential cost savings for your business.
Slicing the sushi
Excluding marketing and sales, your data ecosystem roughly breaks down as follows:
- The various data feeds you’re paying for. These include price feeds from liquidity providers, exchanges, or third-party vendors. These data are available only to subscribers of those services.
- Data from your own trading infrastructure. Among other things, these detail your clients’ trading behaviour across time, the performance of your various liquidity providers, and the transaction costs associated with dealing with them. These data belong exclusively to your business.
- Publicly available data such as market news, economic reports, and sentiment data. These data help put your client behavior and LP responses in context. They can also be used to develop models and test response scenarios, which can include client reactions to the release of economic data, broader market reactions, and how your LPs perform in such scenarios. Sentiment data can also be used to spot changing trends in market segments, instruments, or individual symbols.
It should be noted that each of the above can be further divided into real-time and historical analysis. As you’ll see below, there are important optimisations that can be made in response to real-time data, as well as through the analysis of historical data for back-testing purposes.
Real-time data
Every day your business generates a mountain of data through your LP relationships alone. Being able to monitor this data for spikes and other outlier events, as well as for spread changes and deviations from other venues, confers a number of benefits on your brokerage. Assuming that your systems are set up correctly, you can use this data to hot-swap between LPs at critical times without any disruption to your clients.
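A minimal rolling monitor for a single LP’s quotes might look like the following Python sketch. The window size, z-score threshold and deviation threshold are illustrative placeholders, not recommendations, and a production system would hook into your actual feed handlers:

```python
from collections import deque
from statistics import mean, stdev

class QuoteMonitor:
    """Rolling monitor for one LP's quotes.

    Flags spikes relative to the LP's own recent history, and
    deviations from a reference price (e.g. an aggregate of
    other venues). All thresholds here are arbitrary examples.
    """

    def __init__(self, window=100, z_threshold=4.0, ref_deviation_bps=10.0):
        self.window = deque(maxlen=window)      # recent mid prices
        self.z_threshold = z_threshold          # spike sensitivity
        self.ref_deviation_bps = ref_deviation_bps

    def on_quote(self, mid, reference_mid=None):
        alerts = []
        # Spike check: z-score against this LP's own recent history.
        if len(self.window) >= 30:
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(mid - mu) / sigma > self.z_threshold:
                alerts.append("spike")
        # Off-market check: deviation from other venues, in basis points.
        if reference_mid is not None:
            dev_bps = abs(mid - reference_mid) / reference_mid * 1e4
            if dev_bps > self.ref_deviation_bps:
                alerts.append("off-market")
        self.window.append(mid)
        return alerts
```

An alert stream like this is what would drive the hot-swap logic: when one LP starts tripping "spike" or "off-market" flags, routing can fail over to a healthier venue.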
It also allows you to introduce important redundancies in an efficient manner, and to dampen the volatility in pricing that may arise from being married to any single source. The market data provider dxFeed, for example, offers a multi-source FX reference feed with adjustable component weights. A similar approach can be used to aggregate data on any instrument across your own liquidity providers.
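A weighted aggregation of this kind can be sketched in a few lines. Renormalising the weights over whichever sources are currently publishing is one simple way to let a stale or disconnected source drop out; the source names and weights below are hypothetical:

```python
def weighted_reference_mid(quotes, weights):
    """Blend mid prices from several sources into one reference price.

    quotes:  {source_name: mid_price} for sources currently publishing
    weights: {source_name: weight}, adjustable per source; weights are
             renormalised over the sources actually present, so a
             disconnected source simply drops out of the blend.
    """
    live = {s: w for s, w in weights.items() if s in quotes and w > 0}
    total = sum(live.values())
    if total == 0:
        raise ValueError("no live sources")
    return sum(quotes[s] * w for s, w in live.items()) / total
```

The same function works whether the sources are external venues or your own LPs, which is what makes the approach reusable for any instrument you aggregate.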
Where execution is concerned, the right order routing and execution technology, when informed by the above insights, allows your clients to execute at a volume-weighted average price aggregated across your liquidity providers.
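The aggregate price itself is just a volume-weighted average over the fills your router obtains from each LP. A minimal sketch, using hypothetical fill tuples:

```python
def vwap_across_lps(fills):
    """Volume-weighted average price over fills from multiple LPs.

    fills: iterable of (lp_name, price, quantity) tuples.
    Returns the aggregate price the client effectively receives.
    """
    fills = list(fills)
    total_qty = sum(qty for _, _, qty in fills)
    if total_qty == 0:
        raise ValueError("no volume")
    return sum(price * qty for _, price, qty in fills) / total_qty
```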
The ability to look back and analyse historical data is just as important as anything you can do with your real-time data. Historical market data can be used to run historical “post-mortems” on market reactions to everything from routine economic data releases and geopolitical news, to “black swan” events. This information can then be used to create dealing strategies and back-test them against market data to see how they would have performed. Again, this can be done on a per-feed, or even a per-symbol basis.
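Such a post-mortem can start from something as simple as measuring the price move and realised range in a window around an event timestamp. A sketch, assuming ticks are (timestamp, mid) pairs sorted by time and windows are measured in seconds:

```python
def event_window_stats(ticks, event_time, pre=60, post=60):
    """Post-mortem of a news or data release on one feed or symbol.

    ticks: list of (timestamp_seconds, mid_price), sorted by time.
    Returns the last pre-event mid, the net move over the window,
    and the realised range, both expressed in basis points.
    """
    window = [p for t, p in ticks if event_time - pre <= t <= event_time + post]
    before = [p for t, p in ticks if event_time - pre <= t < event_time]
    if not window or not before:
        raise ValueError("insufficient data around event")
    pre_mid = before[-1]
    return {
        "pre_mid": pre_mid,
        "move_bps": (window[-1] - pre_mid) / pre_mid * 1e4,
        "range_bps": (max(window) - min(window)) / pre_mid * 1e4,
    }
```

Running this over a library of past events, per feed or per symbol, gives the raw material for building and back-testing dealing strategies against those scenarios.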
Additionally, your Level 2 data is much more valuable than you might think. There are plenty of insights to be gleaned from back-testing on full histories of Level 2 price data. These include putting the rejection rates, partial fills and slippage of your various LPs under scrutiny, as well as conducting transaction cost analysis in order to compare the pros and cons of each liquidity provider’s service.
Historical data can be used to build a profile for each of your liquidity providers, allowing you to compare their relative performance using a number of different variables and market conditions. Aside from being useful in negotiations and in the building of liquidity relationships, these techniques also allow you to make informed decisions on how to distribute your order flow, and how to best manage spreads and commissions at different times.
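One way to sketch such a profile is to fold historical order records into a per-LP scorecard of rejection rate, partial-fill rate and average slippage. The record fields below are hypothetical; real execution logs will differ:

```python
from collections import defaultdict

def lp_scorecard(orders):
    """Build a per-LP performance profile from historical order records.

    orders: iterable of dicts with (hypothetical) fields:
      lp, status ("filled" | "partial" | "rejected"),
      quoted_price, fill_price, side ("buy" | "sell").
    Slippage is in basis points, signed so positive = worse for
    the client on either side of the trade.
    """
    stats = defaultdict(lambda: {"orders": 0, "rejected": 0,
                                 "partial": 0, "slippage_bps": []})
    for o in orders:
        s = stats[o["lp"]]
        s["orders"] += 1
        if o["status"] == "rejected":
            s["rejected"] += 1
            continue
        if o["status"] == "partial":
            s["partial"] += 1
        sign = 1 if o["side"] == "buy" else -1
        slip = sign * (o["fill_price"] - o["quoted_price"]) / o["quoted_price"] * 1e4
        s["slippage_bps"].append(slip)

    return {
        lp: {
            "reject_rate": s["rejected"] / s["orders"],
            "partial_rate": s["partial"] / s["orders"],
            "avg_slippage_bps": (sum(s["slippage_bps"]) / len(s["slippage_bps"])
                                 if s["slippage_bps"] else None),
        }
        for lp, s in stats.items()
    }
```

Slicing the same records by session, volatility regime or symbol turns this single scorecard into the per-condition comparison described above.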
The proliferation of social media has led to an avalanche of freely available opinion data, just as the tools and techniques for analysing this data are becoming more powerful and more accessible. Sentiment analysis has come a long way in recent years and brokers would do well to pay attention to it because, again, when the data are sliced the right way, they can yield all kinds of insights.
Just as humble “heatmaps” of client positioning can inform your traders and even encourage them to trade, mining individual mentions of stocks or cryptocurrencies in news and social media posts can be highly useful to them. It may even offer your dealers a timely heads-up as interest starts to build around a given asset that you offer. A modern, multi-asset brokerage has to be aware of how a bewildering array of asset classes, instruments and symbols are trending, especially considering the unpredictability of modern markets and the growing importance and influence of retail traders. Sentiment analysis, whether performed in-house or subscribed to as a service, can be a valuable addition to your existing toolkit, and is relevant both to your traders and to your own staff.
For our part, dxFeed has a number of sandbox projects experimenting with using freely available social media and news data to develop technical indicators and trading strategies that rely on analysing public sentiment. In a recent study, our teams found that a trading strategy built around sentiment from the @stocktwits Twitter account was able to outperform both idle and risk-free strategies, though the statistical significance of these results remains to be confirmed.
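The mention-mining step behind this kind of analysis can be illustrated in a few lines. This sketch only counts cashtag mentions of watched symbols; a real pipeline would add sentiment scoring, deduplication and spam filtering on top:

```python
import re
from collections import Counter

def trending_tickers(posts, watchlist, top_n=5):
    """Count cashtag mentions (e.g. "$AAPL") of watched symbols
    in a stream of social media posts -- a crude proxy for
    building interest in an asset.
    """
    cashtag = re.compile(r"\$([A-Za-z]{1,5})\b")
    counts = Counter()
    for post in posts:
        # Count each symbol at most once per post.
        for sym in set(cashtag.findall(post)):
            sym = sym.upper()
            if sym in watchlist:
                counts[sym] += 1
    return counts.most_common(top_n)
```

A rising count for a symbol over successive time buckets is exactly the kind of early heads-up a dealing desk can act on.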
Cost savings and regulatory advantages
We’ll conclude with a couple of miscellaneous data-related tips that can help you cut costs and improve your regulatory standing. Firstly, reference feeds are commonly used to monitor the quality of the feeds coming from liquidity providers. In the case of certain commodities, derived data from exchange feeds can be prohibitively expensive to license. In these instances, referring internally to futures price feeds for quality-control purposes can be a significantly cheaper solution.
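The quality-control check itself amounts to measuring each LP’s deviation from the internal reference in basis points and flagging outliers. The threshold below is illustrative, and in practice a futures reference would need basis adjustment before comparison:

```python
def deviation_bps(lp_price, reference_price):
    """Deviation of an LP's quote from an internal reference price
    (e.g. a related, basis-adjusted futures price) in basis points.
    """
    return abs(lp_price - reference_price) / reference_price * 1e4

def quality_flags(lp_prices, reference_price, max_bps=25.0):
    """Flag LPs whose quotes stray too far from the reference.
    The 25 bps threshold is an arbitrary example.
    """
    return [lp for lp, price in lp_prices.items()
            if deviation_bps(price, reference_price) > max_bps]
```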
Finally, it pays to remember that investing in your data collection, storage and analysis infrastructures confers other synergistic benefits on your business that go beyond managing liquidity relationships or arming your dealers with responses to different market conditions. A trading infrastructure that’s been developed with the kinds of data analysis described throughout this article in mind is also more than capable of easily and efficiently meeting even the most stringent of reporting requirements.
Focusing on reporting and working to build close relationships with regulators is crucial, as it allows you to stay ahead of upcoming regulatory changes that affect your business while remaining in your regulators’ good graces and avoiding sanctions. A solid relationship with regulatory authorities also confers an air of reputability and legitimacy on your operations that’s useful in marketing your services and has been found to matter to prospective traders.