On the surface, aggregating liquidity for FX trading looks fairly straightforward. Most venues support FIX, and trading protocols for spot FX are not particularly complicated. But look beneath the surface and the complexity quickly emerges. Here are some of the issues:
The FX market is extremely fragmented. There are two primary markets, dozens of ECNs and multi-dealer platforms, and 10-15 critical single bank platforms. Aggregating liquidity from all these sources can be quite daunting. Most tier 1 and top tier 2 players connect to as many as 16 different liquidity providers.
Yet even with all that fragmentation, new venues continue to launch. This year alone, more than 10 new FX trading venues were launched. Not all of them have much liquidity yet, but each offers a unique value proposition that will probably allow it to attract certain types of flow.
Inconsistent Market Structures
The market structures of these venues vary. Some send continuous price streams, some use a request-for-quote (RFQ) model, and some send banded feeds. In most cases, quotes include three pricing tiers, and an order must indicate the specific quoted price it is targeting. Time-to-live settings also vary by venue; if the quote has expired by the time the order reaches the liquidity provider, the order will generally be rejected.
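The banded-quote mechanics above can be sketched in a few lines. This is a minimal illustration, not any venue's actual message format: the `QuoteBand`, `BandedQuote`, and `accept_order` names are hypothetical, and the time-to-live check stands in for whatever expiry logic a real liquidity provider applies.

```python
import time
from dataclasses import dataclass

@dataclass
class QuoteBand:
    price: float      # quoted price for this tier
    max_qty: float    # maximum quantity available at this price

@dataclass
class BandedQuote:
    quote_id: str
    bands: list       # pricing tiers, typically three, best price first
    sent_at: float    # venue send timestamp (seconds)
    ttl: float        # time-to-live in seconds; varies by venue

    def is_live(self, now=None):
        now = time.time() if now is None else now
        return now - self.sent_at <= self.ttl

def accept_order(quote, target_price, qty, now=None):
    """Mimic the liquidity provider's check: reject if the quote has
    expired, or if the order does not reference a price actually
    quoted on one of the bands (with enough quantity)."""
    if not quote.is_live(now):
        return False  # stale quote: order is rejected
    for band in quote.bands:
        if band.price == target_price and qty <= band.max_qty:
            return True
    return False
```

The point of the sketch is that an order targets a specific band price, so an aggregator has to track which price came from which quote, and latency between quote receipt and order arrival directly determines the rejection rate.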
Dynamic Latency Issues
Trading infrastructure, connectivity and architecture are critical to successful FX trading. Network latency is rarely static, so you cannot base a strategy on a median latency figure alone: outlier latency can have a serious impact on trading and risk management results. Co-location and proximity matter a great deal when trading FX, but they come at a substantial cost. It's important to do a cost/benefit analysis that considers not only venue locations but also client locations when deciding where to place order management and routing systems.
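The point about median latency can be made concrete with a short sketch. This is an illustrative calculation, not a production measurement tool; `latency_profile` and the sample numbers are invented for the example.

```python
import statistics

def latency_profile(samples_ms):
    """Summarize one-way latency samples (milliseconds).
    The median alone hides tail outliers that can invalidate a
    strategy's assumptions, so report high percentiles too."""
    ordered = sorted(samples_ms)

    def pct(p):
        # nearest-rank percentile
        idx = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
        return ordered[idx]

    return {
        "median": statistics.median(ordered),
        "p99": pct(99),
        "max": ordered[-1],
    }

# Mostly-fast samples with a handful of outliers: the median looks
# healthy, while the tail is an order of magnitude worse.
samples = [1.0] * 95 + [2.0, 5.0, 12.0, 30.0, 80.0]
profile = latency_profile(samples)
```

Here the median is 1 ms while the worst observation is 80 ms, which is exactly the kind of gap that makes a quote stale by the time the order arrives.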
Non-trivial and non-obvious aspects of aggregation
The non-trivial and non-obvious aspects of aggregating liquidity directly affect efforts to trade algorithmically. There are opposing points of view on what information should be exposed to a strategy. One camp argues that the aggregator's normalization layer should pass as much information as possible to the strategy, which then contains the logic to decide what to hit on which band and makes the routing decisions. The other camp argues the normalization layer should be smarter, sending a simplified feed to the strategy and making routing decisions itself after the strategy makes its trading decision.
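The second approach can be sketched as follows. This is a hypothetical simplification, not a real aggregator: `simplify` and `route` are invented names, and the books are reduced to offer-side bands only to keep the example short.

```python
# "Smarter normalization layer" approach: the aggregator collapses
# each venue's banded feed into a single consolidated best offer for
# the strategy, then maps the strategy's decision back to a concrete
# venue and band when routing.

def simplify(venue_books):
    """venue_books: {venue: [(price, qty), ...]} offer bands, best
    price first. Returns (price, qty, venue) for the best offer,
    which is all the strategy gets to see."""
    best = None
    for venue, bands in venue_books.items():
        price, qty = bands[0]
        if best is None or price < best[0]:
            best = (price, qty, venue)
    return best

def route(venue_books, decision_qty):
    """After the strategy decides to buy decision_qty, the
    normalization layer, not the strategy, picks the venue and
    band and caps the order at the available quantity."""
    price, qty, venue = simplify(venue_books)
    return {"venue": venue, "price": price, "qty": min(qty, decision_qty)}
```

Under the first camp's approach, `simplify` would instead forward every band from every venue, and the hit-and-route logic inside `route` would live in the strategy itself.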
Listen to the panel discussion
We recently had a panel discussion about these areas. Our speakers were:
- Ben Ernest-Jones, Solutions Architect for Capital Markets at Progress Software
- Saul Nadata, Product Manager, Thomson Reuters Dealing Aggregator at Thomson Reuters
- Sassan Danesh, Managing Partner at Etrading Software
- James Walker, Vice President, Managed Networks Services at Tata Communications
Click here to listen to the recorded discussion. Once you do, I'd like to get your feedback.
You might also be interested in our upcoming event on Tuesday, December 4th. We'll be discussing hybrid voice and electronic trading. Click here for more information.