Sifting the sands of market data for gems of price and liquidity requires building mapping and analytical technology that is often beyond the resources of buy-side trading desks.
A trader only wants to see relevant information. Every second consumed by getting to grips with a picture is a second wasted in putting what it shows to use in the market.
“The most relevant information could be axes or inventory that have come in within the last five minutes,” says Dinos Daborn, co-founder and director of technology provider AxeTrading. “They don’t want to see something that came in yesterday, so we give them the tools to view it, all in one screen. Obviously normalising the data when it’s so distinct is the challenge.”
The ongoing trend of building data aggregators on the buy-side credit desk is notable for two reasons. Firstly, the cost needs to be justified, and buy-side desks have often struggled to get budget; overcoming that is often a structural issue around how the desk is funded.
The head of fixed income trading at a European tier 1 asset manager says, “We are set up to control our own budget, it is not like we have to go to the global head of fixed income or the chief investment officer. Our own pricing data project ran over quite a few years, and we have people who only work on our proprietary trading tools, so it is ongoing and we allocate a lot of our IT budget to it.”
For buy-side firms with assets under management closer to US$100 billion than US$1 trillion, getting a dedicated budget for a fixed income trading desk is not so easy. The higher proportion of voice trading in debt markets – relative to equities – has meant that processes for quantifying aspects of trading are late in development.
The other challenge stems from the technical complexity. Advances in technology are making the use of data and analytics on the trading desk simpler.
Brian Cassin, head of Product and Strategy for North America at market data and technology provider, Vela Trading Technologies, says, “Part of the electronification of fixed income is coming down to computing power becoming less expensive, so we can run software with a smaller hardware footprint, and keep performance to where people would want to see it. Although some instruments are very thinly traded, for the very liquid markets we believe the hypothesis that the market is going to move more and more electronic.”
The most public example of these data analytics systems is AllianceBernstein’s ALFA pre-trade liquidity aggregation tool, the intellectual property for which was sold in May to Algomi, a pre-trade liquidity data provider.
The AB team architected a system that first consumed multiple data feeds from trading platforms and venues, most of which were based on the FIX messaging protocol, along with others such as Bloomberg B-PIPE. After consuming and aggregating all of those feeds, the system allowed a trader to see the price for a given instrument, identified by ISIN, across the different platforms.
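The consume-then-aggregate step described above can be sketched in a few lines; the venue names, quote fields and prices here are illustrative assumptions, not details of AB's actual system.

```python
from collections import defaultdict

# Hypothetical normalised quotes as they might arrive from FIX-based
# venues or Bloomberg B-PIPE: (platform, isin, bid, ask).
quotes = [
    ("VenueA", "XS0123456789", 99.10, 99.40),
    ("VenueB", "XS0123456789", 99.15, 99.35),
    ("VenueA", "XS0987654321", 101.00, 101.50),
]

# Key the aggregated book by ISIN so a trader sees every platform's
# price for a given instrument in a single view.
book = defaultdict(list)
for platform, isin, bid, ask in quotes:
    book[isin].append((platform, bid, ask))

for isin, levels in book.items():
    best_bid = max(b for _, b, _ in levels)
    best_ask = min(a for _, _, a in levels)
    print(isin, "best bid:", best_bid, "best ask:", best_ask)
```

In a live system the list would be replaced by a streaming update per quote, but the keying-by-ISIN idea is the same.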
Usman Khan, chief technology officer and co-founder of Algomi says, “Just that information in itself in one view is very, very powerful and it’s all real time as well, so as the prices change, or as more information arrives, they get alerted on that same screen.”
Under the cover
Platforms like this can show depth and prices, then apply proprietary analytics to derive the price the firm is looking for. Firms with the right data can also look at how they have traded a bond in the past, combining trade history and transactional data; augmenting market data with internal databases gives better guidance.
However, this is technologically a challenging beast. Just moving data from one place to another is not easy.
Mark Corbet, head of connectivity at execution system provider TradingScreen, says, “The FIX protocol is the de facto financial messaging format, but even within FIX itself you have different versions of the protocol, you have got 4.1, 4.2, 5 and FAST. If you are looking at fixed income you are getting individual pricing information from different sell sides and in the same version of FIX some sell sides would use different messages than others. So even with the same spec version, you still can have different variations.”
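The variation Corbet describes is easy to see in the wire format itself: a FIX message is a string of tag=value pairs separated by the SOH character, with tag 8 carrying the protocol version and tag 35 the message type. The sketch below is a minimal illustration, not TradingScreen's code; the two sample messages assume one sell side quoting with a Quote message (35=S, tag 132 BidPx) and another with market data (35=X, tag 270 MDEntryPx) for the same bond.

```python
SOH = "\x01"  # FIX field delimiter

def parse_fix(raw: str) -> dict:
    """Split a raw FIX message into a {tag: value} dict."""
    return dict(field.split("=", 1) for field in raw.split(SOH) if field)

# Two sell sides, same bond, different FIX versions and message types.
msg_a = SOH.join(["8=FIX.4.2", "35=S", "55=XS0123456789", "132=99.10"])
msg_b = SOH.join(["8=FIX.4.4", "35=X", "55=XS0123456789", "270=99.15"])

for raw in (msg_a, msg_b):
    fields = parse_fix(raw)
    print(fields["8"], fields["35"], fields["55"])
```

An aggregator has to recognise that tags 132 and 270 both carry a price here, which is exactly the per-counterparty mapping work the normalisation teams take on.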
To offer valuable pre-trade insight, Algomi employs a sophisticated technology stack. It has a graphical user interface, an application programming interface (API) that provides connectivity with other systems and data feeds, and at the heart of its infrastructure is an enterprise messaging layer.
“We have a set of micro services, where each micro service is concerned about a particular operation,” explains Khan. “So a typical operation could be that a customer enters an enquiry into our platform, and that enquiry goes from various states. We use the micro service to manage the state of that enquiry for example, and that’s all done in real time. So all of our micro services, that are managing different actions essentially, are connected via an in-memory data grid.”
Within the business logic layer, individual microservices can access application state in parallel and independently of each other, so the system can run many actions concurrently and sustain a heavy load.
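The enquiry lifecycle Khan describes can be sketched as a small state machine whose current state lives in shared storage. This is an assumption-laden illustration: the state names, the allowed transitions, and the plain dict standing in for the in-memory data grid are all invented for the example.

```python
from enum import Enum, auto

class EnquiryState(Enum):
    NEW = auto()
    PRICED = auto()
    TRADED = auto()
    EXPIRED = auto()

# Allowed lifecycle transitions for an enquiry (illustrative only).
TRANSITIONS = {
    EnquiryState.NEW: {EnquiryState.PRICED, EnquiryState.EXPIRED},
    EnquiryState.PRICED: {EnquiryState.TRADED, EnquiryState.EXPIRED},
}

# A plain dict stands in for the in-memory data grid holding shared
# state; each microservice would read and update it independently.
grid = {}

def advance(enquiry_id: str, new_state: EnquiryState) -> None:
    """Move an enquiry to a new state, rejecting illegal transitions."""
    current = grid.get(enquiry_id, EnquiryState.NEW)
    if new_state not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current} -> {new_state}")
    grid[enquiry_id] = new_state

grid["enq-1"] = EnquiryState.NEW
advance("enq-1", EnquiryState.PRICED)
advance("enq-1", EnquiryState.TRADED)
print(grid["enq-1"])
```

Because each microservice only touches the grid entry for the enquiry it is handling, many such state updates can proceed in parallel without the services coordinating directly.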
“Behind that we have a very sophisticated integration layer, which is the result of integrating with a myriad of different systems across 18 banks,” says Khan. “We have learnt how to integrate with different types of databases, different types of message buses, different file systems, chats and image scraping, so we can run optical character recognition to consume information.”
The firm has its own proprietary data schema: for any enquiry about a given bond, it has a client ID, a price, a quantity and a large set of fields common to the data points it receives in different formats.
“Essentially when we do the integration we normalise all those data points into our normalised structure,” Khan explains. “And that normalised structure is then what gets transmitted to our system, put onto our message bus, operated on by these micro services, and the states are then updated in the memory data grid, and finally into a big database. So we have a big data store and essentially it manages all the data volumes.”
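The normalisation Khan describes amounts to mapping each counterparty's field names and conventions onto one common structure before anything downstream sees the data. A minimal sketch, assuming two hypothetical venue formats and made-up field names (`ClientID`, `size_thousands`, and so on — none of these are Algomi's real schema):

```python
# Venue A sends strings with Bloomberg-style field names (assumed).
def normalise_venue_a(raw: dict) -> dict:
    return {
        "client_id": raw["ClientID"],
        "isin": raw["ISIN"],
        "price": float(raw["Px"]),
        "quantity": int(raw["Qty"]),
    }

# Venue B uses different names and quotes size in thousands (assumed).
def normalise_venue_b(raw: dict) -> dict:
    return {
        "client_id": raw["counterparty"],
        "isin": raw["instrument"],
        "price": raw["price"],
        "quantity": int(raw["size_thousands"]) * 1000,
    }

a = normalise_venue_a({"ClientID": "C1", "ISIN": "XS0123456789",
                       "Px": "99.10", "Qty": "500000"})
b = normalise_venue_b({"counterparty": "C1", "instrument": "XS0123456789",
                       "price": 99.10, "size_thousands": "500"})
assert a == b  # once normalised, downstream services treat them identically
print(a)
```

The payoff is that the message bus, the microservices and the data store only ever deal in one structure, however many bank-specific formats feed the integration layer.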
Getting development in-house in order
The challenge then for a buy-side trading desk is to find the budget and skill sets to build these technologies in house.
“We have a dedicated team in TS, called client integration services,” says Corbet. “This team works with the buy side and the sell side to normalise all data sources and message inputs at point of entry, creating a consistent playbook for all TS stakeholders to make better-informed market moves.”
Accessing that talent pool at the current point in time will be hard for those investment firms without deep pockets – not only is MiFID II occupying much spend on human capital, the rise of the quantitative hedge fund is putting a strain on the available resources.
“The difficulty I see for a lot of the buy side is having access to the guys who can implement that technology,” says Simon Vincent, sales manager for Capital Markets at Software AG. “The role of ‘data scientist’ is the new ‘compliance professional’ in the sense of demand for skills and salaries.”
©TheDESK 2017