Dreams of Paris, and Anomalies and TCA

30 November 2022

 

by Alex Weinrich, Financial Solutions and FX Specialist, KX

I recently had the pleasure of attending the 60th ACI World Congress at the historic Palais Brongniart in Paris. The underlying theme of the day was, without question, data. Data drove the discussions of the current economic climate, and algorithms were presented as essential to delivering relevant insights into the workflows of buy-side and sell-side participants. Here are some of the main themes discussed.

Can past data prevent future crises? The general consensus among speakers was a resoundingly optimistic yes. The financial community leverages data faster and smarter than ever before, and algorithms and machine learning warn us of potential risks by picking up anomalies in real time. First-rate technology stacks matter, whatever our role in capital markets and whatever our maturity.
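The kind of real-time anomaly flagging the speakers described can be sketched with a simple rolling z-score filter. This is a generic illustration, not any particular firm's method; the window and threshold are arbitrary illustrative values:

```python
from collections import deque
import math

def rolling_zscore_flags(stream, window=50, threshold=3.0):
    """Flag any point further than `threshold` rolling standard deviations
    from the mean of the preceding `window` observations."""
    buf = deque(maxlen=window)
    flags = []
    for x in stream:
        if len(buf) >= 2:
            mean = sum(buf) / len(buf)
            var = sum((v - mean) ** 2 for v in buf) / (len(buf) - 1)
            std = math.sqrt(var)
            # flag if the new point sits outside the threshold band
            flags.append(abs(x - mean) > threshold * std)
        else:
            flags.append(False)  # not enough history yet
        buf.append(x)
    return flags
```

In production this logic would typically run inside the streaming engine itself rather than in a Python loop, but the statistical idea is the same.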

Whether pre-trade, intra-trade, or post-trade, data processing and analytics are the lifeblood of the financial markets. Speaker after speaker spoke about data's importance in evaluating liquidity, making trading decisions, and driving algorithms. Quantitative analysts and traders require access to vast amounts of historical and real-time data for their backtesting models and to supplement analytic workflows in trading, risk management, and beyond. For pre- and post-trade analysis, data informs decisions such as:

  • When should I trade, and with whom?
  • Does the risk transfer price justify the (potential) outcomes?
  • Does my trading comply with my Best Execution policy?
  • Does my current back-testing environment confirm I trade efficiently and prudently?
  • What is my control framework?

An interesting conversation with a quant on backtesting and deploying strategies highlighted liquidity, venue, and pool analysis as ongoing concerns for both buy-side and sell-side firms. Because data streams continuously from multiple sources, ingesting it into one consolidated system remains a challenge, and the result is often untimely and expensive. Developers and quants use multiple languages, Python most often, while data formats and types vary. Data can therefore arrive delayed or incomplete, leaving backtests inconsistent with what is deployed in production. I'm stating the obvious, but backtesting should run where the data is captured; too many technology stacks, however, can't accommodate this shared workflow.
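One way to keep backtest and production consistent is to replay captured events through the very same strategy callable that runs live, so there is a single code path to validate. A minimal sketch, with a hypothetical momentum strategy purely for illustration:

```python
def replay(strategy, events):
    """Replay captured market events through the same strategy callable
    used in production, so backtest and live share one code path."""
    state, orders = {}, []
    for event in events:
        order = strategy(state, event)  # strategy keeps its own state dict
        if order is not None:
            orders.append(order)
    return orders

def momentum(state, event):
    """Toy example: buy whenever the price ticks up."""
    prev = state.get("last")
    state["last"] = event["price"]
    if prev is not None and event["price"] > prev:
        return ("buy", event["price"])
```

The design point is that `momentum` never knows whether it is fed a live stream or a historical capture; only the event source differs.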

Market impact was another major theme at the Congress. The buy-side has become increasingly focused on TCA and liquidity analysis, so the subject is no longer a sell-side-only concern. Larger buy-side firms continuously optimize their trading strategies, using historical trade and order data to build strong "partnerships" with their liquidity providers. Whether via algos or dark pools, those firms can now use past experience to generate better outcomes. Once again, quality data and backtesting are fundamental to best execution. "Evolution, not revolution": firms today constantly analyze and evaluate their strategies, seeking alpha or protecting capital, and optimizing liquidity relationships.
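At its core, the TCA the buy-side runs against its liquidity providers starts with a simple measurement: the volume-weighted execution price against a benchmark such as the arrival price. A minimal sketch (the sign convention, positive meaning cost to the trader, is one common choice, not a standard mandated anywhere):

```python
def slippage_bps(side, arrival_price, fills):
    """Arrival-price slippage ('implementation shortfall') in basis points.
    fills: iterable of (price, quantity) pairs.
    Positive result = execution cost to the trader."""
    total_qty = sum(q for _, q in fills)
    avg_price = sum(p * q for p, q in fills) / total_qty  # volume-weighted
    sign = 1 if side == "buy" else -1
    return sign * (avg_price - arrival_price) / arrival_price * 1e4
```

Real TCA layers venue, time-of-day, and market-impact decomposition on top of this, but everything rests on capturing fills and benchmarks accurately in the first place.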

ESG was also touched upon, and it is a topic I predict will receive growing attention. Although many workflows in FX and crypto need to be low latency, much can still be done to keep sustainable software engineering in focus. Firms need to be more concerned with building carbon-efficient applications, lowering carbon intensity, if we are to achieve net-zero emissions together. Demand shifting is the strategy of moving compute to regions, or times, where carbon intensity is lower, or to put it another way, where the supply of renewable electricity is high. We can also reduce the amount of data we move and the distance it travels across networks. These are just a few of the disciplines financial participants can practice to optimize their solutions and help reach, in step with the Congress, the Paris Agreement goals on climate change.

There was no sense of competition in the air at this event, just refreshing cooperation, or perhaps co-opetition is the right term! Firms compete and cooperate. Technology providers want to help market participants solve problems. Faster and easier access to data has become a necessity. Real-time analytics are the norm.

Challenge yourself and your firm! How can you achieve your goals? Do you have the right technology in place? Can you help the world achieve carbon neutrality? The technology landscape for solving these complex challenges is largely fragmented and diffuse, so technology and data teams often have to bind together many different technologies to solve a business problem; this leads to long development and deployment times and high project failure rates.

Let us know if we can help: sales@devweb.kx.com.

KX has long been at the forefront of financial innovation, particularly in trading, risk management, and surveillance, providing technology for real-time analysis of any data, at speed or at rest, and giving firms actionable insights. kdb Insights Enterprise is its integrated data management and streaming analytics platform that delivers that capability. Advanced analytics, with seamless Python integration and SQL querying, allows users to detect anomalies and derive insights, all in real time.

 

Demo kdb, the fastest time-series data analytics engine in the cloud







