In the middle of last year, David Meitz began to plan for the growing demands of the asset managers and hedge funds that are his agency brokerage's customers, as they grew increasingly active in rapidly multiplying European markets.

Meitz is the managing director and chief technology officer at Investment Technology Group (ITG), based in New York. The independent agency broker and technology provider operates around the globe, with 18 offices in 10 countries, including the United Kingdom, Ireland and Spain. And since the European Union instituted the Markets in Financial Instruments Directive (MiFID) on November 1, 2007, a wide variety of new electronic trading venues, known as multilateral trading facilities (MTFs), have launched, bringing fragmentation to the purchase and sale of shares.

These venues have included BATS Europe, Instinet's Chi-X Europe, Nasdaq OMX Europe and Turquoise, which was created by nine investment banks, including Credit Suisse and Goldman Sachs. Chi-X Europe, for instance, now accounts for 13% of all electronic trading in European shares, more than the London Stock Exchange, according to Thomson Reuters.

While MiFID's stated purpose was to foster greater competition and encourage best execution by opening the European market to new, alternative trading venues, ITG's customers actively trading in Europe found themselves contending with a barrage of new market data coming from trading venues built on state-of-the-art, low-latency technology.

As the MTFs in Europe proliferated, ITG started to hear a constant refrain: "Find a way for me to have a consolidated view, in real-time, of all the trade data coming in from the new trading venues in Europe, in particular the new MTFs."

Asset and fund managers wanted the equivalent of a virtual, consolidated quote. That would allow them to take advantage more intelligently of the lower costs touted by the new trading outlets, the arbitrage opportunities they presented and the anonymity of dark pools, which do not publicly quote bids and offers.
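As a minimal sketch of that idea, in Python, the "virtual" consolidated quote is simply the highest bid and the lowest offer across all venues; the venue labels, prices and sizes below are illustrative, not actual market data.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str      # venue labels here ("LSE", "Chi-X", "Turquoise") are illustrative
    bid: float
    bid_size: int
    ask: float
    ask_size: int

def consolidated_quote(quotes: list[Quote]) -> tuple[Quote, Quote]:
    """Return the venue quotes carrying the highest bid and the lowest offer."""
    best_bid = max(quotes, key=lambda q: q.bid)
    best_ask = min(quotes, key=lambda q: q.ask)
    return best_bid, best_ask

quotes = [
    Quote("LSE", 431.20, 5_000, 431.45, 4_200),
    Quote("Chi-X", 431.25, 3_000, 431.40, 6_500),
    Quote("Turquoise", 431.15, 8_000, 431.50, 2_000),
]
bid, ask = consolidated_quote(quotes)
print(f"Consolidated quote: {bid.bid} x {bid.bid_size} ({bid.venue}) / "
      f"{ask.ask} x {ask.ask_size} ({ask.venue})")
# Consolidated quote: 431.25 x 3000 (Chi-X) / 431.4 x 6500 (Chi-X)
```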

The message to Meitz was clear: "I needed a way to quickly address all the MTF data, given the fragmentation of the EU market. After the MiFID regulation in Europe, it became apparent that this [the push for a consolidated quote] would be an issue for us," he said. That is where complex event processing technology came into the picture, as a way to make sense of, and consolidate, the deluge of data from all of these venues.

Meitz's goal was to provide a competitive advantage to ITG's customers, particularly those who used its Triton EMS software, which offers direct market access to global markets along with algorithmic strategies and pre- and post-trade analytics. That meant creating a real-time consolidated quote for major stocks without introducing latency into the firm's trading platforms in the process.

Living Streams of Data

According to Phillip Silitschanu, a senior analyst at Aite Group, ITG is the first firm to consolidate MTF data in real-time, a feat announced in October. He notes that while Thomson Reuters has released a product called Equity Market Share Reporter that aims to consolidate such data, it does not currently do so in real-time. "ITG is now offering its Triton customers a real-time service that empowers the buy side to make better trading decisions," says Miranda Mizen, a principal with Tabb Group, who points out that many European traders have complained about the absence of a consolidated tape.

Before the European Union's directive became effective, ITG's existing systems depended on market data providers that had not yet developed a real-time consolidated feed for MTFs, and whose planned delivery dates fell beyond what ITG found acceptable.

To address the problem, Meitz began to consider complex event processing (CEP) technology, which is notable for its ability to analyze many simultaneous live streams of data, consolidate that data in real-time and detect trade- and liquidity-related patterns. Chicago-based Aleri, a real-time analytics firm that specializes in complex event processing, became ITG's partner in the effort, since Meitz had already worked with Aleri on other technology challenges.

"Our prior use of complex event processing technology within our own enterprise [for the monitoring of algorithm performance] had prompted us to talk to Aleri and three other providers of CEP services," in late 2007 and early 2008, Meitz said. The firm tested all four and ultimately selected the Aleri platform.

The goal of that earlier project was to create a dashboard for the firm's client support teams, allowing them to monitor in real-time the performance of the firm's suite of algorithmic servers, track algo execution performance and stay on top of any system latency issues.

"The algo dashboard allows us to ensure that our servers are tuned for the best possible performance for our clients," Meitz said. And having successfully completed that project, Meitz determined that Aleri was the right firm to address its effort to consolidate and analyze full "market depth" in Europe's fragmented marketplace.

Aleri committed time and significant technical talent to the ITG project. "Aleri did not just provide a product," said Meitz. "They took time to understand what we were trying to build and helped ensure we got it right."

Test of Time

Aleri even delayed the rollout of an engine it was creating that would allow instant analysis of available buyers and sellers. "We had already developed a market liquidity analysis (MLA) engine," said Aleri CEO Don DeLoach. "But we saw that [ITG was] going to push hard with the use of the product and so we decided to cease most of our efforts to market the product elsewhere and make sure that we got the ITG project right."

Before the project began, the capabilities of Aleri's CEP platform were tested by the Securities Technology Analysis Center (STAC), an independent testing lab. The STAC test showed at the time that the Aleri CEP system could take in as many as 180,000 messages from the NYSE and Nasdaq order books and, within 1.2 to 1.5 milliseconds, combine them into a single book, with all orders at each price level aggregated to show total size at price.
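To make that aggregation concrete, here is a hedged sketch, using made-up venues and orders rather than the STAC test data, of folding multiple order books into one book that shows total size at each price level.

```python
from collections import defaultdict

def aggregate_books(books: dict[str, list[tuple[float, int]]]) -> dict[float, int]:
    """books maps venue -> [(price, size), ...]; returns price level -> total size."""
    levels: dict[float, int] = defaultdict(int)
    for orders in books.values():
        for price, size in orders:
            levels[price] += size
    return dict(sorted(levels.items(), reverse=True))  # best (highest) bid first

bids = {
    "NYSE":   [(25.10, 400), (25.09, 1_200)],
    "Nasdaq": [(25.10, 600), (25.08, 900)],
}
print(aggregate_books(bids))
# {25.1: 1000, 25.09: 1200, 25.08: 900}
```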

Other factors also favored the selection of Aleri for this project. Jeff Wootton, vice president of product strategy at Aleri, notes that order book information is sent out by exchanges and MTFs using three different message types: a new order, a change to an order and an order cancellation. Aleri's engine instantly recognizes "update" and "delete" operations, whereas other CEP engines treat all incoming messages the same, requiring additional coding logic. Wootton says that "because Aleri does update and delete operations automatically at lower levels of the code, it's a much more efficient system" for any effort aiming to create a virtual order book.
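The distinction Wootton draws can be illustrated with a generic sketch (not Aleri's internal mechanism) of the three message types applied to a per-order store and a price-level aggregate, where a change is an "update" operation and a cancellation is a "delete."

```python
from collections import defaultdict

orders: dict[str, tuple[float, int]] = {}        # order_id -> (price, size)
levels: dict[float, int] = defaultdict(int)      # price level -> total size

def on_new(order_id: str, price: float, size: int) -> None:
    orders[order_id] = (price, size)
    levels[price] += size

def on_change(order_id: str, new_size: int) -> None:
    price, old_size = orders[order_id]           # an "update" operation
    orders[order_id] = (price, new_size)
    levels[price] += new_size - old_size

def on_cancel(order_id: str) -> None:
    price, size = orders.pop(order_id)           # a "delete" operation
    levels[price] -= size
    if levels[price] == 0:
        del levels[price]

on_new("A1", 25.10, 500)
on_new("B7", 25.10, 300)
on_change("A1", 200)
on_cancel("B7")
print(dict(levels))   # {25.1: 200}
```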

Mike Gualtieri, a senior analyst with Forrester Research and co-author of an assessment report of nine CEP platforms, notes Aleri has handled the intricacies of capital markets for nearly 20 years.

In complex event processing, Aleri, Progress Apama and StreamBase were among the early commercial players, but the Aleri CEP architecture in particular can handle the extreme data volumes that occur in fragmented markets or in algorithmic trading. Gualtieri also cited the fact that Aleri provides users with an SQL-based query language that is familiar to many developers, as well as a scripting language, called SPLASH, that allows for quick coding of more specialized activities such as initiating a trade or guiding smart order routing. "Some CEP platforms just detect the pattern, but Aleri can also act upon the detection of a pattern in a very rich way," Gualtieri said.
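That "detect a pattern, then act" idea can be sketched generically; the example below is plain Python, not SPLASH, and the threshold, venue names and routing callback are hypothetical.

```python
from typing import Callable

SIZE_THRESHOLD = 10_000  # assumed trigger level for the illustration

def watch_liquidity(events, route_order: Callable[[str, float, int], None]) -> None:
    """events yields (venue, best_ask, ask_size) tuples from a market data stream."""
    for venue, best_ask, ask_size in events:
        if ask_size >= SIZE_THRESHOLD:               # the detected pattern
            route_order(venue, best_ask, ask_size)   # the action on detection

def demo_router(venue: str, price: float, size: int) -> None:
    print(f"route buy order toward {venue}: {size} @ {price}")

watch_liquidity(
    [("Chi-X", 12.34, 4_000), ("Turquoise", 12.33, 15_000)],
    demo_router,
)
# route buy order toward Turquoise: 15000 @ 12.33
```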

In deploying Aleri's MLA engine for its infrastructure revamp in Europe, ITG's initial goal has been to consolidate basic market data, known as Level 1 data: bid price and size, ask price and size, and last price and size. "Level 1 consolidation is actually a more complex task from a logic perspective than Level 2 consolidation," explains Wootton, "because you have more data fields and have to have logic to handle things like trade conditions and auctions."

ITG, working with Aleri, accomplished its goal of Level 1 data consolidation by employing numerous feed handlers to manage incoming market data from various exchanges and then sending information back out through a quote distributor. That distributor provides consolidated price and size data to various ITG applications. Aleri's liquidity engine sits between the feed handlers and the quote distributor, to capture, aggregate and normalize the consolidated stream of data.

The engine can identify where the best prices are and where a market is moving, and it can even filter out quotes in real-time when auctions are underway and the bid and offer prices cited don't represent actual executable orders.
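A rough wiring sketch of that flow, with assumed names rather than ITG's actual components: feed handlers yield per-venue quote updates, an engine stage drops auction-phase quotes and keeps the latest quote per venue, and a distributor callback publishes each consolidated snapshot downstream.

```python
from typing import Callable, Dict, Iterable

def liquidity_engine(updates: Iterable[dict],
                     distribute: Callable[[Dict[str, dict]], None]) -> None:
    """Each update looks like {"venue": ..., "bid": ..., "ask": ..., "in_auction": ...}."""
    latest: Dict[str, dict] = {}                  # most recent tradable quote per venue
    for q in updates:
        if q.get("in_auction"):                   # drop non-executable auction quotes
            latest.pop(q["venue"], None)
            continue
        latest[q["venue"]] = q
        distribute(dict(latest))                  # hand the snapshot to the distributor

liquidity_engine(
    [{"venue": "LSE", "bid": 431.20, "ask": 431.45, "in_auction": True},
     {"venue": "Chi-X", "bid": 431.25, "ask": 431.40}],
    distribute=print,                             # stand-in for ITG's quote distributor
)
```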

By next year, Meitz says, the firm plans to use Aleri's engine to consolidate Level 2 data as well: the highest bid prices, the number of contracts available at each of those bid prices, the lowest ask prices and the number of contracts available at each of those ask prices. ITG would then have a virtual, consolidated order book, known as "depth of market."

A consolidated book and a consolidated quote, in one fell swoop.

This article can also be found at SecuritiesIndustry.com.
