When 500 Milliseconds Is Too Long
Traditional market data consolidators were once a necessity for financial services firms that wanted real-time data and news from information providers. But the volume of available data has grown exponentially since those vendors started peddling their wares in the 1970s, and they now have to "throttle" the incoming stream so it can fit down their electronic pipelines and be read by the human eye. That throttling creates latency, a particular problem for firms feeding real-time data into automated trading systems.

To get around the problem, many larger investment banks and hedge funds have established direct feeds from the relevant exchanges to receive the full breadth of data on a timely basis. That approach creates problems of its own: the data has to be managed, often by dedicated teams of people. As a result, new software solutions are emerging to help firms receive and manage multiple real-time direct feeds.

Today, data is less the domain of human "eyeball users" and more about the algorithmic trading done by major financial institutions, said Daniel Reinmund, VP of operations at InfoDyne Corp., which provides solutions including a ticker plant for data and object-based technology. "With today's high rates of updating, some exchanges peak at 20,000 updates per second. Humans can't process that amount of information," said Reinmund. But electronic trading systems and applications can, so firms are turning to direct exchange feeds.

The real issue is the delays at market open and close, said Frank Guerrera, COO at HyperFeed Technologies, which provides software and data management solutions. "The increase in messages per second over the past few years has caused a latency issue of 500-750 milliseconds on open and close, with some delays mounting to seconds."
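To see why throttling trades completeness for bandwidth, consider conflation, a common way such throttling is implemented: rather than forwarding every tick, a publisher keeps only the newest quote per symbol and flushes a snapshot on a fixed timer. The sketch below is a minimal illustration, not any vendor's actual design; the class name, the callback interface, and the 500 ms flush interval are all assumptions chosen to mirror the latency figures in this article.

```python
import threading
import time

class ConflatingPublisher:
    """Illustrative conflation: drop intermediate ticks, forward only the
    latest quote per symbol once per flush interval."""

    def __init__(self, publish, interval_s=0.5):
        self._publish = publish      # downstream callback(symbol, quote)
        self._latest = {}            # symbol -> most recent unsent quote
        self._lock = threading.Lock()
        self._interval = interval_s

    def on_tick(self, symbol, quote):
        # Overwrite any quote not yet flushed; earlier ticks are lost.
        with self._lock:
            self._latest[symbol] = quote

    def run(self):
        # Flush the conflated snapshot once per interval, forever.
        while True:
            time.sleep(self._interval)
            with self._lock:
                snapshot, self._latest = self._latest, {}
            for symbol, quote in snapshot.items():
                self._publish(symbol, quote)


if __name__ == "__main__":
    pub = ConflatingPublisher(publish=lambda s, q: print(s, q))
    threading.Thread(target=pub.run, daemon=True).start()
    pub.on_tick("IBM", {"bid": 100.01, "ask": 100.03})
    time.sleep(1.0)  # give the flusher one cycle to publish
```

A human watching a screen never notices the dropped ticks, but an algorithmic trading system sees every flush interval as added latency, which is exactly why such firms route around consolidators with direct feeds.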
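The 500-750 millisecond delays Guerrera describes are measurable by comparing the timestamp an exchange stamps on a message against the local receipt time. A minimal sketch, assuming each message carries a Unix-epoch `exchange_ts` field (an assumption for illustration; actual message formats vary by feed):

```python
import time

LATENCY_BUDGET_MS = 500  # the low end of the open/close delays cited above

def latency_ms(exchange_ts_s: float) -> float:
    """Milliseconds between the exchange's send time and receipt here."""
    return (time.time() - exchange_ts_s) * 1000.0

def on_message(msg: dict) -> None:
    # msg["exchange_ts"] is a hypothetical field: the Unix timestamp the
    # exchange applied when it sent the message.
    lag = latency_ms(msg["exchange_ts"])
    if lag > LATENCY_BUDGET_MS:
        print(f"feed lagging by {lag:.0f} ms; stale for automated trading")
```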