Wall Street's Quest To Process Data At The Speed Of Light
Financial firms use physical proximity to overcome the technical barriers of data latency
When the New York Stock Exchange celebrated its Euronext NV merger at the closing bell on April 4, the trading floor erupted in boos and jeers--and the ringing of cowbells.
The cowbells came from NYSE officials hoping to drown out the catcalls of traders, who see the merger as yet another threat to their livelihoods. With the spread of computerized trading--electronic trading now makes up 60% to 70% of the daily volume on the NYSE and algorithmic trading close to half of that--the manic floor traders look to be headed the way of the exchange's in-house barber (long gone). In the last year, the number of traders on the NYSE has dropped by a quarter.
Executing complex strategies based on arcane mathematical formulas, algorithmic trading systems generate thousands of buy and sell orders every second, many of which are canceled and overridden by subsequent orders, sometimes only a few seconds apart. The goal of these computer traders is to profit from minute, fleeting price anomalies and to mask their intentions via "time-slicing," or carving huge orders into smaller batches so as not to move the market.
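The time-slicing idea described above can be sketched in a few lines: a large parent order is carved into smaller child orders released over time so no single order moves the market. This is a minimal illustration, not any firm's actual algorithm; the function name and slice counts are hypothetical.

```python
def time_slice(total_shares: int, num_slices: int) -> list[int]:
    """Carve a parent order into roughly equal child orders.

    The last slice absorbs any remainder so the share total is preserved.
    In practice each child order would be released on a schedule (or
    randomized) rather than all at once.
    """
    if num_slices < 1 or total_shares < num_slices:
        raise ValueError("need at least one share per slice")
    base = total_shares // num_slices
    slices = [base] * num_slices
    slices[-1] += total_shares - base * num_slices  # remainder to last slice
    return slices

# A hypothetical 100,000-share parent order split into 8 child orders:
print(time_slice(100_000, 8))  # eight orders of 12,500 shares each
```

Real implementations layer on volume-weighting, randomized timing, and cancel/replace logic, but the core remains the same: many small orders in place of one large, market-moving one.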
A 1-millisecond advantage in trading applications can be worth $100 million a year to a major brokerage firm, by one estimate. The fastest systems, running from traders' desks to exchange data centers, can execute transactions in a few milliseconds--so fast, in fact, that the physical distance between two computers processing a transaction becomes a measurable source of delay. This problem is called data latency--delays measured in fractions of a second. To overcome it, many high-frequency algorithmic traders are moving their systems as close to the Wall Street exchanges as possible.
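The physics behind that calculation is simple enough to work out: a signal in optical fiber travels at roughly two-thirds the speed of light in a vacuum, so every kilometer between a trader's server and the exchange adds propagation delay that no software can remove. The sketch below makes the back-of-the-envelope case for co-location; the two-thirds factor is a common rule of thumb and the distance figure is an assumed round number.

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second, in a vacuum
FIBER_FACTOR = 2 / 3          # rough rule of thumb for signal speed in optical fiber

def one_way_latency_ms(distance_km: float) -> float:
    """Best-case one-way propagation delay over optical fiber, in milliseconds."""
    signal_speed = SPEED_OF_LIGHT * FIBER_FACTOR  # ~200,000 km/s
    return distance_km * 1_000 / signal_speed * 1_000

# An assumed 1,000 km fiber run adds about 5 ms each way -- an eternity
# when a 1 ms edge is worth $100 million a year:
print(round(one_way_latency_ms(1_000), 2))   # ~5.0 ms
print(round(one_way_latency_ms(1), 4))       # a co-located server 1 km away: ~0.005 ms
```

Round trips double those numbers, and switching and processing add more on top, which is why shaving distance to near zero via co-location is the only way to get propagation delay out of the equation.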
Trying to stay ahead of electronic upstarts, the NYSE's Rubinow plots an all-automated future
Wall Street's quest for speed is not only putting floor traders out of work but also opening up space for new alternative exchanges and electronic communications networks that compete with the established stock markets. Electronic trading has reduced overall volatility in the equities markets, because volatility is a product of herd buying or selling, and electronic trading--responding instantaneously to tiny price fluctuations--tends to smooth out such mass behavior. And it has provided established exchanges with new revenue opportunities, such as co-location services for companies that wish to place their servers in direct physical proximity to the exchanges' systems. Electronic trading also has created opportunities for a new class of vendors--execution services firms and systems integrators promising the fastest possible transaction times.
At its most abstract level, the data-latency race represents the spear point of the global movement to eradicate barriers--geographic, technical, psychological--to fair and transparent markets. "Any fair market is going to select the best price from the buyer or seller who gets their order in there first," says Alistair Brown, founder of Lime Brokerage, one of the new-school broker-dealers, which uses customized Linux servers to trade some 200 million shares a day. "At that point, speed definitely becomes an issue. If everyone has access to the same information, when the market moves, you want to be first. The people who are too slow are going to get left behind."
That reality extends beyond Wall Street. While no other industry depends so much on split-second response times and high throughput between many different agents, the need for speed is growing in other sectors. Rich-media companies (those involved in video postproduction, digital animation, broadcasting, and Web 2.0), oil and gas producers, big retail chains, research institutions--they're all finding that rapid access to data is increasingly a competitive differentiator.