Wall Street's Quest To Process Data At The Speed Of Light
Financial firms use physical proximity to overcome the technical barriers of data latency
When the New York Stock Exchange celebrated its Euronext NV merger at the closing bell on April 4, the trading floor erupted in boos and jeers--and the ringing of cowbells.
The cowbells came from NYSE officials hoping to drown out the catcalls of traders, who see the merger as yet another threat to their livelihoods. With the spread of computerized trading--electronic trading now makes up 60% to 70% of the daily volume on the NYSE and algorithmic trading close to half of that--the manic floor traders look to be headed the way of the exchange's in-house barber (long gone). In the last year, the number of traders on the NYSE has dropped by a quarter.
Executing complex strategies based on arcane mathematical formulas, algorithmic trading systems generate thousands of buy and sell orders every second, many of which are canceled and overridden by subsequent orders, sometimes only a few seconds apart. The goal of these computer traders is to profit from minute, fleeting price anomalies and to mask their intentions via "time-slicing," or carving huge orders into smaller batches so as not to move the market.
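The "time-slicing" idea described above can be sketched in a few lines. This is an illustrative, TWAP-style splitter under simple assumptions (equal-sized slices, no timing logic), not any firm's actual algorithm; the function name and parameters are hypothetical.

```python
# Minimal sketch of "time-slicing": carving a large parent order into
# smaller child orders so the full size never hits the market at once.
# Illustrative only -- real slicers also randomize size and timing.

def slice_order(total_shares: int, num_slices: int) -> list[int]:
    """Split total_shares into num_slices child orders of near-equal size."""
    base, remainder = divmod(total_shares, num_slices)
    # Spread the remainder across the first `remainder` slices so every
    # share is accounted for and no two slices differ by more than one.
    return [base + (1 if i < remainder else 0) for i in range(num_slices)]

child_orders = slice_order(100_000, 7)
print(child_orders)       # seven child orders summing to 100,000
print(sum(child_orders))  # 100000
```

Each child order is then released over time, so counterparties watching the tape see a stream of small trades rather than one market-moving block.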
A 1-millisecond advantage in trading applications can be worth $100 million a year to a major brokerage firm, by one estimate. The fastest systems, running from traders' desks to exchange data centers, can execute transactions in a few milliseconds--so fast, in fact, that the physical distance between the two computers processing a transaction becomes a measurable source of delay. This problem is called data latency--delays measured in split seconds. To overcome it, many high-frequency algorithmic traders are moving their systems as close to the Wall Street exchanges as possible.
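The physics behind the co-location race is back-of-the-envelope arithmetic. Light in optical fiber travels at roughly two-thirds of its vacuum speed, about 200,000 km/s, or about 5 microseconds per kilometer one way. These are illustrative round numbers; real links add switching and serialization delay on top of propagation.

```python
# Back-of-the-envelope propagation delay: why physical distance matters.
# Assumes signal speed in fiber of ~200,000 km/s (about 2/3 of c).

SPEED_IN_FIBER_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds over fiber."""
    one_way_s = distance_km / SPEED_IN_FIBER_KM_PER_S
    return 2 * one_way_s * 1000

# A server 100 km from the exchange pays about 1 ms per round trip
# before any processing happens; co-locating eliminates most of it.
print(f"{round_trip_ms(100):.2f} ms")  # -> 1.00 ms
```

Against the $100-million-per-millisecond estimate above, even a few dozen kilometers of fiber is a cost worth engineering away.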
[Photo caption: Trying to stay ahead of electronic upstarts, the NYSE's Rubinow plots an all-automated future]
Twiddling the knobs on these fantastically complicated, blindingly fast systems are human traders known as quant jocks (for quantitative trading tactics) and stat arbs (for statistical arbitrageurs). They work for major brokerages such as Credit Suisse and Merrill Lynch, or for little-known, publicity-shy hedge funds such as SAC Capital and Renaissance Technologies (both of which declined to comment for this story). They're competing to shave fractions of a second off transaction times in order to capture fleeting price discrepancies before their rivals do.
Wall Street's quest for speed is not only putting floor traders out of work but also opening up space for new alternative exchanges and electronic communications networks that compete with the established stock markets. Electronic trading has reduced overall volatility in the equities markets, because volatility is a product of herd buying or selling, and electronic trading--responding instantaneously to tiny price fluctuations--tends to smooth out such mass behavior. And it has provided established exchanges with new revenue opportunities, such as co-location services for companies that wish to place their servers in direct physical proximity to the exchanges' systems. Electronic trading also has created opportunities for a new class of vendors--execution services firms and systems integrators promising the fastest possible transaction times.
At its most abstract level, the data-latency race represents the spear point of the global movement to eradicate barriers--geographic, technical, psychological--to fair and transparent markets. "Any fair market is going to select the best price from the buyer or seller who gets their order in there first," says Alistair Brown, founder of Lime Brokerage, one of the new-school broker-dealers, which uses customized Linux servers to trade some 200 million shares a day. "At that point, speed definitely becomes an issue. If everyone has access to the same information, when the market moves, you want to be first. The people who are too slow are going to get left behind."
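The "first order in wins" rule Brown describes is price-time priority: among resting orders at the best price, the earliest arrival fills first. A hypothetical sketch (the class and names are invented for illustration, not any exchange's matching engine):

```python
# Toy illustration of price-time priority on the buy side of a book:
# best price wins; at equal prices, the earliest arrival wins.

import heapq

class BuyBook:
    """Resting buy orders, prioritized by highest price, then arrival order."""
    def __init__(self):
        self._heap = []  # entries: (-price, arrival_seq, order_id)
        self._seq = 0

    def add(self, order_id: str, price: float):
        # Negate price so the max price sorts first in Python's min-heap;
        # the arrival sequence number breaks ties in time priority.
        heapq.heappush(self._heap, (-price, self._seq, order_id))
        self._seq += 1

    def best(self) -> str:
        """Return the order that wins the next fill."""
        return self._heap[0][2]

book = BuyBook()
book.add("slow_trader", 10.00)  # arrived first at $10.00
book.add("fast_trader", 10.00)  # same price, arrived later
book.add("low_bidder", 9.99)
print(book.best())  # -> slow_trader: same price, but first in line
```

Which is exactly why, as Brown says, speed becomes the issue once everyone sees the same information: at the same price, only the queue position differs.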
That reality extends beyond Wall Street. While no other industry depends so much on split-second response times and high throughput between many different agents, the need for speed is growing in other sectors. Rich-media companies (those involved in video postproduction, digital animation, broadcasting, and Web 2.0), oil and gas producers, big retail chains, research institutions--they're all finding that rapid access to data is increasingly a competitive differentiator.