Supercomputers vs. High-Frequency Trading: Analysis Leads to New Regulatory Policy

Staff Reporter
First Posted: Nov 20, 2013 02:20 PM EST

When there’s money to be made, most loopholes have already been explored and exploited – but emerging technologies are changing the game, creating new opportunities for moneymakers and headaches for regulators. To counter this, researchers are now using supercomputers to help regulators determine which policy changes will ensure a fairer, more stable market.

With the advent of high-frequency trading, traders can use superfast computers – essentially supercomputers of their own – to compete in the market, exploiting fleeting price differences for easy profit. For example, if a stock is momentarily priced slightly lower in New York than in London, high-frequency traders can almost instantaneously buy in the cheaper market and sell in the other for a risk-free return.
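The arithmetic behind such cross-market arbitrage is simple enough to sketch. The Python snippet below is a toy illustration only; the prices, share count, and per-share fee are invented for the example and don't reflect any real trading system:

```python
# Toy illustration of the cross-market arbitrage described above.
# All numbers (prices, fees, sizes) are hypothetical.

def arbitrage_profit(sell_price: float, buy_price: float,
                     shares: int, fee_per_share: float = 0.001) -> float:
    """Profit from buying at one venue's price and selling at another's."""
    spread = sell_price - buy_price
    return (spread - 2 * fee_per_share) * shares  # pay fees on both legs

# A stock momentarily at $100.00 in New York and $100.02 in London:
profit = arbitrage_profit(sell_price=100.02, buy_price=100.00, shares=1000)
if profit > 0:
    print(f"Buy in New York, sell in London: ${profit:.2f}")  # $18.00
```

The window for such a trade closes in microseconds, which is why the speed race described below matters.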

Such rapid-fire trading – trades often completed in microseconds, or even nanoseconds – can lead to market instability, and regulators haven’t been able to keep up with the shenanigans of high-frequency traders. The sheer volume of orders makes it significantly harder to identify the root cause of a problem. For example, when the Dow Jones fell nearly 1,000 points in 20 minutes in the ‘Flash Crash’ of May 2010, it took regulators at the US Securities and Exchange Commission five months to analyze the data and figure out what happened. As it turned out, the culprit was flawed automated trading software.

However, sometimes it’s not a software glitch that causes the problem. Sometimes high-frequency traders create market instability on purpose with a practice called ‘quote stuffing’: a trader places an order only to cancel it within 0.001 seconds, solely to cause congestion. The trouble is that it all happens so quickly that it’s hard to prove. That’s where supercomputers can help.
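In data terms, spotting candidates for this behavior comes down to matching each cancellation to its order and flagging vanishingly short lifetimes. The sketch below is a simplified, hypothetical version of that idea – the event fields and the one-millisecond threshold are illustrative assumptions, not the researchers’ actual method:

```python
# Hypothetical sketch: flag orders cancelled within a millisecond of
# placement, the fleeting-order pattern associated with quote stuffing.
from dataclasses import dataclass

@dataclass
class OrderEvent:
    order_id: int
    action: str        # "add" or "cancel"
    timestamp_ns: int  # nanoseconds since midnight

def fleeting_orders(events: list[OrderEvent],
                    threshold_ns: int = 1_000_000) -> list[int]:
    """Return IDs of orders cancelled within threshold_ns of placement."""
    placed: dict[int, int] = {}
    flagged: list[int] = []
    for ev in events:
        if ev.action == "add":
            placed[ev.order_id] = ev.timestamp_ns
        elif ev.action == "cancel" and ev.order_id in placed:
            if ev.timestamp_ns - placed[ev.order_id] <= threshold_ns:
                flagged.append(ev.order_id)
    return flagged
```

A high rate of such flagged orders is, as Ye notes below, evidence consistent with quote stuffing rather than proof of it.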

“It’s very hard to do research on high-frequency trading,” says Mao Ye, assistant professor of finance at the University of Illinois at Urbana-Champaign, US, and one of the collaborators on the project. “There are people doing high-frequency trading research who do not use supercomputers, but there are some constraints. Either it takes a long time, or you can only deal with aggregate-level data or data gathered before nanosecond trading was possible. Because data now increases exponentially, we’re using supercomputers to study high-frequency trading at the nanosecond level.”

The heroes in this story are Blacklight and Gordon, supercomputers at the Pittsburgh Supercomputing Center (PSC) in Pennsylvania, US, and the San Diego Supercomputer Center (SDSC) in California, US, respectively. Both are resources of XSEDE, the Extreme Science and Engineering Discovery Environment. Ye, Chen Yao, and Jiading Gai, all at the University of Illinois, used Blacklight and Gordon to process trade data from NASDAQ, the American stock exchange, to study the effects of ever-increasing trading speeds. They found evidence consistent with quote stuffing, which could lead regulators to pursue policy changes to make the market a fairer place.

“There are two ways to be faster than other people. Either you become faster or you slow down others,” Ye says. “There’s one thing we know for sure. Lots of people generate lots of cancellations, but whether it’s quote stuffing or not deserves further investigation.”

Splitting up trades is another high-frequency trading practice used to make money while evading regulators. The trades and quotes (TAQ) ticker that tracks trades moment by moment only reports trades of 100 shares or more. Some high-frequency traders with hot tips do an end run around the system by submitting several orders, each for fewer than 100 shares. These trades are known as ‘odd lots.’ Because odd lots don’t show up on the TAQ ticker, they trade quietly in the background, skewing perceptions of the market. Odd lots were once less than 1% of the market – so small they were not worth tracking on the ticker – but now they make up nearly 20% of market activity.
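The splitting itself is easy to picture. The sketch below shows the pattern in hypothetical form – the 99-share child size is an arbitrary choice for illustration, not a documented strategy:

```python
# Hypothetical illustration of odd-lot splitting: a large order broken
# into child orders below the 100-share round lot, each of which stayed
# off the TAQ ticker before the reporting change described below.

def split_into_odd_lots(total_shares: int, child_size: int = 99) -> list[int]:
    """Break an order into child orders, each under 100 shares."""
    assert 0 < child_size < 100, "children must be odd lots to stay off the tape"
    full, remainder = divmod(total_shares, child_size)
    return [child_size] * full + ([remainder] if remainder else [])

children = split_into_odd_lots(1000)   # ten 99-share orders plus one 10-share order
assert sum(children) == 1000
assert all(c < 100 for c in children)  # none would have appeared on the ticker
```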

Ye, Yao, and Maureen O’Hara, a professor at Cornell University in Ithaca, New York, US, used Blacklight and Gordon to analyze massive amounts of market data. Looking at two years of trading, they explored how the lack of transparency in odd-lot trades could distort market perceptions. This analysis led to a policy change: as of October 2013, odd-lot trades are reported on the TAQ ticker in an effort to ensure transparency.

SDSC and PSC programmers are working together to optimize the code Ye and his colleagues developed to run market analyses on Blacklight and Gordon. Near real-time analysis of market data is now possible, potentially allowing regulators to keep up with the superfast market movements brought about by high-frequency trading. -- by Amanda Aubuchon, © iSGTW
