High-Frequency Trading – Blink an Eye and They Are a Thousand Steps Ahead
On 13 September 2012, speaking at a regulatory conference of the European Commission, Ewald Nowotny, head of the Austrian National Bank and a prominent policymaker for the European Central Bank (ECB), proposed imposing clear bans on a number of controversial financial market activities, including high-frequency trading (HFT):
“… with high-frequency trading there is nothing to be regulated, it is to be banned. There is no really demonstrable net advantage from this [form of trading],”
His remarks follow a broader backlash against high-frequency trading, which many manual traders perceive as unfair, and which is blamed for causing sudden market crashes and creating an illusion of market liquidity.
Several influential members of the European Parliament (MEPs) have expressed determination to clamp down on the use of the sophisticated computer algorithms, which are exploited to generate profits through massive numbers of high-speed trades.
What is high-frequency trading?
The foundation of HFT is a single concept: speed. It uses the enormous capacity of supercomputers to perform trades in a matter of microseconds (one microsecond is one-millionth of a second). These programs can hold short-term positions in any financial instrument that is traded electronically; a few instruments, such as credit default swaps (CDS), are not traded electronically and are therefore incompatible with investment algorithms, but the majority, from forex to equities, are. High-frequency or “algo” trading aims to capture tiny gains, sometimes just fractions of a penny, from short-term market distortions. These tiny gains accumulate as the number of trades grows, and by the end of the day they can amount to significant returns.
How does high-frequency trading work?
There are a number of strategies that employ algos to gain an advantage in the financial markets. Imagine an institutional investor who wants to buy 100,000 shares of XYZ, currently priced at £14.50 per share. Because the amount is substantial, the investor wouldn’t want to simply storm the market with one large order; that would be the equivalent of a poker player going all-in while showing everyone his cards. Instead, the institutional investor creates an algorithm that splits the 100,000 shares into a thousand buy orders of 100 shares each and sets a maximum buying price of £14.70 per share. The algorithm starts dishing out the packets, at £14.50 or £14.60 but always below its £14.70 limit. Another HFT, a so-called automated market maker, sees the first trade of 100 shares at £14.50 and a subsequent trade at £14.60, and recognizes that a large investor is buying in bulk. Within fractions of a second, the HFT starts pumping out small orders to work out the large buyer’s limit price. It might probe at £15.00, which gets rejected and is cancelled immediately, then pitch another at £14.80, cancelling again, until an order at £14.70 executes. At that point the automated market maker has found the limit price; it buys up all the available shares of XYZ and sells them to the institutional investor at £14.70 per share. Traders not using HFT are left out of the action, as they cannot possibly match the speed of algo trading.
In the aftermath, the institutional investor has succeeded in buying his 100,000 shares at £14.70. He was willing to pay that price, but he could have gotten better offers had it not been for the HFT: the automated market maker forced the price up to the limit set by the investor’s algorithm.
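The probing behaviour described above can be sketched in a few lines of Python. Everything here is a toy model: the prices, the £0.10 probe step, and the function names are illustrative assumptions, not a real trading system.

```python
# Toy model of the limit-price probe described above.
# All prices, step sizes and names are illustrative assumptions.

def buyer_accepts(offer_price, limit=14.70):
    """The institutional algo fills any offer at or below its hidden limit."""
    return offer_price <= limit

def probe_limit(start=15.00, step=0.10, floor=14.00):
    """The automated market maker pitches small orders downward, cancelling
    each unfilled probe, until one executes and reveals the buyer's limit."""
    price = start
    while price >= floor:
        price = round(price, 2)       # keep prices at whole pence
        if buyer_accepts(price):
            return price              # first fill exposes the limit price
        price -= step                 # cancel the probe and re-quote lower
    return None                       # no fill found above the floor

print(probe_limit())  # 14.7 -- the hidden limit, discovered in a few probes
```

The key point the sketch illustrates is that each unfilled probe costs the market maker nothing, because it is cancelled before anyone can trade against it; only the final, filled order commits any capital.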
Why care about HFTs?
Because, according to recent studies, trading by “real investors” in 2012 took up the smallest share of US stock market volumes in over a decade, while high-frequency trading accounted for up to 70 per cent. This means the market participants who make conscious trading decisions based on actual information about the assets they are buying and selling are in fact a small minority, overwhelmed by algos. This leads to markets drifting away from fundamentals and becoming more volatile and less efficient.
The “flash crash” of 2010, Knight Capital’s trading debacle, and the BATS and Facebook IPOs all proved that glitches in high-frequency trading systems are a reality and can seriously damage markets.
“One of the things that we have focused on . . . for the last five years is the extent on which the global financial markets are now essentially a single, planetary-wide, ultra-large scale complex IT system . . . The 6th May Flash Crash was the first real sign that actually our concern was justified, that events could happen at an unprecedented scale, in terms of the magnitude of the drop and the speed at which it happened.”
said Dave Cliff, a computer science professor at the University of Bristol, in an interview with HFT Review.
According to the data research company Nanex, HFT usage gives rise to a number of malicious practices that are extremely difficult for regulatory authorities to investigate. One of these practices, known as quote stuffing, involves “slowing down” competitors’ computers or networks by sending a higher-than-usual number of orders at crucial times. Another is to feign interest in a stock by rapidly entering orders and changing the quoted price, all the while cancelling those orders before anyone can execute anything against them.
With trades happening at microsecond timescales, monitoring becomes extremely complicated, which makes HFT not simply non-transparent but almost invisible. Nanex gives a couple of examples of what it means to examine data from 1,000 symbols quoting once per microsecond. Ten seconds of market data would fill a one-terabyte hard disk, meaning around 2,000 such disks would be needed to store one day of trading. Just to receive the pricing information, a terabit network would be required, which is currently considered leading-edge technology. And a human lifespan would be needed for the Securities and Exchange Commission (SEC) to investigate an hour’s worth of data. It comes as no surprise that, to this day, no one is exactly sure what caused the 2010 Flash Crash.
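Nanex’s arithmetic is easy to verify. A quick back-of-envelope sketch, assuming a quote record of roughly 100 bytes and a roughly 5.5-hour quoting day (both figures are our assumptions, chosen to match the numbers quoted above):

```python
# Back-of-envelope check of the Nanex figures quoted above.
# The 100-byte record size and 5.5-hour day are assumptions.

symbols = 1_000
quotes_per_symbol_per_sec = 1_000_000    # 1 quote per microsecond
bytes_per_quote = 100                    # assumed record size

bytes_per_sec = symbols * quotes_per_symbol_per_sec * bytes_per_quote
bits_per_sec = bytes_per_sec * 8         # 0.8 Tbit/s -> needs a terabit network

ten_seconds_tb = bytes_per_sec * 10 / 1e12
print(ten_seconds_tb)                    # 1.0 -- one terabyte per ten seconds

trading_day_secs = 5.5 * 3600            # assumed ~5.5-hour quoting day
disks_per_day = bytes_per_sec * trading_day_secs / 1e12
print(round(disks_per_day))              # 1980 -- roughly 2,000 1 TB disks/day
```

The same quote rate also explains the bandwidth claim: 10^11 bytes per second is 0.8 terabits per second, right at the edge of what a single network link can carry.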
High-frequency trading is also considered to harm long-term investors in a number of ways:
- Misleading price quotes mess with price discovery, a core function of a stock exchange
- Investors are discouraged from using market and stop orders because HFTs can and will suddenly withdraw their quotes
- Much wider quote spreads during market open cause many micro flash crashes in individual stocks
- Some business schools, such as Georgetown, can no longer afford to buy Trade and Quote (TAQ) data for their students or professors to analyze; the data has become too expensive because of all the excess quotes generated by HFTs.
- During the flash crash there was so much spam from HFT-generated quotes that it took the SEC five months to assemble the data. Excessive quotes also overloaded data feeds, causing severe delays; on some exchanges, stock quotes lagged by more than 30 seconds.
There is a way for institutional investors and large traders to escape the predatory nature of HFTs: the sinisterly named “dark pools”. These over-the-counter systems sit apart from the exchanges and give their clients the opportunity to trade completely anonymously; they reveal neither the trader’s identity, nor the size of the trades, nor even whether the orders are buys or sells. Investors can transfer large blocks of shares without fear that the market will move against them or that a pack of HFTs will jump in and chew their orders up. An institutional investor who wants to buy 100,000 shares of XYZ no longer needs an algorithm to do the work; he simply goes to a dark pool and releases the buy order, which will be matched only if there is a corresponding sell order. The only issue is that these alternative trading platforms are not easily accessible to retail investors, and the brokers who do have access charge hefty fees.
There are many who believe HFTs are beneficial to the markets and that their practices are malicious only when traders want them to be. According to the algo lobbyists, HFTs add liquidity and reduce spreads in the markets. They also supposedly aid price discovery and lower transaction costs for all market participants.
Jim Overdahl, vice-president in the National Economic Research Associates’ (NERA’s) Securities and Finance Practice, argues that HFTs contribute to the overall quality of the markets by strengthening their informational linkages. High-frequency trading quickly locates any price discrepancies across related products and markets and trades on them, restoring price synchronization. Mr Overdahl claims that despite being used primarily by professional traders, HFTs benefit all market participants, including average investors. The intense competition between HFTs reduces transaction costs, leaving the regular investor with less money taken out of his pocket to cover those costs and with higher investment returns. According to the Vanguard Group, a mutual fund manager which does not employ HFT techniques, high-frequency trading has resulted in significant cost savings for long-term mutual fund investors.
A different touch
Kevin Slavin, co-founder of the game development company Area/Code, draws a striking parallel between the algos running wild on the stock markets and the ones bidding on items on Amazon.com. He presents the case of Peter Lawrence’s book The Making of a Fly, a classic work in developmental biology that is currently out of print. On Amazon.com the book seemed to be so rare that it was listed for $1,730,045.91 (+$3.99 shipping). The next day, both copies had gone up to $2.8 million. A week later, the book reached the staggering price of $23,698,655.93 (plus $3.99 shipping).
What happened there? Michael Eisen, an evolutionary biologist at UC Berkeley and the person who stumbled upon the book on Amazon, attempted to explain the phenomenon. According to him, two competing sellers, “profnath” and “bordeebook”, were setting the prices on their own copies of the book. The first time the biologist noticed the prices, bordeebook’s was $400,000 higher than profnath’s. The next day, the prices were within $5,000 of each other. Mr Eisen got curious and began following the page closely. A day later he spotted a pattern: first bordeebook’s price would be 1.270589 times profnath’s, and shortly afterwards profnath’s price would be 0.99830 of bordeebook’s. Apparently, the two Amazon retailers had employed pricing algorithms to make sure their prices always stayed either just above or just below the competitor’s, depending on their preference. Bordeebook, a large seller with a high rating on Amazon, was confident a potential customer would choose it anyway, so its algorithm priced its copy at 1.270589 times the other sellers’. Profnath, a less popular retailer, decided to rely on a lower price, so its algorithm undercut the competition at 0.99830 of the competitor’s price. The two programs clashed and went into a never-ending loop; since each full round of repricing multiplied the price by roughly 1.27, they pushed The Making of a Fly to the absurd $23 million.
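The feedback loop is easy to replay. A minimal sketch, assuming hypothetical starting prices of $50 and $60 (the real starting prices are unknown) and the two multipliers Eisen observed:

```python
# Toy replay of the two Amazon repricing bots described above.
# Starting prices are assumptions; the multipliers are Eisen's observations.

BORDEEBOOK_MARKUP = 1.270589   # bordeebook prices itself above the rival
PROFNATH_UNDERCUT = 0.99830    # profnath prices itself just below the rival

TARGET = 23_698_655.93              # the price the listing eventually reached
profnath, bordeebook = 50.0, 60.0   # assumed starting prices in dollars

cycles = 0
while profnath < TARGET:
    bordeebook = profnath * BORDEEBOOK_MARKUP   # reprice above the competitor
    profnath = bordeebook * PROFNATH_UNDERCUT   # undercut the competitor
    cycles += 1

# The combined factor per cycle exceeds 1, so the prices compound upward:
print(round(BORDEEBOOK_MARKUP * PROFNATH_UNDERCUT, 4))  # 1.2684
print(cycles)  # 55 -- repricing cycles from $50 to $23.7 million
```

Because the product of the two multipliers, 1.270589 × 0.99830 ≈ 1.2684, is greater than one, the pair of bots compounds the price upward without bound; had the product been below one, the same loop would have driven the price toward zero instead.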
This absolute fiasco shows what can happen when there is no sanity check and no human oversight. It is true that the algos used on stock exchanges are much more sophisticated, but they are also much faster, meaning they can spiral out of control in fractions of a second.
Another controversial topic raised by Mr Slavin is co-location. To be the fastest, HFT servers need to be located as close as possible to the source of market information. Entire building floors are literally emptied out and filled with steel just so that companies can gain a five-microsecond advantage over others. Fiber-optic cables are laid across the world to accommodate these speed races between the HFTs.
“We’re running through the United States with dynamite and rock saws so that an algorithm can close the deal three microseconds faster, all for a communications framework that no human will ever know; that’s a kind of manifest destiny.”
As of 2012, financial services firms are hitting the physical limits of information transmission through the existing fiber-optic routes. But new opportunities are being explored: in November 2010, Alex Wissner-Gross, a fellow at the Harvard Institute for Applied Computational Science, and Cameron Freer, a research affiliate at the Massachusetts Institute of Technology, published a paper exploring the optimal geographical locations for trading servers around the world, including several hotspots in the middle of the Atlantic and Pacific oceans. Building there will be extremely challenging and absurdly expensive, yet companies will eventually do it.
This summer, the first-ever trans-Arctic submarine fiber-optic cable was laid, at a cost of around $1.5 billion; it will cut the time a packet of data takes to travel from London to Tokyo from 230 to 170 milliseconds. This will supercharge some existing algos, but the question remains: was there a more socially useful way to spend this capital?
Hate them or love them, HFTs are dominating. It is the responsibility of regulators to spend more time and money on understanding the algos, and to impose the restrictions necessary to guarantee fair trading on the markets.