An automated trading system (ATS), a subset of algorithmic trading, uses a computer program to create buy and sell orders and automatically submit them to a market center or exchange. The program generates orders based on a predefined set of rules, using a trading strategy that may be based on technical analysis, advanced statistical and mathematical computations, or input from other electronic sources.
Automated trading systems are often used with electronic trading in automated market centers, including electronic communication networks, “dark pools”, and automated exchanges. Automated trading systems and electronic trading platforms can execute repetitive tasks at speeds orders of magnitude greater than any human equivalent. Traditional risk controls and safeguards that relied on human judgment are not appropriate for automated trading, and this has caused issues such as the 2010 Flash Crash. New controls such as trading curbs or “circuit breakers” have been put in place in some electronic markets to deal with automated trading systems.
The automated trading system determines whether an order should be submitted based on, for example, the current market price of an option and theoretical buy and sell prices. The theoretical buy and sell prices are derived from, among other things, the current market price of the security underlying the option. A look-up table stores a range of theoretical buy and sell prices for a given range of current market price of the underlying security. Accordingly, as the price of the underlying security changes, a new theoretical price may be indexed in the look-up table, thereby avoiding calculations that would otherwise slow automated trading decisions. A distributed processing on-line automated trading system uses structured messages to represent each stage in the negotiation between a market maker (quoter) and a potential buyer or seller (requestor).
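As a sketch of the look-up-table mechanism described above, the following Python fragment precomputes theoretical buy and sell prices for a range of underlying prices, so that a price update becomes a constant-time index rather than a fresh pricing computation. The function names, the linear stand-in for a real option-pricing model, and the fixed edge are illustrative assumptions, not a description of any actual system.

```python
def build_lookup(lo_cents, hi_cents, edge_cents=5):
    """Precompute (theoretical buy, theoretical sell) in cents for every
    underlying price in [lo_cents, hi_cents]."""
    table = {}
    for underlying in range(lo_cents, hi_cents + 1):
        theo = underlying // 2                  # stand-in for a real option-pricing model
        table[underlying] = (theo - edge_cents, theo + edge_cents)
    return table

def decide(table, underlying_cents, market_bid, market_ask):
    """Return 'BUY', 'SELL', or None by comparing market quotes to the
    precomputed theoretical prices."""
    theo_buy, theo_sell = table[underlying_cents]
    if market_ask <= theo_buy:
        return "BUY"                            # option offered below our theoretical buy
    if market_bid >= theo_sell:
        return "SELL"                           # bid above our theoretical sell
    return None

table = build_lookup(10_000, 10_100)
print(decide(table, 10_050, market_bid=5_100, market_ask=5_150))  # SELL
```

When the underlying price ticks, the system indexes the new row of the table instead of re-running the pricing model, which is exactly the latency saving the text describes.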
Advantages of Automated Trading System
Minimizes Emotion 
As orders are processed automatically once the pre-set rules are satisfied, emotional mistakes are minimized. It also helps traders to stay disciplined when the market is highly volatile.
Ability to Backtest 
Before actually deploying an automated trading system or its underlying algorithm, traders can evaluate their rules using historical data. This allows them to minimize potential mistakes and estimate expected returns.
Achieves Consistency 
As orders are processed only when the pre-set rules are satisfied and traders only trade by plan, it helps the traders achieve consistency.
Improved Order Entry Speed 
As computers process orders as soon as the pre-set rules are met, automated systems achieve a much higher order entry speed, which is extremely beneficial in current markets where conditions can change very rapidly.
Diversifies Trading 
Automated trading systems allow users to trade in multiple accounts simultaneously, diversifying their portfolios. Diversification minimizes risk by spreading it over various instruments.
Disadvantages of Automated Trading System
Mechanical Failures 
Even if the underlying algorithm is capable of performing well in the live market, a mechanical problem as simple as an internet connection malfunction could lead to a failure.
Monitoring 
Although the computer is processing the orders, it still needs to be monitored because it is susceptible to technology failures such as those described above.
Over-Optimization 
An algorithm that performs very well in backtesting could end up performing very poorly in the live market. Good backtesting performance can lead to overly optimistic expectations from traders, which in turn can lead to big failures.
Trend following (7)
“The most common strategy which is implemented by following the trend in moving averages, channel breakouts, price level movements, and related technical indicators”. (7)
For example, a trend-following strategy might use a simple moving average of the most recent n prices:

SMA = (P_M + P_{M−1} + ⋯ + P_{M−(n−1)}) / n
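A minimal sketch of a moving-average trend-following rule; the window lengths and the +1/−1 signal convention are illustrative assumptions:

```python
def sma(prices, n):
    """Simple moving average of the last n prices."""
    return sum(prices[-n:]) / n

def crossover_signal(prices, fast=3, slow=5):
    """Return +1 (long) when the fast SMA is above the slow SMA,
    -1 when it is not, and 0 when there is insufficient data."""
    if len(prices) < slow:
        return 0
    return 1 if sma(prices, fast) > sma(prices, slow) else -1

print(crossover_signal([100, 101, 102, 104, 107, 110]))   # 1: uptrend
print(crossover_signal([110, 107, 104, 102, 101]))        # -1: downtrend
```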
Volume-weighted average price (7)
“Volume weighted average price strategy breaks up a large order and releases dynamically determined smaller chunks of the order to the market using stock-specific historical volume profiles.” (7)
VWAP is calculated using the following formula:

VWAP = Σ_j (P_j × Q_j) / Σ_j Q_j

where P_j is the price and Q_j is the volume of trade j.
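In code, a minimal VWAP computation; the trade data here is illustrative:

```python
def vwap(trades):
    """Volume-weighted average price of (price, volume) pairs."""
    notional = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return notional / total_volume

trades = [(100.0, 200), (101.0, 300), (99.5, 500)]
print(round(vwap(trades), 2))   # 100.05
```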
Mean reversion (finance) (7)
This strategy is based on the idea that the values/prices of assets will revert to their mean prices/values.
“A continuous mean-reverting time series can be represented by an Ornstein–Uhlenbeck stochastic differential equation:

dx_t = θ(μ − x_t) dt + σ dW_t

where θ is the rate of reversion to the mean, μ is the mean value of the process, σ is the variance of the process, and W_t is a Wiener process or Brownian motion”.
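A simple way to see mean reversion in action is to simulate the Ornstein–Uhlenbeck equation with an Euler–Maruyama step; all parameter values below are illustrative:

```python
import random

def simulate_ou(x0, theta, mu, sigma, dt, steps, seed=42):
    """Euler–Maruyama path of dx = theta*(mu - x)*dt + sigma*dW."""
    random.seed(seed)
    x, path = x0, [x0]
    for _ in range(steps):
        dw = random.gauss(0.0, dt ** 0.5)   # Wiener increment ~ N(0, dt)
        x += theta * (mu - x) * dt + sigma * dw
        path.append(x)
    return path

path = simulate_ou(x0=10.0, theta=2.0, mu=5.0, sigma=0.5, dt=0.01, steps=1000)
print(round(path[-1], 2))   # wanders near mu = 5 rather than drifting away
```

Started well above the mean, the path is pulled back toward μ at rate θ, which is the behavior a mean-reversion strategy bets on.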
The concept of the automated trading system was first introduced by Richard Donchian in 1949, when he used a set of rules to buy and sell funds. Then, in the 1980s, the concept of rule-based trading became more popular when famous traders like John Henry began to use such strategies. In the mid-1990s, some models became available for purchase, and improvements in technology increased accessibility for retail investors. Early forms of automated trading systems, software based on algorithms, were used by financial managers and brokers to automatically manage clients’ portfolios. The first such service offered to the open market without any supervision was Betterment, launched by Jon Stein in 2008. Since then, automated trading systems have continued to improve with developments in the IT industry, and they now manage huge assets all around the globe. In 2014, more than 75 percent of the stock shares traded on United States exchanges (including the New York Stock Exchange and NASDAQ) originated from automated trading system orders.
How it Works
An automated trading system can be based on a predefined set of rules which determine when to enter an order, when to exit a position, and how much money to invest in each trading product. Trading strategies differ: some are designed to pick market tops and bottoms, others follow a trend, and others involve complex strategies such as randomizing orders to make them less visible in the marketplace. ATSs allow a trader to execute orders much more quickly and to manage their portfolio easily by automatically generating protective precautions.
Backtesting of a trading system involves running the program against historical market data to determine whether the underlying algorithm produces the expected results. Backtesting software enables a trading system designer to develop and test strategies on historical data and to optimize the results obtained. Although backtesting cannot accurately predict future results, a system can be backtested against historical prices to see how it would have performed, theoretically, had it been active in a past market environment.
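A toy backtest loop in the spirit described above, replaying a price history through a strategy function and tallying hypothetical profit and loss; the prices and the buy-and-hold "strategy" are illustrative:

```python
def backtest(prices, signal_fn):
    """Replay a price history: at each step, take the position given by
    signal_fn on the data seen so far and accrue the next price change."""
    pnl = 0.0
    for i in range(1, len(prices)):
        position = signal_fn(prices[:i])    # decide using data up to i-1 only
        pnl += position * (prices[i] - prices[i - 1])
    return pnl

always_long = lambda history: 1             # trivial buy-and-hold "strategy"
print(backtest([100, 102, 101, 105], always_long))   # 5.0
```

Note that the signal function only sees prices up to the previous step; feeding it the current price would be look-ahead bias, one of the mistakes backtesting is meant to catch.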
Forward testing of an algorithm can also be achieved using simulated trading with real-time market data to help confirm the effectiveness of the trading strategy in the current market. It may be used to reveal issues inherent in the computer code.
Live testing is the final stage of the development cycle. In this stage, live performance is compared against the backtested and walk forward results. Metrics compared include Percent Profitable, Profit Factor, Maximum Drawdown and Average Gain per Trade. The goal of an automated trading system is to meet or exceed the backtested performance with a high efficiency rating.
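The metrics named above can be computed from a list of per-trade results; the trade values here are illustrative:

```python
def percent_profitable(trades):
    """Share of trades with a positive result, as a percentage."""
    return 100.0 * sum(1 for t in trades if t > 0) / len(trades)

def profit_factor(trades):
    """Gross profit divided by gross loss."""
    gross_profit = sum(t for t in trades if t > 0)
    gross_loss = -sum(t for t in trades if t < 0)
    return gross_profit / gross_loss

def max_drawdown(trades):
    """Largest peak-to-trough decline of the cumulative equity curve."""
    equity = peak = drawdown = 0.0
    for t in trades:
        equity += t
        peak = max(peak, equity)
        drawdown = max(drawdown, peak - equity)
    return drawdown

def average_gain(trades):
    """Average result per trade."""
    return sum(trades) / len(trades)

trades = [50, -20, 30, -40, 10]             # per-trade P&L, illustrative
print(percent_profitable(trades))           # 60.0
print(profit_factor(trades))                # 1.5
print(max_drawdown(trades))                 # 40.0
print(average_gain(trades))                 # 6.0
```

Comparing these numbers between the backtest and live runs is the comparison the live-testing stage performs.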
Market disruption and manipulation
Automated trading, or high-frequency trading, causes regulatory concerns as a contributor to market fragility. United States regulators have published releases discussing several types of risk controls that could be used to limit the extent of such disruptions, including financial and regulatory controls to prevent the entry of erroneous orders as a result of computer malfunction or human error, the breaching of various regulatory requirements, and exceeding a credit or capital limit.
The use of high-frequency trading (HFT) strategies has grown substantially over the past several years and drives a significant portion of activity on U.S. markets. Although many HFT strategies are legitimate, some are not and may be used for manipulative trading. A strategy is illegitimate, or even illegal, if it causes deliberate disruption in the market or tries to manipulate it. Such strategies include “momentum ignition strategies” and spoofing and layering, where a market participant places a non-bona fide order on one side of the market (typically, but not always, above the offer or below the bid) in an attempt to bait other market participants into reacting to the non-bona fide order and then trading with another order on the other side of the market. These are also referred to as predatory or abusive strategies. Given the scale of the potential impact these practices may have, the surveillance of abusive algorithms remains a high priority for regulators. The Financial Industry Regulatory Authority (FINRA) has reminded firms using HFT strategies and other trading algorithms of their obligation to be vigilant when testing these strategies pre- and post-launch to ensure that the strategies do not result in abusive trading.
FINRA also focuses on the entry of problematic HFT and algorithmic activity through sponsored participants who initiate their activity from outside of the United States. In this regard, FINRA reminds firms of their surveillance and control obligations under the SEC’s Market Access Rule and Notice to Members 04-66, as well as potential issues related to treating such accounts as customer accounts, anti-money laundering, and margin levels as highlighted in Regulatory Notice 10-18 and the SEC’s Office of Compliance Inspections and Examination’s National Exam Risk Alert dated September 29, 2011.
FINRA conducts surveillance to identify cross-market and cross-product manipulation of the price of underlying equity securities. Such manipulations are done typically through abusive trading algorithms or strategies that close out pre-existing option positions at favorable prices or establish new option positions at advantageous prices.
In recent years, there have been a number of algorithmic trading malfunctions that caused substantial market disruptions. These raise concerns about firms’ ability to develop, implement, and effectively supervise their automated systems. FINRA has stated that it will assess whether firms’ testing and controls related to algorithmic trading and other automated trading strategies are adequate in light of the U.S. Securities and Exchange Commission’s and firms’ supervisory obligations. This assessment may take the form of examinations and targeted investigations. Firms will be required to address whether they conduct separate, independent, and robust pre-implementation testing of algorithms and trading systems, and whether their legal, compliance, and operations staff review the design and development of the algorithms and trading systems for compliance with legal requirements. FINRA will review whether a firm actively monitors and reviews algorithms and trading systems once they are placed into production and after they have been modified, including the procedures and controls used to detect potential trading abuses such as wash sales, marking, layering, and momentum ignition strategies. Finally, firms will need to describe their approach to firm-wide disconnect or “kill” switches, as well as procedures for responding to catastrophic system malfunctions.
Examples of recent substantial market disruptions include the following:
On May 6, 2010, the Dow Jones Industrial Average declined about 1,000 points (about 9 percent) and recovered those losses within minutes. It was the second-largest point swing (1,010.14 points) and the largest one-day point decline (998.5 points) on an intraday basis in the Average’s history. This market disruption became known as the Flash Crash and resulted in U.S. regulators issuing new regulations to control market access achieved through automated trading.
On August 1, 2012, between 9:30 a.m. and 10:00 a.m. EDT, Knight Capital Group lost four times its 2011 net income. Knight’s CEO Thomas Joyce stated, on the day after the market disruption, that the firm had “all hands on deck” to fix a bug in one of Knight’s trading algorithms that submitted erroneous orders to exchanges for nearly 150 different stocks. Trading volume soared in so many issues that the SPDR S&P 500 ETF (SYMBOL: SPY), which is generally the most heavily traded U.S. security, became the 52nd-most traded stock on that day, according to Eric Hunsader, CEO of market data service Nanex. Knight shares closed down 62 percent as a result of the trading error and Knight Capital nearly collapsed. Knight ultimately reached an agreement to merge with Getco, a Chicago-based high-speed trading firm.
See also:
Day trading software
Technical analysis software
High-frequency trading (HFT) is a type of algorithmic financial trading characterized by high speeds, high turnover rates, and high order-to-trade ratios that leverages high-frequency financial data and electronic trading tools. While there is no single definition of HFT, among its key attributes are highly sophisticated algorithms, co-location, and very short-term investment horizons. HFT can be viewed as a primary form of algorithmic trading in finance. Specifically, it is the use of sophisticated technological tools and computer algorithms to rapidly trade securities. HFT uses proprietary trading strategies carried out by computers to move in and out of positions in seconds or fractions of a second.
In 2017, Aldridge and Krawciw estimated that in 2016 HFT on average initiated 10–40% of trading volume in equities and 10–15% of volume in foreign exchange and commodities. Intraday, however, the proportion of HFT may vary from 0% to 100% of short-term trading volume. Previous estimates, reporting that HFT accounted for 60–73% of all US equity trading volume, with that number falling to approximately 50% in 2012, were highly inaccurate speculative guesses. High-frequency traders move in and out of short-term positions at high volumes and high speeds, aiming to capture sometimes just a fraction of a cent in profit on every trade. HFT firms do not consume significant amounts of capital, accumulate positions, or hold their portfolios overnight. As a result, HFT has a potential Sharpe ratio (a measure of reward to risk) tens of times higher than traditional buy-and-hold strategies. High-frequency traders typically compete against other HFTs rather than long-term investors. HFT firms make up for their low margins with incredibly high volumes of trades, frequently numbering in the millions.
A substantial body of research argues that HFT and electronic trading pose new types of challenges to the financial system. Algorithmic and high-frequency traders were both found to have contributed to volatility in the Flash Crash of May 6, 2010, when high-frequency liquidity providers rapidly withdrew from the market. Several European countries have proposed curtailing or banning HFT due to concerns about volatility.
High-frequency trading has taken place at least since the 1930s, mostly in the form of specialists and pit traders buying and selling positions at the physical location of the exchange, with high-speed telegraph service to other exchanges.
Rapid-fire computer-based HFT has developed gradually since 1983, after NASDAQ introduced a purely electronic form of trading. At the turn of the 21st century, HFT trades had an execution time of several seconds, whereas by 2010 this had decreased to milli- and even microseconds. Until recently, high-frequency trading was a little-known topic outside the financial sector, with an article published by the New York Times in July 2009 being one of the first to bring the subject to the public’s attention.
On September 2, 2013, Italy became the world’s first country to introduce a tax specifically targeted at HFT, charging a levy of 0.02% on equity transactions lasting less than 0.5 seconds.
In the early 2000s, high-frequency trading still accounted for fewer than 10% of equity orders, but this proportion was soon to begin rapid growth. According to data from the NYSE, trading volume grew by about 164% between 2005 and 2009, for which high-frequency trading might account. As of the first quarter of 2009, total assets under management for hedge funds with high-frequency trading strategies were $141 billion, down about 21% from their peak before the worst of the crisis, although most of the largest HFTs are actually LLCs owned by a small number of investors. The high-frequency strategy was first made popular by Renaissance Technologies, who use both HFT and quantitative aspects in their trading. Many high-frequency firms are market makers and provide liquidity to the market, which lowers volatility and helps narrow bid–offer spreads, making trading and investing cheaper for other market participants.
In the United States in 2009, high-frequency trading firms represented 2% of the approximately 20,000 firms then operating, but accounted for 73% of all equity order volume. The major U.S. high-frequency trading firms include Virtu Financial, Tower Research Capital, IMC, Tradebot and Citadel LLC. The Bank of England estimates similar percentages for the 2010 US market share, also suggesting that in Europe HFT accounts for about 40% of equity order volume and for Asia about 5–10%, with potential for rapid growth. By value, HFT was estimated in 2010 by consultancy Tabb Group to make up 56% of equity trades in the US and 38% in Europe.
As HFT strategies become more widely used, it can be more difficult to deploy them profitably. According to an estimate from Frederi Viens of Purdue University, profits from HFT in the U.S. have been declining from an estimated peak of $5bn in 2009 to about $1.25bn in 2012.
Though the percentage of volume attributed to HFT has fallen in the equity markets, it has remained prevalent in the futures markets. According to a study in 2010 by Aite Group, about a quarter of major global futures volume came from professional high-frequency traders. In 2012, according to a study by the TABB Group, HFT accounted for more than 60 percent of all futures market volume on U.S. exchanges.
High-frequency trading is quantitative trading that is characterized by short portfolio holding periods. All portfolio-allocation decisions are made by computerized quantitative models. The success of high-frequency trading strategies is largely driven by their ability to simultaneously process large volumes of information, something ordinary human traders cannot do. Specific algorithms are closely guarded by their owners. Many practical algorithms are in fact quite simple arbitrages which could previously have been performed at lower frequency—competition tends to occur through who can execute them the fastest rather than who can create new breakthrough algorithms.
The common types of high-frequency trading include several types of market-making, event arbitrage, statistical arbitrage, and latency arbitrage. Most high-frequency trading strategies are not fraudulent, but instead exploit minute deviations from market equilibrium.
Main article: Market maker
According to the SEC:
A “market maker” is a firm that stands ready to buy and sell a particular stock on a regular and continuous basis at a publicly quoted price. You’ll most often hear about market makers in the context of the Nasdaq or other “over the counter” (OTC) markets. Market makers that stand ready to buy and sell stocks listed on an exchange, such as the New York Stock Exchange, are called “third market makers”. Many OTC stocks have more than one market-maker. Market-makers generally must be ready to buy and sell at least 100 shares of a stock they make a market in. As a result, a large order from an investor may have to be filled by a number of market-makers at potentially different prices.
There can be significant overlap between a “market maker” and an “HFT firm”. HFT firms characterize their business as “market making” – a set of high-frequency trading strategies that involve placing a limit order to sell (or offer) or a buy limit order (or bid) in order to earn the bid–ask spread. By doing so, market makers provide a counterpart to incoming market orders. Although the role of market maker was traditionally fulfilled by specialist firms, this class of strategy is now implemented by a large range of investors, thanks to the wide adoption of direct market access. As empirical studies have pointed out, this renewed competition among liquidity providers reduces effective market spreads, and therefore indirect costs for final investors. A crucial distinction is that true market makers do not exit the market at their discretion and are committed not to, whereas HFT firms are under no similar commitment.
Some high-frequency trading firms use market making as their primary strategy. Automated Trading Desk (ATD), which was bought by Citigroup in July 2007, has been an active market maker, accounting for about 6% of total volume on both the NASDAQ and the New York Stock Exchange. In May 2016, Citadel LLC bought assets of ATD from Citigroup. Building up market making strategies typically involves precise modeling of the target market microstructure together with stochastic control techniques.
These strategies appear intimately related to the entry of new electronic venues. Academic study of Chi-X’s entry into the European equity market reveals that its launch coincided with a large HFT that made markets using both the incumbent market, NYSE-Euronext, and the new market, Chi-X. The study shows that the new market provided ideal conditions for HFT market-making, low fees (i.e., rebates for quotes that led to execution) and a fast system, yet the HFT was equally active in the incumbent market to offload nonzero positions. New market entry and HFT arrival are further shown to coincide with a significant improvement in liquidity supply.
Further information: Quote stuffing
Quote stuffing is a form of market manipulation employed by high-frequency traders that involves quickly entering and withdrawing a large number of orders in an attempt to flood the market, creating confusion in the market and trading opportunities for high-frequency traders.
Ticker tape trading
For other uses, see Ticker tape (disambiguation).
Much information happens to be unwittingly embedded in market data, such as quotes and volumes. By observing a flow of quotes, computers are capable of extracting information that has not yet crossed the news screens. Since all quote and volume information is public, such strategies are fully compliant with all the applicable laws.
Filter trading is one of the more primitive high-frequency trading strategies that involves monitoring large amounts of stocks for significant or unusual price changes or volume activity. This includes trading on announcements, news, or other event criteria. Software would then generate a buy or sell order depending on the nature of the event being looked for.
Tick trading often aims to recognize the beginnings of large orders being placed in the market. For example, a large order from a pension fund to buy will take place over several hours or even days, and will cause a rise in price due to increased demand. An arbitrageur can try to spot this happening then buy up the security, then profit from selling back to the pension fund. This strategy has become more difficult since the introduction of dedicated trade execution companies in the 2000s which provide optimal trading for pension and other funds, specifically designed to remove the arbitrage opportunity.
Certain recurring events generate predictable short-term responses in a selected set of securities. High-frequency traders take advantage of such predictability to generate short-term profits.
Another set of high-frequency trading strategies are strategies that exploit predictable temporary deviations from stable statistical relationships among securities. Statistical arbitrage at high frequencies is actively used in all liquid securities, including equities, bonds, futures, foreign exchange, etc. Such strategies may also involve classical arbitrage strategies, such as covered interest rate parity in the foreign exchange market, which gives a relationship between the prices of a domestic bond, a bond denominated in a foreign currency, the spot price of the currency, and the price of a forward contract on the currency. High-frequency trading allows similar arbitrages using models of greater complexity involving many more than four securities.
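As an illustration of the covered-interest-rate-parity relationship mentioned above, the forward FX rate implied by the spot rate and the two interest rates can be computed directly; a persistent gap between the implied and quoted forward is the arbitrage signal. The use of simple (non-compounded) rates and all numbers below are assumptions for illustration:

```python
def implied_forward(spot, r_domestic, r_foreign, years=1.0):
    """Covered interest parity with simple rates:
    F = S * (1 + r_d * T) / (1 + r_f * T)."""
    return spot * (1 + r_domestic * years) / (1 + r_foreign * years)

spot = 1.10                                  # domestic units per foreign unit
fair_fwd = implied_forward(spot, r_domestic=0.05, r_foreign=0.02)
print(round(fair_fwd, 4))                    # 1.1324

quoted_fwd = 1.15                            # hypothetical market quote
mispricing = quoted_fwd - fair_fwd
print("arbitrage signal" if abs(mispricing) > 0.001 else "fair")
```

A high-frequency version of this idea monitors the four related instruments (two bonds, spot, forward) continuously and trades the moment the quoted forward drifts from the implied one.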
The TABB Group estimates that annual aggregate profits of high-frequency arbitrage strategies exceeded US$21 billion in 2009, although the Purdue study estimates the profits for all high frequency trading were US$5 billion in 2009.
Index arbitrage exploits index tracker funds which are bound to buy and sell large volumes of securities in proportion to their changing weights in indices. If a HFT firm is able to access and process information which predicts these changes before the tracker funds do so, they can buy up securities in advance of the trackers and sell them on to them at a profit.
Company news in electronic text format is available from many sources including commercial providers like Bloomberg, public news websites, and Twitter feeds. Automated systems can identify company names, keywords and sometimes semantics to make news-based trades before human traders can process the news.
A separate, “naïve” class of high-frequency trading strategies relies exclusively on ultra-low latency direct market access technology. In these strategies, computer scientists rely on speed to gain minuscule advantages in arbitraging price discrepancies in some particular security trading simultaneously on disparate markets.
Another aspect of low latency strategy has been the switch from fiber optic to microwave technology for long distance networking. Especially since 2011, there has been a trend to use microwaves to transmit data across key connections such as the one between New York City and Chicago. This is because microwaves travelling in air suffer a less than 1% speed reduction compared to light travelling in a vacuum, whereas with conventional fiber optics light travels over 30% slower.
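A back-of-the-envelope check of the speed claim above, assuming a straight-line New York–Chicago path of about 1,200 km and a fiber refractive index of roughly 1.47 (both figures are assumptions for illustration):

```python
C_KM_S = 299_792.458                 # speed of light in vacuum, km/s

def one_way_ms(distance_km, speed_factor):
    """One-way latency in milliseconds at a fraction of vacuum light speed."""
    return distance_km / (C_KM_S * speed_factor) * 1000

DISTANCE_KM = 1_200                  # assumed NYC–Chicago straight-line path
microwave_ms = one_way_ms(DISTANCE_KM, 0.99)      # air: <1% slower than vacuum
fiber_ms = one_way_ms(DISTANCE_KM, 1 / 1.47)      # glass: ~32% slower than vacuum
print(round(microwave_ms, 2), round(fiber_ms, 2))  # ~4.04 vs ~5.88
```

Under these assumptions the microwave route saves nearly two milliseconds each way, an eternity at HFT timescales.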
Order properties strategies
High-frequency trading strategies may use properties derived from market data feeds to identify orders that are posted at sub-optimal prices. Such orders may offer a profit to their counterparties that high-frequency traders can try to obtain. Examples of these features include the age of an order or the sizes of displayed orders. Tracking important order properties may also allow trading strategies to have a more accurate prediction of the future price of a security.
The effects of algorithmic and high-frequency trading are the subject of ongoing research. High frequency trading causes regulatory concerns as a contributor to market fragility. Regulators claim these practices contributed to volatility in the May 6, 2010 Flash Crash and find that risk controls are much less stringent for faster trades.
Members of the financial industry generally claim high-frequency trading substantially improves market liquidity, narrows bid-offer spread, lowers volatility and makes trading and investing cheaper for other market participants.
An academic study found that, for large-cap stocks and in quiescent markets during periods of “generally rising stock prices”, high-frequency trading lowers the cost of trading and increases the informativeness of quotes; however, it found “no significant effects for smaller-cap stocks”, and “it remains an open question whether algorithmic trading and algorithmic liquidity supply are equally beneficial in more turbulent or declining markets. …algorithmic liquidity suppliers may simply turn off their machines when markets spike downward.”
In September 2011, market data vendor Nanex LLC published a report stating the contrary. They looked at the amount of quote traffic compared to the value of trade transactions over four and a half years and saw a 10-fold decrease in efficiency. Nanex’s owner is an outspoken detractor of high-frequency trading. Many discussions about HFT focus solely on the frequency aspect of the algorithms and not on their decision-making logic (which is typically kept secret by the companies that develop them). This makes it difficult for observers to pre-identify market scenarios in which HFT will dampen or amplify price fluctuations. The growing quote traffic compared to trade value could indicate that more firms are trying to profit from cross-market arbitrage techniques that do not add significant value through increased liquidity when measured globally.
More fully automated markets such as NASDAQ, Direct Edge, and BATS, in the US, gained market share from less automated markets such as the NYSE. Economies of scale in electronic trading contributed to lowering commissions and trade processing fees, and contributed to international mergers and consolidation of financial exchanges.
The speeds of computer connections, measured in milliseconds or microseconds, have become important. Competition is developing among exchanges for the fastest processing times for completing trades. For example, in 2009 the London Stock Exchange bought a technology firm called MillenniumIT and announced plans to implement its Millennium Exchange platform, which they claim has an average latency of 126 microseconds. This allows sub-millisecond-resolution timestamping of the order book. Off-the-shelf software currently allows nanosecond-resolution timestamping using a GPS clock with 100-nanosecond precision.
Spending on computers and software in the financial industry increased to $26.4 billion in 2005.
May 6, 2010 Flash Crash
Main article: 2010 Flash Crash
The brief but dramatic stock market crash of May 6, 2010 was initially thought to have been caused by high-frequency trading. The Dow Jones Industrial Average plunged to its largest intraday point loss, but not percentage loss, in history, only to recover much of those losses within minutes.
In the aftermath of the crash, several organizations argued that high-frequency trading was not to blame, and may even have been a major factor in minimizing and partially reversing the Flash Crash. CME Group, a large futures exchange, stated that, insofar as stock index futures traded on CME Group were concerned, its investigation had found no support for the notion that high-frequency trading was related to the crash, and actually stated it had a market stabilizing effect.
However, after almost five months of investigations, the U.S. Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC) issued a joint report identifying the cause that set off the sequence of events leading to the Flash Crash and concluding that the actions of high-frequency trading firms contributed to volatility during the crash.
The report found that the cause was a single sale of $4.1 billion in futures contracts by a mutual fund, identified as Waddell & Reed Financial, in an aggressive attempt to hedge its investment position. The joint report also found that “high-frequency traders quickly magnified the impact of the mutual fund’s selling.” The joint report “portrayed a market so fragmented and fragile that a single large trade could send stocks into a sudden spiral”, that a large mutual fund firm “chose to sell a big number of futures contracts using a computer program that essentially ended up wiping out available buyers in the market”, that as a result high-frequency firms “were also aggressively selling the E-mini contracts”, contributing to rapid price declines. The joint report also noted “HFTs began to quickly buy and then resell contracts to each other – generating a ‘hot-potato’ volume effect as the same positions were passed rapidly back and forth.” The combined sales by Waddell and high-frequency firms quickly drove “the E-mini price down 3% in just four minutes”. As prices in the futures market fell, there was a spillover into the equities markets where “the liquidity in the market evaporated because the automated systems used by most firms to keep pace with the market paused” and scaled back their trading or withdrew from the markets altogether. The joint report then noted that “Automatic computerized traders on the stock market shut down as they detected the sharp rise in buying and selling.” As computerized high-frequency traders exited the stock market, the resulting lack of liquidity “…caused shares of some prominent companies like Procter & Gamble and Accenture to trade down as low as a penny or as high as $100,000”. While some firms exited the market, high-frequency firms that remained in the market exacerbated price declines because they “‘escalated their aggressive selling’ during the downdraft”. 
In the years following the flash crash, academic researchers and experts from the CFTC pointed to high-frequency trading as just one component of the complex current U.S. market structure that led to the events of May 6, 2010.
Granularity and accuracy
In 2015 the Paris-based regulator of the 28-nation European Union, the European Securities and Markets Authority, proposed EU-wide time standards that would more accurately synchronize trading clocks “to within a nanosecond, or one-billionth of a second” to refine regulation of gateway-to-gateway latency, “the speed at which trading venues acknowledge an order after receiving a trade request”. Using these more detailed time stamps, regulators would be better able to distinguish the order in which trade requests are received and executed, to identify market abuse and prevent potential manipulation of European securities markets by traders using advanced, powerful, fast computers and networks. The fastest technologies give traders an advantage over other, “slower” investors, as they can change the prices of the securities they trade.
Risks and controversy
According to author Walter Mattli, the ability of regulators to enforce the rules has greatly declined since 2005, with the passing of the Regulation National Market System (Reg NMS) by the US Securities and Exchange Commission. As a result, the NYSE’s quasi-monopoly role as a stock rule maker was undermined, and the stock exchange became one of many globally operating exchanges. The market then became more fractured and granular, as did the regulatory bodies, and since stock exchanges had turned into entities also seeking to maximize profits, those with the most lenient regulators were rewarded, and oversight of traders’ activities was lost. This fragmentation has greatly benefitted HFT.
High-frequency trading comprises many different types of algorithms. Various studies have reported that certain types of market-making high-frequency trading reduce volatility, do not pose a systemic risk, and lower transaction costs for retail investors, without impacting long-term investors. Other studies, summarized in Aldridge and Krawciw (2017), find that high-frequency trading strategies known as “aggressive” erode liquidity and cause volatility.
High-frequency trading has been the subject of intense public focus and debate since the May 6, 2010 Flash Crash. At least one Nobel Prize–winning economist, Michael Spence, believes that HFT should be banned. A working paper found “the presence of high frequency trading has significantly mitigated the frequency and severity of end-of-day price dislocation”.
In their joint report on the 2010 Flash Crash, the SEC and the CFTC stated that “market makers and other liquidity providers widened their quote spreads, others reduced offered liquidity, and a significant number withdrew completely from the markets” during the flash crash.
Politicians, regulators, scholars, journalists and market participants have all raised concerns on both sides of the Atlantic. This has led to discussion of whether high-frequency market makers should be subject to various kinds of regulations.
In a September 22, 2010 speech, SEC chairperson Mary Schapiro signaled that US authorities were considering the introduction of regulations targeted at HFT. She said, “high frequency trading firms have a tremendous capacity to affect the stability and integrity of the equity markets. Currently, however, high frequency trading firms are subject to very little in the way of obligations either to protect that stability by promoting reasonable price continuity in tough times, or to refrain from exacerbating price volatility.” She proposed regulation that would require high-frequency traders to stay active in volatile markets. A later SEC chair Mary Jo White pushed back against claims that high-frequency traders have an inherent benefit in the markets. SEC associate director Gregg Berman suggested that the current debate over HFT lacks perspective. In an April 2014 speech, Berman argued: “It’s much more than just the automation of quotes and cancels, in spite of the seemingly exclusive fixation on this topic by much of the media and various outspoken market pundits. (…) I worry that it may be too narrowly focused and myopic.”
The Chicago Federal Reserve letter of October 2012, titled “How to keep markets safe in an era of high-speed trading”, reports on the results of a survey of several dozen financial industry professionals including traders, brokers, and exchanges. It found that
risk controls were poorer in high-frequency trading, because of competitive time pressure to execute trades without the more extensive safety checks normally used in slower trades.
“some firms do not have stringent processes for the development, testing, and deployment of code used in their trading algorithms.”
“out-of-control algorithms were more common than anticipated prior to the study and that there were no clear patterns as to their cause.”
The CFA Institute, a global association of investment professionals, advocated for reforms regarding high-frequency trading, including:
Promoting robust internal risk management procedures and controls over the algorithms and strategies employed by HFT firms.
Trading venues should disclose their fee structure to all market participants.
Regulators should address market manipulation and other threats to the integrity of markets, regardless of the underlying mechanism, and not try to intervene in the trading process or to restrict certain types of trading activities.
Exchanges offered a type of order called a “Flash” order on NASDAQ (called “Bolt” on the Bats stock exchange) that allowed an order to lock the market (post at the same price as an order on the other side of the book[clarification needed]) for a small amount of time (5 milliseconds). This order type was available to all participants, but since HFTs adapted to changes in market structure more quickly than others, they were able to use it to “jump the queue” and place their orders before other order types were allowed to trade at the given price. Currently, the majority of exchanges do not offer flash trading, or have discontinued it. By March 2011, the NASDAQ, BATS, and Direct Edge exchanges had all ceased offering their Competition for Price Improvement functionality (widely referred to as “flash technology/trading”).
On September 24, 2013, the Federal Reserve revealed that some traders were under investigation for a possible news leak and insider trading. The anti-HFT firm NANEX claimed that right after the Federal Reserve announced its newest decision, trades were registered in the Chicago futures market within two milliseconds. However, the news was released to the public in Washington D.C. at exactly 2:00 pm, as calibrated by atomic clock, and takes 3.19 milliseconds to reach Chicago at the speed of light in a straight line and about 7 milliseconds in practice. Much of the controversy revolved around the use of inappropriate time stamps taken from the SIP (the consolidated quote feed, which is necessarily slow) and the amount of “jitter” that can occur when examining such granular timings.
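The light-travel-time claim above is easy to verify with a short calculation; the straight-line distance used below (about 956 km between Washington, D.C. and Chicago) is an approximate figure assumed for illustration.

```python
# Light travel time from Washington, D.C. to Chicago in a straight line.
# The ~956 km distance is an approximate figure assumed for illustration.
SPEED_OF_LIGHT_M_PER_S = 299_792_458   # speed of light in vacuum, m/s
DC_TO_CHICAGO_M = 956_000              # approximate straight-line distance, m

travel_time_ms = DC_TO_CHICAGO_M / SPEED_OF_LIGHT_M_PER_S * 1_000
print(f"{travel_time_ms:.2f} ms")      # -> 3.19 ms, matching the figure cited above
```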
Violations and fines
Regulation and enforcement
See also: Regulation of algorithms
In March 2012, regulators fined Octeg LLC, the equities market-making unit of high-frequency trading firm Getco LLC, $450,000. Octeg violated Nasdaq rules and failed to maintain proper supervision over its stock trading activities. The fine resulted from a request by Nasdaq OMX for regulators to investigate the activity at Octeg LLC from the day after the May 6, 2010 Flash Crash through the following December. Nasdaq determined the Getco subsidiary lacked reasonable oversight of its algo-driven high-frequency trading.
In October 2013, regulators fined Knight Capital $12 million for the trading malfunction that led to its collapse. Knight was found to have violated the SEC’s market access rule, in effect since 2010 to prevent such mistakes. Regulators stated the HFT firm ignored dozens of error messages before its computers sent millions of unintended orders to the market. Knight Capital eventually merged with Getco to form KCG Holdings. Knight lost over $460 million from its trading errors in August 2012 that caused disturbance in the U.S. stock market.
In September 2014, HFT firm Latour Trading LLC agreed to pay an SEC penalty of $16 million. Latour is a subsidiary of New York-based high-frequency trader Tower Research Capital LLC. According to the SEC’s order, for at least two years Latour underestimated the amount of risk it was taking on with its trading activities. By using faulty calculations, Latour managed to buy and sell stocks without holding enough capital. At times, the Tower Research Capital subsidiary accounted for 9% of all U.S. stock trading. The SEC noted the case was the largest penalty for a violation of the net capital rule.
In response to increased regulation, such as by FINRA, some have argued that instead of promoting government intervention, it would be more efficient to focus on a solution that mitigates information asymmetries among traders and their backers; others argue that regulation does not go far enough. In 2018, the European Union introduced the MiFID II/MiFIR regulation.
On January 12, 2015, the SEC announced a $14 million penalty against a subsidiary of BATS Global Markets, an exchange operator that was founded by high-frequency traders. The BATS subsidiary Direct Edge failed to properly disclose order types on its two exchanges EDGA and EDGX. These exchanges offered three variations of controversial “Hide Not Slide” orders and failed to accurately describe their priority to other orders. The SEC found the exchanges disclosed complete and accurate information about the order types “only to some members, including certain high-frequency trading firms that provided input about how the orders would operate”. The complaint was made in 2011 by Haim Bodek.
Reported in January 2015, UBS agreed to pay $14.4 million to settle charges of not disclosing an order type that allowed high-frequency traders to jump ahead of other participants. The SEC stated that UBS failed to properly disclose to all subscribers of its dark pool “the existence of an order type that it pitched almost exclusively to market makers and high-frequency trading firms”. UBS broke the law by accepting and ranking hundreds of millions of orders priced in increments of less than one cent, which is prohibited under Regulation NMS. The order type called PrimaryPegPlus enabled HFT firms “to place sub-penny-priced orders that jumped ahead of other orders submitted at legal, whole-penny prices”.
Main article: Quote stuffing
In June 2014, high-frequency trading firm Citadel LLC was fined $800,000 for violations that included quote stuffing. Nasdaq’s disciplinary action stated that Citadel “failed to prevent the strategy from sending millions of orders to the exchanges with few or no executions”. It was pointed out that Citadel “sent multiple, periodic bursts of order messages, at 10,000 orders per second, to the exchanges. This excessive messaging activity, which involved hundreds of thousands of orders for more than 19 million shares, occurred two to three times per day.”
Spoofing and layering
Main articles: Spoofing (finance) and Layering (finance)
In July 2013, it was reported that Panther Energy Trading LLC was ordered to pay $4.5 million to U.S. and U.K. regulators on charges that the firm’s high-frequency trading activities manipulated commodity markets. Panther’s computer algorithms placed and quickly canceled bids and offers in futures contracts including oil, metals, interest rates and foreign currencies, the U.S. Commodity Futures Trading Commission said. In October 2014, Panther’s sole owner Michael Coscia was charged with six counts of commodities fraud and six counts of “spoofing”. The indictment stated that Coscia devised a high-frequency trading strategy to create a false impression of the available liquidity in the market, “and to fraudulently induce other market participants to react to the deceptive market information he created”.
On November 7, 2019, it was reported that Tower Research Capital LLC was ordered to pay $67.4 million in fines to the CFTC to settle allegations that three former traders at the firm engaged in spoofing from at least March 2012 through December 2013. The New York-based firm entered into a deferred prosecution agreement with the Justice Department.
Main article: Market manipulation
In October 2014, Athena Capital Research LLC was fined $1 million on price manipulation charges. The high-speed trading firm used $40 million to rig prices of thousands of stocks, including eBay Inc, according to U.S. regulators. The HFT firm Athena manipulated closing prices commonly used to track stock performance with “high-powered computers, complex algorithms and rapid-fire trades”, the SEC said. The regulatory action is one of the first market manipulation cases against a firm engaged in high-frequency trading. Reporting by Bloomberg noted the HFT industry is “besieged by accusations that it cheats slower investors”.
Advanced trading platforms
Advanced computerized trading platforms and market gateways are becoming standard tools of most types of traders, including high-frequency traders. Broker-dealers now compete on routing order flow directly, in the fastest and most efficient manner, to the line handler where it undergoes a strict set of risk filters before hitting the execution venue(s). Ultra-low latency direct market access (ULLDMA) is a hot topic amongst brokers and technology vendors such as Goldman Sachs, Credit Suisse, and UBS. Typically, ULLDMA systems can currently handle high amounts of volume and boast round-trip order execution speeds (from hitting “transmit order” to receiving an acknowledgment) of 10 milliseconds or less.
Such performance is achieved with the use of hardware acceleration or even full-hardware processing of incoming market data, in association with high-speed communication protocols, such as 10 Gigabit Ethernet or PCI Express. More specifically, some companies provide full-hardware appliances based on FPGA technology to obtain sub-microsecond end-to-end market data processing.
Buy-side traders have made efforts to curb predatory HFT strategies. Brad Katsuyama, co-founder of IEX, led a team that implemented THOR, a securities order-management system that splits large orders into smaller sub-orders that arrive at all the exchanges at the same time through the use of intentional delays. This largely prevents the information leakage in the propagation of orders that high-speed traders can take advantage of. In 2016, after having failed, together with Intercontinental Exchange Inc. and others, to prevent SEC approval of IEX’s launch, and having declined to sue over the approval as it had threatened to do, Nasdaq launched a “speed bump” product of its own to compete with IEX. According to Nasdaq CEO Robert Greifeld, “the regulator shouldn’t have approved IEX without changing the rules that required quotes to be immediately visible”. The IEX speed bump, or trading slowdown, is 350 microseconds, which the SEC ruled was within the “immediately visible” parameter. The slowdown promises to impede HFTs’ ability “often [to] cancel dozens of orders for every trade they make”.
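The intentional-delay idea behind THOR can be sketched as follows. This is not IEX’s actual implementation; the venue names and one-way latencies are hypothetical illustration values.

```python
# THOR-style synchronized routing sketch: hold each child order back so that
# all of them arrive at their venues at roughly the same time, leaving no
# window for faster traders to react to the first fill.
# Venue names and one-way latencies (microseconds) are hypothetical.
venue_latency_us = {"VENUE_A": 350, "VENUE_B": 520, "VENUE_C": 410}

def send_delays(latencies):
    """Delay each order by (slowest venue latency - its own venue latency)."""
    slowest = max(latencies.values())
    return {venue: slowest - lat for venue, lat in latencies.items()}

delays = send_delays(venue_latency_us)
# The order to the slowest venue is sent immediately; faster routes wait.
print(delays)  # {'VENUE_A': 170, 'VENUE_B': 0, 'VENUE_C': 110}
```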
Outside of US equities, several notable spot foreign exchange (FX) trading platforms, including ParFX, EBS Market, and Thomson Reuters Matching, have implemented their own “speed bumps” to curb or otherwise limit HFT activity. Unlike the IEX fixed-length delay, which retains the temporal ordering of messages as they are received by the platform, the spot FX platforms’ speed bumps reorder messages, so the first message received is not necessarily the first processed for matching. In short, the spot FX platforms’ speed bumps seek to reduce the benefit of a participant being faster than others, as has been described in various academic papers.
Key words to search
Complex event processing
Erlang (programming language) used by Goldman Sachs
Pump and dump
Algorithmic trading is a method of executing orders using automated pre-programmed trading instructions accounting for variables such as time, price, and volume. This type of trading was developed to make use of the speed and data processing advantages that computers have over human traders. Popular “algos” include Percentage of Volume, Pegged, VWAP, TWAP, Implementation shortfall and Target close. In the twenty-first century, algorithmic trading has been gaining traction with both retail and institutional traders. It is widely used by investment banks, pension funds, mutual funds, and hedge funds that may need to spread out the execution of a larger order or perform trades too fast for human traders to react to. A study in 2016 showed that over 80% of trading in the FOREX market was performed by trading algorithms rather than humans.
The term algorithmic trading is often used synonymously with automated trading system. These encompass trading strategies such as black box trading and Quantitative, or Quant, trading that are heavily reliant on complex mathematical formulas and high-speed computer programs.
Such systems run strategies including market making, inter-market spreading, arbitrage, or pure speculation such as trend following. Many fall into the category of high-frequency trading (HFT), which is characterized by high turnover and high order-to-trade ratios. HFT strategies utilize computers that make elaborate decisions to initiate orders based on information that is received electronically, before human traders are capable of processing the information they observe. As a result, in February 2012, the Commodity Futures Trading Commission (CFTC) formed a special working group that included academics and industry experts to advise the CFTC on how best to define HFT. Algorithmic trading and HFT have resulted in a dramatic change of the market microstructure, particularly in the way liquidity is provided.
Computerization of the order flow in financial markets began in the early 1970s, when the New York Stock Exchange introduced the “designated order turnaround” system (DOT). SuperDOT was introduced in 1984 as an upgraded version of DOT. Both systems allowed for the routing of orders electronically to the proper trading post. The “opening automated reporting system” (OARS) aided the specialist in determining the market-clearing opening price.
With the rise of fully electronic markets came the introduction of program trading, which is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over US$1 million total. In practice, program trades were pre-programmed to automatically enter or exit trades based on various factors. In the 1980s, program trading became widely used in trading between the S&P 500 equity and futures markets in a strategy known as index arbitrage.
At about the same time portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black–Scholes option pricing model.
Both strategies, often simply lumped together as “program trading”, were blamed by many people (for example by the Brady report) for exacerbating or even starting the 1987 stock market crash. Yet the impact of computer driven trading on stock market crashes is unclear and widely discussed in the academic community.
Refinement and growth
The financial landscape was changed again with the emergence of electronic communication networks (ECNs) in the 1990s, which allowed for trading of stock and currencies outside of traditional exchanges. In the U.S., decimalization changed the minimum tick size from 1/16 of a dollar (US$0.0625) to US$0.01 per share in 2001, and may have encouraged algorithmic trading as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers’ trading advantage, thus increasing market liquidity.
This increased market liquidity led to institutional traders splitting up orders according to computer algorithms so they could execute orders at a better average price. These average price benchmarks are measured and calculated by computers by applying the time-weighted average price or more usually by the volume-weighted average price.
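The two benchmarks can be computed from a window of (price, volume) trade records; a minimal sketch with made-up numbers:

```python
# Minimal sketch of the two benchmark prices used to evaluate executions.
# trades: (price, volume) records over the measurement window (made-up values).
trades = [(10.00, 200), (10.05, 100), (10.10, 300)]

# TWAP: simple average of observed prices, each interval weighted equally.
twap = sum(p for p, _ in trades) / len(trades)

# VWAP: average price weighted by traded volume, so large prints dominate.
vwap = sum(p * v for p, v in trades) / sum(v for _, v in trades)

print(round(twap, 4))  # 10.05
print(round(vwap, 4))  # 10.0583
```

An execution algorithm that slices a parent order through the day is then judged by how close its average fill price comes to one of these benchmarks.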
It is over. The trading that existed down the centuries has died. We have an electronic market today. It is the present. It is the future.
Robert Greifeld, NASDAQ CEO, April 2011
A further encouragement for the adoption of algorithmic trading in the financial markets came in 2001 when a team of IBM researchers published a paper at the International Joint Conference on Artificial Intelligence where they showed that in experimental laboratory versions of the electronic auctions used in the financial markets, two algorithmic strategies (IBM’s own MGD, and Hewlett-Packard’s ZIP) could consistently outperform human traders. MGD was a modified version of the “GD” algorithm invented by Steven Gjerstad and John Dickhaut in 1996–97; the ZIP algorithm had been invented at HP by professor Dave Cliff in 1996. In their paper, the IBM team wrote that the financial impact of their results showing MGD and ZIP outperforming human traders “…might be measured in billions of dollars annually”; the IBM paper generated international media coverage.
In 2005, the Regulation National Market System was put in place by the SEC to strengthen the equity market. This changed the way firms traded with rules such as the Trade Through Rule, which mandates that market orders must be posted and executed electronically at the best available price, thus preventing brokerages from profiting from the price differences when matching buy and sell orders.
As more electronic markets opened, other algorithmic trading strategies were introduced. These strategies are more easily implemented by computers, because machines can react more rapidly to temporary mispricing and examine prices from several markets simultaneously. Chameleon (developed by BNP Paribas), Stealth (developed by the Deutsche Bank), Sniper and Guerilla (developed by Credit Suisse), arbitrage, statistical arbitrage, trend following, and mean reversion are examples of algorithmic trading strategies.
Profitability projections by the TABB Group, a financial services industry research firm, put the US equities HFT industry at US$1.3 billion before expenses for 2014, down significantly from the 2008 peak of US$21 billion in profits taken in by the 300 securities firms and hedge funds that then specialized in this type of trading, a figure the authors had at the time called “relatively small” and “surprisingly modest” when compared with the market’s overall trading volume. In March 2014, Virtu Financial, a high-frequency trading firm, reported that over five years the firm as a whole was profitable on 1,277 out of 1,278 trading days, losing money on just one day, demonstrating the possible benefit of making thousands to millions of trades every trading day.
Algorithmic trading. Percentage of market volume.
A third of all European Union and United States stock trades in 2006 were driven by automatic programs, or algorithms. As of 2009, studies suggested HFT firms accounted for 60–73% of all US equity trading volume, with that number falling to approximately 50% in 2012. In 2006, at the London Stock Exchange, over 40% of all orders were entered by algorithmic traders, with 60% predicted for 2007. American markets and European markets generally have a higher proportion of algorithmic trades than other markets, and estimates for 2008 range as high as an 80% proportion in some markets. Foreign exchange markets also have active algorithmic trading, measured at about 80% of orders in 2016 (up from about 25% of orders in 2006). Futures markets are considered fairly easy to integrate into algorithmic trading, with about 20% of options volume expected to be computer-generated by 2010.[needs update] Bond markets are moving toward more access to algorithmic traders.
Algorithmic trading and HFT have been the subject of much public debate since the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission said in reports that an algorithmic trade entered by a mutual fund company triggered a wave of selling that led to the 2010 Flash Crash. The same reports found HFT strategies may have contributed to subsequent volatility by rapidly pulling liquidity from the market. As a result of these events, the Dow Jones Industrial Average suffered its second largest intraday point swing ever to that date, though prices quickly recovered. (See List of largest daily changes in the Dow Jones Industrial Average.) A July 2011 report by the International Organization of Securities Commissions (IOSCO), an international body of securities regulators, concluded that while “algorithms and HFT technology have been used by market participants to manage their trading and risk, their usage was also clearly a contributing factor in the flash crash event of May 6, 2010.” However, other researchers have reached a different conclusion. One 2010 study found that HFT did not significantly alter trading inventory during the Flash Crash. Some algorithmic trading ahead of index fund rebalancing transfers profits from investors.
Trading ahead of index fund rebalancing
Most retirement savings, such as private pension funds or 401(k) and individual retirement accounts in the US, are invested in mutual funds, the most popular of which are index funds which must periodically “rebalance” or adjust their portfolio to match the new prices and market capitalization of the underlying securities in the stock or other index that they track. Profits are transferred from passive index investors to active investors, some of whom are algorithmic traders specifically exploiting the index rebalance effect. The magnitude of these losses incurred by passive investors has been estimated at 21–28bp per year for the S&P 500 and 38–77bp per year for the Russell 2000. John Montgomery of Bridgeway Capital Management says that the resulting “poor investor returns” from trading ahead of mutual funds is “the elephant in the room” that “shockingly, people are not talking about”.
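In dollar terms those basis-point estimates are easy to translate; a quick check on a hypothetical $100,000 position (one basis point is 0.01%):

```python
# Converting the cited basis-point (bp) estimates into annual dollar cost on
# a hypothetical $100,000 index-fund position. 1 bp = 0.01% = 1/10,000.
position = 100_000

sp500_cost = [position * bp / 10_000 for bp in (21, 28)]      # S&P 500 trackers
russell_cost = [position * bp / 10_000 for bp in (38, 77)]    # Russell 2000 trackers

print(sp500_cost)    # [210.0, 280.0] dollars per year
print(russell_cost)  # [380.0, 770.0] dollars per year
```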
Pairs trading or pair trading is a long-short, ideally market-neutral strategy enabling traders to profit from transient discrepancies in relative value of close substitutes. Unlike in the case of classic arbitrage, in case of pairs trading, the law of one price cannot guarantee convergence of prices. This is especially true when the strategy is applied to individual stocks – these imperfect substitutes can in fact diverge indefinitely. In theory the long-short nature of the strategy should make it work regardless of the stock market direction. In practice, execution risk, persistent and large divergences, as well as a decline in volatility can make this strategy unprofitable for long periods of time (e.g. 2004-2007). It belongs to wider categories of statistical arbitrage, convergence trading, and relative value strategies.
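A common implementation signals on the z-score of the spread between the two legs. The sketch below uses hypothetical prices and a fixed hedge ratio of 2; in practice the hedge ratio is usually estimated by regression, and the entry threshold is a tunable parameter.

```python
# Pairs-trading sketch: trade when the spread between two close substitutes
# deviates from its recent mean by more than a z-score threshold.
# Prices and the hedge ratio of 2 are hypothetical illustration values.
from statistics import mean, stdev

prices_a = [100.0, 101.0, 102.0, 101.5, 103.0, 108.0]
prices_b = [50.0, 50.6, 51.1, 50.7, 51.4, 51.6]

spread = [a - 2 * b for a, b in zip(prices_a, prices_b)]
history, latest = spread[:-1], spread[-1]

# z-score of the latest spread relative to its recent history.
z = (latest - mean(history)) / stdev(history)

if z > 2:
    signal = "short A / long B"   # spread unusually wide: bet on convergence
elif z < -2:
    signal = "long A / short B"   # spread unusually narrow
else:
    signal = "no trade"
```

Note that, as the surrounding text warns, the two legs can diverge for long periods, so a large z-score is a statistical signal rather than a guaranteed convergence.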
In finance, delta-neutral describes a portfolio of related financial securities, in which the portfolio value remains unchanged due to small changes in the value of the underlying security. Such a portfolio typically contains options and their corresponding underlying securities such that positive and negative delta components offset, resulting in the portfolio’s value being relatively insensitive to changes in the value of the underlying security.
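The offsetting-delta idea can be shown with a small worked example; the contract count and option delta below are hypothetical figures chosen for round numbers.

```python
# Delta-neutral hedge sketch (hypothetical figures): offset the options'
# aggregate delta with a position in the underlying (delta = 1 per share).
option_contracts = 100      # long call contracts
shares_per_contract = 100   # US equity option multiplier
option_delta = 0.5          # delta of each call

# Aggregate delta of the option position, in share-equivalents.
position_delta = option_contracts * shares_per_contract * option_delta

# Short this many shares so positive and negative deltas offset.
hedge_shares = -position_delta

net_delta = position_delta + hedge_shares
print(net_delta)  # 0.0 -> small moves in the underlying leave the portfolio value roughly unchanged
```

Because delta itself changes as the underlying moves, such a hedge must be rebalanced periodically to stay neutral.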
In economics and finance, arbitrage /ˈɑːrbɪtrɑːʒ/ is the practice of taking advantage of a price difference between two or more markets: striking a combination of matching deals that capitalize upon the imbalance, the profit being the difference between the market prices. When used by academics, an arbitrage is a transaction that involves no negative cash flow at any probabilistic or temporal state and a positive cash flow in at least one state; in simple terms, it is the possibility of a risk-free profit at zero cost. For example, one of the most popular arbitrage trades is played between S&P 500 futures and the S&P 500 stocks. During most trading days, the two develop a pricing disparity: the prices of the stocks, which trade mostly on the NYSE and NASDAQ markets, get ahead of or behind the S&P futures, which trade on the CME market.
Conditions for arbitrage
Arbitrage is possible when one of three conditions is met:
The same asset does not trade at the same price on all markets (the “law of one price” is temporarily violated).
Two assets with identical cash flows do not trade at the same price.
An asset with a known price in the future does not today trade at its future price discounted at the risk-free interest rate (or, the asset does not have negligible costs of storage; as such, for example, this condition holds for grain but not for securities).
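The third condition can be checked numerically; the sketch below assumes hypothetical figures (a one-year horizon, a 5% risk-free rate, and zero storage costs) purely for illustration.

```python
# Sketch of the third arbitrage condition: an asset with a known future price
# should trade today at that price discounted at the risk-free rate.
# All figures are hypothetical; storage costs are assumed to be zero.
known_future_price = 105.00   # price the asset is known to have in one year
risk_free_rate = 0.05         # annual risk-free interest rate
spot_price = 98.00            # price the asset trades at today

fair_spot = known_future_price / (1 + risk_free_rate)  # discounted future price

if spot_price < fair_spot:
    action = "buy spot, lock in the future sale"       # riskless gain available
elif spot_price > fair_spot:
    action = "sell spot, lock in the future purchase"
else:
    action = "no arbitrage"
```

Here the fair spot price is 100.00, so buying at 98.00 today and delivering at 105.00 in a year earns more than the risk-free rate, which is exactly the mispricing the condition describes.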
Arbitrage is not simply the act of buying a product in one market and selling it in another for a higher price at some later time. The long and short transactions should ideally occur simultaneously to minimize exposure to market risk, the risk that prices may change in one market before both transactions are complete. In practical terms, this is generally only possible with securities and financial products that can be traded electronically, and even then, when the first leg(s) of the trade are executed, the prices in the other legs may have worsened, locking in a guaranteed loss. Missing one of the legs of the trade (and subsequently having to open it at a worse price) is called ‘execution risk’ or, more specifically, ‘leg-in and leg-out risk’.[a]
In the simplest example, any good sold in one market should sell for the same price in another. Traders may, for example, find that the price of wheat is lower in agricultural regions than in cities, purchase the good, and transport it to another region to sell at a higher price. This type of price arbitrage is the most common, but this simple example ignores the cost of transport, storage, risk, and other factors. “True” arbitrage requires that there be no market risk involved. Where securities are traded on more than one exchange, arbitrage occurs by simultaneously buying on one and selling on the other. Such simultaneous execution, if perfect substitutes are involved, minimizes capital requirements, but in practice never creates a “self-financing” (free) position, as many sources, following the theory, incorrectly assume. As long as there is some difference in the market value and riskiness of the two legs, capital would have to be put up in order to carry the long-short arbitrage position.
Mean reversion is a mathematical methodology sometimes used for stock investing, but it can be applied to other processes. In general terms the idea is that both a stock’s high and low prices are temporary, and that a stock’s price tends toward an average price over time. An example of a mean-reverting process is the Ornstein–Uhlenbeck stochastic process.
Mean reversion involves first identifying the trading range for a stock, and then computing the average price using analytical techniques that relate to the stock’s assets, earnings, and so on.
When the current market price is less than the average price, the stock is considered attractive for purchase, with the expectation that the price will rise. When the current market price is above the average price, the market price is expected to fall. In other words, deviations from the average price are expected to revert to the average.
The standard deviation of the most recent prices (e.g., the last 20) is often used as a buy or sell indicator.
Stock reporting services (such as Yahoo! Finance, MS Investor, Morningstar, etc.) commonly offer moving averages for periods such as 50 and 100 days. While reporting services provide the averages, identifying the high and low prices for the study period is still necessary.
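The mean-reversion rule described above can be sketched in a few lines. This is an illustrative toy, not a production signal; the 20-price window and one-standard-deviation threshold are arbitrary choices:

```python
import statistics

def mean_reversion_signal(prices, window=20, threshold=1.0):
    """Compare the latest price to the rolling mean of the last
    `window` prices, in standard-deviation units.  Returns 'buy'
    when the price is unusually low (expected to revert upward),
    'sell' when unusually high, and 'hold' otherwise."""
    recent = prices[-window:]
    mean = statistics.fmean(recent)
    stdev = statistics.stdev(recent)
    z = (prices[-1] - mean) / stdev  # deviation from the average price
    if z < -threshold:
        return "buy"
    if z > threshold:
        return "sell"
    return "hold"
```

For example, a price of 90 after nineteen prints at 100 sits several standard deviations below the rolling mean, so the rule signals a purchase.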
Scalping is liquidity provision by non-traditional market makers, whereby traders attempt to earn (or make) the bid-ask spread. This procedure allows for profit as long as price moves are less than this spread, and it normally involves establishing and liquidating a position quickly, usually within minutes or less.
A market maker is basically a specialized scalper. The volume a market maker trades is many times that of the average individual scalper, and market makers make use of more sophisticated trading systems and technology. However, registered market makers are bound by exchange rules stipulating their minimum quote obligations. For instance, NASDAQ requires each market maker to post at least one bid and one ask at some price level, so as to maintain a two-sided market for each stock represented.
Transaction cost reduction
Most strategies referred to as algorithmic trading (as well as algorithmic liquidity-seeking) fall into the cost-reduction category. The basic idea is to break down a large order into small orders and place them in the market over time. The choice of algorithm depends on various factors, with the most important being volatility and liquidity of the stock. For example, for a highly liquid stock, matching a certain percentage of the overall orders of stock (called volume inline algorithms) is usually a good strategy, but for a highly illiquid stock, algorithms try to match every order that has a favorable price (called liquidity-seeking algorithms).
The success of these strategies is usually measured by comparing the average price at which the entire order was executed with the average price achieved through a benchmark execution for the same duration. Usually, the volume-weighted average price is used as the benchmark. At times, the execution price is also compared with the price of the instrument at the time of placing the order.
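A sketch of the benchmark comparison, assuming the execution fills and the market prints over the same interval are available as (price, volume) pairs (the function names are illustrative):

```python
def vwap(trades):
    """Volume-weighted average price of a list of (price, volume) pairs."""
    total_value = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return total_value / total_volume

def slippage_vs_vwap(fills, market_trades):
    """Average fill price of the order minus the market VWAP over the
    same interval; positive means a buy order paid above benchmark."""
    return vwap(fills) - vwap(market_trades)
```

An order filled at an average of 10.60 while the market traded equal volume at 10 and 11 (VWAP 10.50) would show 0.10 of slippage against the benchmark.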
A special class of these algorithms attempts to detect algorithmic or iceberg orders on the other side (i.e. if you are trying to buy, the algorithm will try to detect orders for the sell side). These algorithms are called sniffing algorithms. A typical example is “Stealth”.
Some examples of algorithms are VWAP, TWAP, implementation shortfall, POV, display size, liquidity seeker, and Stealth. Modern algorithms are often optimally constructed via either static or dynamic programming.
Strategies that only pertain to dark pools
Recently, HFT, which comprises a broad set of buy-side as well as market-making sell-side traders, has become more prominent and controversial. These algorithms or techniques are commonly given names such as “Stealth” (developed by Deutsche Bank), “Iceberg”, “Dagger”, “Guerrilla”, “Sniper”, “BASOR” (developed by Quod Financial) and “Sniffer”. Dark pools are alternative trading systems that are private in nature (and thus do not interact with public order flow) and seek instead to provide undisplayed liquidity to large blocks of securities. In dark pools trading takes place anonymously, with most orders hidden or “iceberged”. Gamers or “sharks” sniff out large orders by “pinging” small market orders to buy and sell. When several small orders are filled the sharks may have discovered the presence of a large iceberged order.
“Now it’s an arms race,” said Andrew Lo, director of the Massachusetts Institute of Technology’s Laboratory for Financial Engineering. “Everyone is building more sophisticated algorithms, and the more competition exists, the smaller the profits.”
Strategies designed to generate alpha are considered market timing strategies. These types of strategies are designed using a methodology that includes backtesting, forward testing and live testing. Market timing algorithms will typically use technical indicators such as moving averages but can also include pattern recognition logic implemented using Finite State Machines.
Backtesting the algorithm is typically the first stage and involves simulating the hypothetical trades through an in-sample data period. Optimization is performed to determine the best-performing inputs. Steps taken to reduce the chance of over-optimization can include modifying the inputs +/- 10%, shmooing the inputs in large steps, running Monte Carlo simulations and ensuring slippage and commissions are accounted for.
Forward testing the algorithm is the next stage and involves running the algorithm through an out-of-sample data set to ensure the algorithm performs within backtested expectations.
Live testing is the final stage of development and requires the developer to compare actual live trades with both the backtested and forward tested models. Metrics compared include percent profitable, profit factor, maximum drawdown and average gain per trade.
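The metrics listed above can be computed from a list of per-trade profit-and-loss figures. A minimal sketch (these are generic textbook definitions, not any particular platform's):

```python
def trade_metrics(pnls):
    """Metrics commonly compared across backtest, forward test and
    live trading: percent profitable, profit factor, maximum drawdown
    and average gain per trade.  `pnls` holds per-trade profit/loss,
    already net of slippage and commission."""
    wins = [p for p in pnls if p > 0]
    losses = [p for p in pnls if p < 0]
    equity, peak, max_dd = 0.0, 0.0, 0.0
    for p in pnls:  # walk the equity curve to find the worst peak-to-trough dip
        equity += p
        peak = max(peak, equity)
        max_dd = max(max_dd, peak - equity)
    return {
        "percent_profitable": len(wins) / len(pnls),
        "profit_factor": sum(wins) / -sum(losses) if losses else float("inf"),
        "max_drawdown": max_dd,
        "avg_trade": sum(pnls) / len(pnls),
    }
```

Comparing these numbers across the backtested, forward-tested and live runs is what flags a strategy that only worked in-sample.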
As noted above, high-frequency trading (HFT) is a form of algorithmic trading characterized by high turnover and high order-to-trade ratios. Although there is no single definition of HFT, among its key attributes are highly sophisticated algorithms, specialized order types, co-location, very short-term investment horizons, and high cancellation rates for orders. In the U.S., HFT firms represent 2% of the approximately 20,000 firms operating today, but account for 73% of all equity trading volume. As of the first quarter of 2009, total assets under management for hedge funds with HFT strategies were US$141 billion, down about 21% from their high. The HFT strategy was first made successful by Renaissance Technologies.
High-frequency funds started to become especially popular in 2007 and 2008. Many HFT firms are market makers and provide liquidity to the market, which has lowered volatility and helped narrow bid-offer spreads, making trading and investing cheaper for other market participants. HFT has been a subject of intense public focus since the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission stated that both algorithmic trading and HFT contributed to volatility in the 2010 Flash Crash. Among the major U.S. high-frequency trading firms are Chicago Trading, Virtu Financial, Timber Hill, ATD, GETCO, and Citadel LLC.
There are four key categories of HFT strategies: market-making based on order flow, market-making based on tick data information, event arbitrage and statistical arbitrage. All portfolio-allocation decisions are made by computerized quantitative models. The success of computerized strategies is largely driven by their ability to simultaneously process volumes of information, something ordinary human traders cannot do.
Market making involves placing a limit order to sell (or offer) above the current market price or a buy limit order (or bid) below the current price on a regular and continuous basis to capture the bid-ask spread. Automated Trading Desk, which was bought by Citigroup in July 2007, has been an active market maker, accounting for about 6% of total volume on both NASDAQ and the New York Stock Exchange.
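A minimal sketch of the quoting logic, with an illustrative half-spread and tick size (not a real exchange API):

```python
def make_quotes(mid_price, half_spread, tick=0.01):
    """Post a bid below and an offer above the current price; the
    market maker earns the spread when both sides fill.  Prices are
    rounded to the exchange tick size.  Parameters are illustrative."""
    bid = round((mid_price - half_spread) / tick) * tick
    ask = round((mid_price + half_spread) / tick) * tick
    return round(bid, 2), round(ask, 2)
```

Quoting around a $20.00 price with a 5-cent half-spread yields a $19.95 bid and a $20.05 offer; a fill on each side captures the 10-cent spread.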
Another set of HFT strategies is the classical arbitrage strategy, which might involve several securities, such as covered interest rate parity in the foreign exchange market, which gives a relation between the prices of a domestic bond, a bond denominated in a foreign currency, the spot price of the currency, and the price of a forward contract on the currency. If the market prices are sufficiently different from those implied in the model to cover transaction costs, then four transactions can be made to guarantee a risk-free profit. HFT allows similar arbitrages using models of greater complexity involving many more than four securities. The TABB Group estimates that annual aggregate profits of low-latency arbitrage strategies currently exceed US$21 billion.
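The covered interest rate parity relation can be sketched as a direct check; the function names and the flat transaction-cost threshold are illustrative:

```python
def cip_forward(spot, r_domestic, r_foreign, years=1.0):
    """Forward FX rate implied by covered interest rate parity,
    quoted as domestic currency per unit of foreign currency."""
    return spot * ((1 + r_domestic) ** years) / ((1 + r_foreign) ** years)

def cip_arbitrage(spot, forward_quote, r_domestic, r_foreign,
                  transaction_cost=0.0, years=1.0):
    """True if the quoted forward deviates from the parity forward by
    more than the round-trip transaction cost (toy check only)."""
    fair = cip_forward(spot, r_domestic, r_foreign, years)
    return abs(forward_quote - fair) > transaction_cost
```

If the quoted forward sits far enough from the parity value to cover costs, the four legs (borrow, convert at spot, invest, convert back forward) lock in a risk-free profit.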
A wide range of statistical arbitrage strategies have been developed whereby trading decisions are made on the basis of deviations from statistically significant relationships. Like market-making strategies, statistical arbitrage can be applied in all asset classes.
A subset of risk, merger, convertible, or distressed securities arbitrage that counts on a specific event, such as a contract signing, regulatory approval, judicial decision, etc., to change the price or rate relationship of two or more financial instruments and permit the arbitrageur to earn a profit.
Merger arbitrage also called risk arbitrage would be an example of this. Merger arbitrage generally consists of buying the stock of a company that is the target of a takeover while shorting the stock of the acquiring company. Usually the market price of the target company is less than the price offered by the acquiring company. The spread between these two prices depends mainly on the probability and the timing of the takeover being completed as well as the prevailing level of interest rates. The bet in a merger arbitrage is that such a spread will eventually be zero, if and when the takeover is completed. The risk is that the deal “breaks” and the spread massively widens.
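The merger-arbitrage spread and a crude annualized return can be computed directly. A sketch that ignores the short leg, financing and deal-break risk:

```python
def merger_arb_spread(target_price, offer_price):
    """Gross spread captured if the takeover completes at the offer price."""
    return offer_price - target_price

def annualized_return(target_price, offer_price, days_to_close):
    """Simple annualized return of buying the target and waiting for
    the deal to close; ignores the short leg and the risk of a break."""
    return (offer_price / target_price - 1) * 365 / days_to_close
```

A target trading at $48 against a $50 offer carries a $2 spread; whether that is attractive depends on the expected time to close and the probability of completion, as the passage above notes.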
One strategy that some traders have employed, which has been proscribed yet likely continues, is called spoofing. It is the act of placing orders to give the impression of wanting to buy or sell shares without any intention of letting the order execute, in order to temporarily manipulate the market and buy or sell shares at a more favorable price. This is done by creating limit orders outside the current bid or ask price to change the reported price to other market participants. The trader can then place trades based on the artificial change in price, canceling the limit orders before they are executed.
Suppose a trader desires to sell shares of a company with a current bid of $20 and a current ask of $20.20. The trader would place a buy order at $20.10, still some distance from the ask so it will not be executed, and the $20.10 bid is reported as the National Best Bid and Offer best bid price. The trader then executes a market order for the sale of the shares they wished to sell. Because the best bid price is the trader’s artificial bid, a market maker fills the sale order at $20.10, allowing for a $.10 higher sale price per share. The trader subsequently cancels the limit order on the purchase they never intended to complete.
Quote stuffing is a tactic employed by malicious traders that involves quickly entering and withdrawing large quantities of orders in an attempt to flood the market, thereby gaining an advantage over slower market participants. The rapidly placed and canceled orders cause market data feeds that ordinary investors rely on to delay price quotes while the stuffing is occurring. HFT firms benefit from proprietary, higher-capacity feeds and the most capable, lowest-latency infrastructure. Researchers showed that high-frequency traders are able to profit from the artificially induced latencies and arbitrage opportunities that result from quote stuffing.
Low latency trading systems
Network-induced latency, a synonym for delay, measured in one-way delay or round-trip time, is normally defined as how much time it takes for a data packet to travel from one point to another. Low latency trading refers to the algorithmic trading systems and network routes used by financial institutions connecting to stock exchanges and electronic communication networks (ECNs) to rapidly execute financial transactions. Most HFT firms depend on low latency execution of their trading strategies. Joel Hasbrouck and Gideon Saar (2013) measure latency based on three components: the time it takes for (1) information to reach the trader, (2) the trader’s algorithms to analyze the information, and (3) the generated action to reach the exchange and get implemented. In a contemporary electronic market (circa 2009), low latency trade processing time was qualified as under 10 milliseconds, and ultra-low latency as under 1 millisecond.
Low-latency traders depend on ultra-low latency networks. They profit by providing information, such as competing bids and offers, to their algorithms microseconds faster than their competitors. The revolutionary advance in speed has led to the need for firms to have a real-time, colocated trading platform to benefit from implementing high-frequency strategies. Strategies are constantly altered to reflect the subtle changes in the market as well as to combat the threat of the strategy being reverse engineered by competitors. This is due to the evolutionary nature of algorithmic trading strategies – they must be able to adapt and trade intelligently, regardless of market conditions, which involves being flexible enough to withstand a vast array of market scenarios. As a result, a significant proportion of net revenue from firms is spent on the R&D of these autonomous trading systems.
Most of the algorithmic strategies are implemented using modern programming languages, although some still implement strategies designed in spreadsheets. Increasingly, the algorithms used by large brokerages and asset managers are written to the FIX Protocol’s Algorithmic Trading Definition Language (FIXatdl), which allows firms receiving orders to specify exactly how their electronic orders should be expressed. Orders built using FIXatdl can then be transmitted from traders’ systems via the FIX Protocol. Basic models can rely on as little as a linear regression, while more complex game-theoretic and pattern recognition or predictive models can also be used to initiate trading. More complex methods such as Markov chain Monte Carlo have been used to create these models.
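As an example of the "as little as a linear regression" end of the spectrum, a closed-form ordinary-least-squares trend slope can serve as a crude signal (illustrative only, not a recommended model):

```python
def linear_trend(prices):
    """Slope of an ordinary-least-squares line fitted through the
    price series (x = 0, 1, 2, ...); a positive slope is a crude
    'trending up' signal, a negative slope 'trending down'."""
    n = len(prices)
    x_mean = (n - 1) / 2
    y_mean = sum(prices) / n
    cov = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(prices))
    var = sum((x - x_mean) ** 2 for x in range(n))
    return cov / var
```

More complex game-theoretic, pattern-recognition or Markov chain Monte Carlo models replace this slope with far richer state, but the input/output shape (prices in, a trading signal out) is the same.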
Issues and developments
Algorithmic trading has been shown to substantially improve market liquidity among other benefits. However, improvements in productivity brought by algorithmic trading have been opposed by human brokers and traders facing stiff competition from computers.
Technological advances in finance, particularly those relating to algorithmic trading, have increased financial speed, connectivity, reach, and complexity while simultaneously reducing its humanity. Computers running software based on complex algorithms have replaced humans in many functions in the financial industry. Finance is essentially becoming an industry where machines and humans share the dominant roles – transforming modern finance into what one scholar has called “cyborg finance”.
While many experts laud the benefits of innovation in computerized algorithmic trading, other analysts have expressed concern with specific aspects of computerized trading.
“The downside with these systems is their black box-ness,” Mr. Williams said. “Traders have intuitive senses of how the world works. But with these systems you pour in a bunch of numbers, and something comes out the other end, and it’s not always intuitive or clear why the black box latched onto certain data or relationships.”
“The Financial Services Authority has been keeping a watchful eye on the development of black box trading. In its annual report the regulator remarked on the great benefits of efficiency that new technology is bringing to the market. But it also pointed out that ‘greater reliance on sophisticated technology and modelling brings with it a greater risk that systems failure can result in business interruption’.”
UK Treasury minister Lord Myners has warned that companies could become the “playthings” of speculators because of automatic high-frequency trading. Lord Myners said the process risked destroying the relationship between an investor and a company.
Other issues include the technical problem of latency or the delay in getting quotes to traders, security and the possibility of a complete system breakdown leading to a market crash.
“Goldman spends tens of millions of dollars on this stuff. They have more people working in their technology area than people on the trading desk…The nature of the markets has changed dramatically.”
On August 1, 2012 Knight Capital Group experienced a technology issue in their automated trading system, causing a loss of $440 million.
This issue was related to Knight’s installation of trading software and resulted in Knight sending numerous erroneous orders in NYSE-listed securities into the market. This software has been removed from the company’s systems. … Clients were not negatively affected by the erroneous orders, and the software issue was limited to the routing of certain listed stocks to NYSE. Knight has traded out of its entire erroneous trade position, which has resulted in a realized pre-tax loss of approximately $440 million.
Algorithmic and high-frequency trading were shown to have contributed to volatility during the May 6, 2010 Flash Crash, when the Dow Jones Industrial Average plunged about 600 points only to recover those losses within minutes. At the time, it was the second largest point swing, 1,010.14 points, and the biggest one-day point decline, 998.5 points, on an intraday basis in Dow Jones Industrial Average history.
Financial market news is now being formatted by firms such as Need To Know News, Thomson Reuters, Dow Jones, and Bloomberg, to be read and traded on via algorithms.
“Computers are now being used to generate news stories about company earnings results or economic statistics as they are released. And this almost instantaneous information forms a direct feed into other computers which trade on the news.”
The algorithms do not simply trade on simple news stories but also interpret more difficult to understand news. Some firms are also attempting to automatically assign sentiment (deciding if the news is good or bad) to news stories so that automated trading can work directly on the news story.
“Increasingly, people are looking at all forms of news and building their own indicators around it in a semi-structured way,” as they constantly seek out new trading advantages said Rob Passarella, global director of strategy at Dow Jones Enterprise Media Group. His firm provides both a low latency news feed and news analytics for traders. Passarella also pointed to new academic research being conducted on the degree to which frequent Google searches on various stocks can serve as trading indicators, the potential impact of various phrases and words that may appear in Securities and Exchange Commission statements and the latest wave of online communities devoted to stock trading topics.
“Markets are by their very nature conversations, having grown out of coffee houses and taverns,” he said. So the way conversations get created in a digital society will be used to convert news into trades, as well, Passarella said.
“There is a real interest in moving the process of interpreting news from the humans to the machines” says Kirsti Suutari, global business manager of algorithmic trading at Reuters. “More of our customers are finding ways to use news content to make money.”
An example of the importance of news reporting speed to algorithmic traders was an advertising campaign by Dow Jones (appearances included page W15 of The Wall Street Journal, on March 1, 2008) claiming that their service had beaten other news services by two seconds in reporting an interest rate cut by the Bank of England.
In July 2007, Citigroup, which had already developed its own trading algorithms, paid $680 million for Automated Trading Desk, a 19-year-old firm that trades about 200 million shares a day. Citigroup had previously bought Lava Trading and OnTrade Inc.
In late 2010, The UK Government Office for Science initiated a Foresight project investigating the future of computer trading in the financial markets, led by Dame Clara Furse, ex-CEO of the London Stock Exchange and in September 2011 the project published its initial findings in the form of a three-chapter working paper available in three languages, along with 16 additional papers that provide supporting evidence. All of these findings are authored or co-authored by leading academics and practitioners, and were subjected to anonymous peer-review. Released in 2012, the Foresight study acknowledged issues related to periodic illiquidity, new forms of manipulation and potential threats to market stability due to errant algorithms or excessive message traffic. However, the report was also criticized for adopting “standard pro-HFT arguments” and advisory panel members being linked to the HFT industry.
A traditional trading system consists primarily of two blocks – one that receives the market data and one that sends the order request to the exchange. However, an algorithmic trading system can be broken down into three parts:
Exchange(s) provide data to the system, which typically consists of the latest order book, traded volumes, and the last traded price (LTP) of a scrip. The server in turn receives the data and simultaneously acts as a store for the historical database. The data is analyzed on the application side, where trading strategies are fed in by the user and can be viewed on the GUI. Once an order is generated, it is sent to the order management system (OMS), which in turn transmits it to the exchange.
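The three-part flow described above (market data in, a user-supplied strategy in the application layer, orders out through the OMS) can be sketched as follows; all names and the toy threshold rule are illustrative, not a real exchange or OMS API:

```python
def strategy(tick, threshold=100.0):
    """Toy strategy: buy when the last traded price drops below a threshold."""
    if tick["ltp"] < threshold:
        return {"side": "buy", "qty": 10, "symbol": tick["symbol"]}
    return None

def order_management_system(order, sent):
    """Stand-in OMS: record and 'transmit' the order (here, append to a list)."""
    sent.append(order)

def on_market_data(tick, sent):
    """Application side: run the strategy on each tick, route any order."""
    order = strategy(tick)
    if order is not None:
        order_management_system(order, sent)
```

In a real system the tick handler would be fed by the exchange data feed and the OMS would speak FIX to the exchange, but the division of responsibilities is the same.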
Gradually, the old-school, high-latency architecture of algorithmic systems is being replaced by newer, state-of-the-art, low-latency network infrastructure. The complex event processing (CEP) engine, which is the heart of decision making in algo-based trading systems, is used for order routing and risk management.
With the emergence of the FIX (Financial Information Exchange) protocol, the connection to different destinations has become easier and the time to market has been reduced when it comes to connecting with a new destination. With the standard protocol in place, integration of third-party vendors for data feeds is no longer cumbersome.
Though its development may have been prompted by decreasing trade sizes caused by decimalization, algorithmic trading has reduced trade sizes further. Jobs once done by human traders are being switched to computers. The speeds of computer connections, measured in milliseconds and even microseconds, have become very important.
More fully automated markets such as NASDAQ, Direct Edge and BATS (formerly an acronym for Better Alternative Trading System) in the US, have gained market share from less automated markets such as the NYSE. Economies of scale in electronic trading have contributed to lowering commissions and trade processing fees, and contributed to international mergers and consolidation of financial exchanges.
Competition is developing among exchanges for the fastest processing times for completing trades. For example, in June 2007, the London Stock Exchange launched a new system called TradElect that promises an average 10 millisecond turnaround time from placing an order to final confirmation and can process 3,000 orders per second. Since then, competitive exchanges have continued to reduce latency with turnaround times of 3 milliseconds available. This is of great importance to high-frequency traders, because they have to attempt to pinpoint the consistent and probable performance ranges of given financial instruments. These professionals are often dealing in versions of stock index funds like the E-mini S&Ps, because they seek consistency and risk-mitigation along with top performance. They must filter market data to work into their software programming so that there is the lowest latency and highest liquidity at the time for placing stop-losses and/or taking profits. With high volatility in these markets, this becomes a complex and potentially nerve-wracking endeavor, where a small mistake can lead to a large loss. Absolute frequency data play into the development of the trader’s pre-programmed instructions.
In the U.S., spending on computers and software in the financial industry increased to $26.4 billion in 2005.
Algorithmic trading has caused a shift in the types of employees working in the financial industry. For example, many physicists have entered the financial industry as quantitative analysts. Some physicists have even begun to do research in economics as part of doctoral research. This interdisciplinary movement is sometimes called econophysics. Some researchers also cite a “cultural divide” between employees of firms primarily engaged in algorithmic trading and traditional investment managers. Algorithmic trading has encouraged an increased focus on data and a decreased emphasis on sell-side research.
Algorithmic trades require communicating considerably more parameters than traditional market and limit orders. A trader on one end (the “buy side”) must enable their trading system (often called an “order management system” or “execution management system”) to understand a constantly proliferating flow of new algorithmic order types. The R&D and other costs to construct complex new algorithmic order types, along with the execution infrastructure and the marketing costs to distribute them, are fairly substantial. What was needed was a way for marketers (the “sell side”) to express algo orders electronically such that buy-side traders could just drop the new order types into their system and be ready to trade them without constantly coding custom new order entry screens each time.
FIX Protocol is a trade association that publishes free, open standards in the securities trading area. The FIX language was originally created by Fidelity Investments, and the association’s members include virtually all large and many midsized and smaller broker-dealers, money center banks, institutional investors, mutual funds, etc. This institution dominates standard setting in the pretrade and trade areas of security transactions. In 2006–2007, several members got together and published a draft XML standard for expressing algorithmic order types. The standard is called FIX Algorithmic Trading Definition Language (FIXatdl).
Key words to search
2010 Flash Crash
Alternative trading system
Complex event processing
Electronic trading platform
Day trading software is computer software intended to facilitate day trading of stocks or other financial instruments.
1 Types of software
1.1 Important Data
1.2 Charting
1.3 Trade Execution
2 Key words to search
Types of software
Day trading software falls into three main categories: data, charting, and trade execution.
A day trader needs to know the prices of the stocks, futures, or currencies that they want to trade. In the case of stocks and futures, those prices come from the exchange where they are traded. Forex is a little different, as there is no central exchange.
The vast majority of day traders will chart prices in some kind of charting software. Many charting vendors also supply data feeds.
Charting packages all tend to offer the same basic technical analysis indicators. Advanced packages often include a complete programming language for creating more indicators, or testing different trading strategies.
Once traders have their data and can see and analyze it on a chart, they will at some point want to place a trade. To do so, they need to use some kind of trade execution software or electronic trading platform. Many trade execution packages allow advanced traders to develop their own trading strategies by using an application programming interface.
Most stock brokerage firms will provide proprietary software linked directly to their in-house systems, but many third-party applications are available through independent software vendors. The advantage of third-party programs is that they allow the trader to trade through different brokers whilst retaining the same interface. They may also offer a number of advanced features such as automatic trade execution.
Key words to search
Electronic trading platform
Extended hours trading
Reuters 3000 Xtra
Technical analysis software
In finance, technical analysis is an analysis methodology for forecasting the direction of prices through the study of past market data, primarily price and volume. Behavioral economics and quantitative analysis use many of the same tools of technical analysis, which, being an aspect of active management, stands in contradiction to much of modern portfolio theory. The efficacy of both technical and fundamental analysis is disputed by the efficient-market hypothesis, which states that stock market prices are essentially unpredictable.
2 General description
4.1 Market action discounts everything
4.2 Prices move in trends
4.3 History tends to repeat itself
6 Systematic trading
6.1 Neural networks
7 Combination with other market forecast methods
8 Empirical evidence
8.1 Efficient-market hypothesis
8.1.1 Random walk hypothesis
9 Scientific technical analysis
10 Ticker-tape reading
11 Quotation board
12 Charting terms and indicators
12.2 Types of charts
12.4 Breadth indicators
12.5 Price-based indicators
12.6 Volume-based indicators
12.7 Trading with Mixing Indicators
13 Key words to search
The principles of technical analysis are derived from hundreds of years of financial market data. Some aspects of technical analysis began to appear in Amsterdam-based merchant Joseph de la Vega’s accounts of the Dutch financial markets in the 17th century. In Asia, technical analysis is said to be a method developed by Homma Munehisa during the early 18th century which evolved into the use of candlestick techniques, and is today a technical analysis charting tool. In the 1920s and 1930s, Richard W. Schabacker published several books which continued the work of Charles Dow and William Peter Hamilton in their books Stock Market Theory and Practice and Technical Market Analysis. In 1948, Robert D. Edwards and John Magee published Technical Analysis of Stock Trends which is widely considered to be one of the seminal works of the discipline. It is exclusively concerned with trend analysis and chart patterns and remains in use to the present. Early technical analysis was almost exclusively the analysis of charts because the processing power of computers was not available for the modern degree of statistical analysis. Charles Dow reportedly originated a form of point and figure chart analysis. With the emergence of behavioural finance as a separate discipline in economics, Paul V. Azzopardi combined technical analysis with behavioural finance and coined the term “Behavioural Technical Analysis”.
Dow theory is based on the collected writings of Dow Jones co-founder and editor Charles Dow, and inspired the use and development of modern technical analysis at the end of the 19th century. Other pioneers of analysis techniques include Ralph Nelson Elliott, William Delbert Gann and Richard Wyckoff who developed their respective techniques in the early 20th century. More technical tools and theories have been developed and enhanced in recent decades, with an increasing emphasis on computer-assisted techniques using specially designed computer software.
Fundamental analysts examine earnings, dividends, assets, quality, ratio, new products, research and the like. Technicians employ many methods, tools and techniques as well, one of which is the use of charts. Using charts, technical analysts seek to identify price patterns and market trends in financial markets and attempt to exploit those patterns.
Technicians using charts search for archetypal price chart patterns, such as the well-known head and shoulders or double top/bottom reversal patterns, study technical indicators, moving averages, and look for forms such as lines of support, resistance, channels, and more obscure formations such as flags, pennants, balance days and cup and handle patterns.
Technical analysts also widely use market indicators of many sorts, some of which are mathematical transformations of price, often including up and down volume, advance/decline data and other inputs. These indicators are used to help assess whether an asset is trending, and if it is, the probability of its direction and of continuation. Technicians also look for relationships between price/volume indices and market indicators. Examples include the moving average, relative strength index, and MACD. Other avenues of study include correlations between changes in Options (implied volatility) and put/call ratios with price. Also important are sentiment indicators such as Put/Call ratios, bull/bear ratios, short interest, Implied Volatility, etc.
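As a rough illustration of how such indicators are computed, the sketch below implements a simple moving average, a basic (non-smoothed) form of the relative strength index, and the MACD line. The function names and default window lengths are illustrative conventions, not a canonical specification.

```python
def sma(prices, n):
    """Simple moving average over a sliding window of n prices."""
    return [sum(prices[i - n + 1:i + 1]) / n for i in range(n - 1, len(prices))]

def ema(prices, n):
    """Exponential moving average with the usual smoothing factor 2/(n+1)."""
    k = 2 / (n + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

def rsi(prices, n=14):
    """RSI = 100 - 100/(1+RS), where RS = average gain / average loss
    over the first n price changes (simple averages, not Wilder smoothing)."""
    gains, losses = [], []
    for prev, cur in zip(prices, prices[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    avg_gain = sum(gains[:n]) / n
    avg_loss = sum(losses[:n]) / n
    if avg_loss == 0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)

def macd(prices, fast=12, slow=26):
    """MACD line: fast EMA minus slow EMA at each point."""
    f, s = ema(prices, fast), ema(prices, slow)
    return [a - b for a, b in zip(f, s)]
```

On a steadily rising series the RSI saturates at 100 (no losses), while an evenly alternating series yields a neutral reading of 50, matching the indicator's intended interpretation.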
There are many techniques in technical analysis. Adherents of different techniques (for example: Candlestick analysis, the oldest form of technical analysis developed by a Japanese grain trader; Harmonics; Dow theory; and Elliott wave theory) may ignore the other approaches, yet many traders combine elements from more than one technique. Some technical analysts use subjective judgment to decide which pattern(s) a particular instrument reflects at a given time and what the interpretation of that pattern should be. Others employ a strictly mechanical or systematic approach to pattern identification and interpretation.
Contrasting with technical analysis is fundamental analysis, the study of economic factors that influence the way investors price financial markets. Technical analysis holds that prices already reflect all the underlying fundamental factors. Uncovering the trends is what technical indicators are designed to do, although neither technical nor fundamental indicators are perfect. Some traders use technical or fundamental analysis exclusively, while others use both types to make trading decisions.
Technical analysis employs models and trading rules based on price and volume transformations, such as the relative strength index, moving averages, regressions, inter-market and intra-market price correlations, business cycles, stock market cycles or, classically, through recognition of chart patterns.
Technical analysis stands in contrast to the fundamental analysis approach to security and stock analysis. In the fundamental equation M = P/E, technical analysis is the examination of M (the multiple), which encompasses the psychology generally abounding, i.e. the extent of willingness to buy/sell. Also in M is the ability to pay: for instance, a spent-out bull can't make the market go higher and a well-heeled bear won't. Technical analysis examines price, volume, psychology, money flow and other market information, whereas fundamental analysis looks at the facts of the company, market, currency or commodity. Most large brokerages, trading groups, and financial institutions will typically have both a technical analysis and a fundamental analysis team.
In the 1960s and 1970s, technical analysis was widely dismissed by academics. In a recent review, Irwin and Park reported that 56 of 95 modern studies found that it produces positive results, but noted that many of the positive results were rendered dubious by issues such as data snooping, so that the evidence in support of technical analysis was inconclusive; it is still considered by many academics to be pseudoscience. Academics such as Eugene Fama say the evidence for technical analysis is sparse and is inconsistent with the weak form of the efficient-market hypothesis. Users hold that even if technical analysis cannot predict the future, it helps to identify trends, tendencies, and trading opportunities.
While some isolated studies have indicated that technical trading rules might lead to consistent returns in the period prior to 1987, most academic work has focused on the anomalous position of the foreign exchange market. It is speculated that this anomaly is due to central bank intervention, which technical analysis is not designed to predict. Recent research suggests that combining various trading signals into a Combined Signal Approach may increase profitability and reduce dependence on any single rule.
Figure: stock chart showing levels of support (4, 5, 6, 7, and 8) and resistance (1, 2, and 3); levels of resistance tend to become levels of support and vice versa.
A core principle of technical analysis is that a market’s price reflects all relevant information impacting that market. A technical analyst therefore looks at the history of a security or commodity’s trading pattern rather than external drivers such as economic, fundamental and news events. It is believed that price action tends to repeat itself due to the collective, patterned behavior of investors. Hence technical analysis focuses on identifiable price trends and conditions.
Market action discounts everything
Based on the premise that all relevant information is already reflected by prices, technical analysts believe it is important to understand what investors think of that information, known and perceived.
Prices move in trends
See also: Market trend
Technical analysts believe that prices trend directionally, i.e., up, down, or sideways (flat) or some combination. The basic definition of a price trend was originally put forward by Dow theory.
An example of a security with an apparent trend is AOL from November 2001 through August 2002. A technical analyst or trend follower recognizing this trend would look for opportunities to sell the security. AOL consistently moved downward in price. Each time the stock rose, sellers would enter the market and sell the stock; hence the "zig-zag" movement in the price. The series of "lower highs" and "lower lows" is a telltale sign of a stock in a downtrend. In other words, each time the stock moved lower, it fell below its previous relative low price. Each time the stock moved higher, it could not reach the level of its previous relative high price.
Note that the sequence of lower lows and lower highs did not begin until August. Then AOL makes a low price that does not pierce the relative low set earlier in the month. Later in the same month, the stock makes a relative high equal to the most recent relative high. In this a technician sees strong indications that the down trend is at least pausing and possibly ending, and would likely stop actively selling the stock at that point.
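The "lower highs and lower lows" test described above can be mechanized in a rough way: find the swing highs and lows of a price series and check that each is below the previous one of its kind. This is a simplified sketch using single-bar local extrema, not a standard indicator; real trend-following systems define swings more robustly.

```python
def swings(prices):
    """Return local maxima ('H') and minima ('L') as (index, price, kind)."""
    out = []
    for i in range(1, len(prices) - 1):
        if prices[i] > prices[i - 1] and prices[i] > prices[i + 1]:
            out.append((i, prices[i], "H"))
        elif prices[i] < prices[i - 1] and prices[i] < prices[i + 1]:
            out.append((i, prices[i], "L"))
    return out

def is_downtrend(prices):
    """Downtrend if every swing high and every swing low is below
    the previous swing of the same kind (lower highs, lower lows)."""
    highs = [p for _, p, k in swings(prices) if k == "H"]
    lows = [p for _, p, k in swings(prices) if k == "L"]
    lower_highs = all(b < a for a, b in zip(highs, highs[1:]))
    lower_lows = all(b < a for a, b in zip(lows, lows[1:]))
    return len(highs) >= 2 and len(lows) >= 2 and lower_highs and lower_lows
```

A zig-zag series like 10, 8, 9, 6, 7, 4, 5, 2 satisfies the test, while the mirror-image uptrend fails it, mirroring the AOL narrative above.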
History tends to repeat itself
Technical analysts believe that investors collectively repeat the behavior of the investors that preceded them. To a technician, the emotions in the market may be irrational, but they exist. Because investor behavior repeats itself so often, technicians believe that recognizable (and predictable) price patterns will develop on a chart. Recognition of these patterns can allow the technician to select trades that have a higher probability of success.
Technical analysis is not limited to charting, but it always considers price trends. For example, many technicians monitor surveys of investor sentiment. These surveys gauge the attitude of market participants, specifically whether they are bearish or bullish. Technicians use these surveys to help determine whether a trend will continue or if a reversal could develop; they are most likely to anticipate a change when the surveys report extreme investor sentiment. Surveys that show overwhelming bullishness, for example, are evidence that an uptrend may reverse; the premise being that if most investors are bullish they have already bought the market (anticipating higher prices). And because most investors are bullish and invested, one assumes that few buyers remain. This leaves more potential sellers than buyers, despite the bullish sentiment. This suggests that prices will trend down, and is an example of contrarian trading.
The industry is globally represented by the International Federation of Technical Analysts (IFTA), a federation of regional and national organizations. In the United States, the industry is represented by both the CMT Association and the American Association of Professional Technical Analysts (AAPTA). The United States is also represented by the Technical Security Analysts Association of San Francisco (TSAASF). In the United Kingdom, the industry is represented by the Society of Technical Analysts (STA). The STA was a founding member of IFTA, has recently celebrated its 50th anniversary, and certifies analysts with the Diploma in Technical Analysis. In Canada the industry is represented by the Canadian Society of Technical Analysts. In Australia, the industry is represented by the Australian Technical Analysts Association (ATAA), which is affiliated with IFTA, and the Australian Professional Technical Analysts (APTA) Inc.
Professional technical analysis societies have worked on creating a body of knowledge that describes the field of Technical Analysis. A body of knowledge is central to the field as a way of defining how and why technical analysis may work. It can then be used by academia, as well as regulatory bodies, in developing proper research and standards for the field. The CMT Association has published a body of knowledge, which is the structure for the Chartered Market Technician (CMT) exam.
Technical analysis software automates the charting, analysis and reporting functions that support technical analysts in their review and prediction of financial markets (e.g. the stock market).
Since the early 1990s when the first practically usable types emerged, artificial neural networks (ANNs) have rapidly grown in popularity. They are artificial intelligence adaptive software systems that have been inspired by how biological neural networks work. They are used because they can learn to detect complex patterns in data. In mathematical terms, they are universal function approximators, meaning that given the right data and configured correctly, they can capture and model any input-output relationships. This not only removes the need for human interpretation of charts or the series of rules for generating entry/exit signals, but also provides a bridge to fundamental analysis, as the variables used in fundamental analysis can be used as input.
As ANNs are essentially non-linear statistical models, their accuracy and prediction capabilities can be both mathematically and empirically tested. In various studies, authors have claimed that neural networks used for generating trading signals given various technical and fundamental inputs have significantly outperformed buy-hold strategies as well as traditional linear technical analysis methods when combined with rule-based expert systems.
While the advanced mathematical nature of such adaptive systems has kept neural networks for financial analysis mostly within academic research circles, in recent years more user friendly neural network software has made the technology more accessible to traders. However, large-scale application is problematic because of the problem of matching the correct neural topology to the market being studied.
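To make the universal-approximation point concrete, the sketch below trains a deliberately tiny one-hidden-layer network by plain gradient descent on a toy nonlinear rule ("act when exactly one of two indicator inputs fires"). It is a didactic illustration, not a trading-grade model, and the architecture and learning rate are arbitrary choices.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """One hidden layer, sigmoid activations, squared-error loss."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [rng.uniform(-1, 1) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        return sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)

    def train_step(self, x, y, lr=0.5):
        """One stochastic gradient-descent update; returns squared error."""
        out = self.forward(x)
        d_out = (out - y) * out * (1 - out)   # gradient at the output unit
        for j, h in enumerate(self.h):
            d_h = d_out * self.w2[j] * h * (1 - h)  # uses pre-update weight
            self.w2[j] -= lr * d_out * h
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * d_h * xi
            self.b1[j] -= lr * d_h
        self.b2 -= lr * d_out
        return (out - y) ** 2
```

Training this network on the four input/target pairs of the toy rule drives the loss well below its initial value, which is the sense in which the network "learns to detect" a pattern no single linear rule captures.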
Systematic trading is most often employed after testing an investment strategy on historic data. This is known as backtesting. Backtesting is most often performed for technical indicators, but can be applied to most investment strategies (e.g. fundamental analysis). While traditional backtesting was done by hand, it was usually performed only on human-selected stocks, and was thus prone to selection bias from prior knowledge of the stocks chosen. With the advent of computers, backtesting can be performed on entire exchanges over decades of historic data in very short amounts of time.
The use of computers does have its drawbacks, being limited to algorithms that a computer can perform. Several trading strategies rely on human interpretation, and are unsuitable for computer processing. Only technical indicators which are entirely algorithmic can be programmed for computerized automated backtesting.
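A fully algorithmic rule such as a moving-average crossover is exactly the kind of strategy that can be backtested automatically. The sketch below simulates going long whenever a fast moving average is above a slow one, and flat otherwise; it ignores transaction costs, slippage and dividends, and the window lengths are arbitrary illustrations.

```python
def backtest_ma_cross(prices, fast=5, slow=20):
    """Bare-bones historical simulation of a moving-average crossover.

    Long when the fast MA exceeds the slow MA, flat otherwise.
    Returns the total return over the simulated period.
    """
    equity = 1.0
    position = 0  # 0 = flat, 1 = long
    for t in range(slow, len(prices)):
        fast_ma = sum(prices[t - fast:t]) / fast
        slow_ma = sum(prices[t - slow:t]) / slow
        if position:
            equity *= prices[t] / prices[t - 1]   # earn the day's return
        position = 1 if fast_ma > slow_ma else 0  # signal for the next bar
    return equity - 1.0
```

Note that the signal at time t is computed only from prices before t, avoiding the look-ahead bias that plagues naive backtests.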
Combination with other market forecast methods
John Murphy states that the principal sources of information available to technicians are price, volume and open interest. Other data, such as indicators and sentiment analysis, are considered secondary.
However, many technical analysts reach outside pure technical analysis, combining other market forecast methods with their technical work. One advocate for this approach is John Bollinger, who coined the term rational analysis in the middle 1980s for the intersection of technical analysis and fundamental analysis. Another such approach, fusion analysis, overlays fundamental analysis with technical, in an attempt to improve portfolio manager performance.
Technical analysis is also often combined with quantitative analysis and economics. For example, neural networks may be used to help identify intermarket relationships.
Investor and newsletter polls, and magazine cover sentiment indicators, are also used by technical analysts.
Whether technical analysis actually works is a matter of controversy. Methods vary greatly, and different technical analysts can sometimes make contradictory predictions from the same data. Many investors claim that they experience positive returns, but academic appraisals often find that it has little predictive power. Of 95 modern studies, 56 concluded that technical analysis had positive results, although data-snooping bias and other problems make the analysis difficult. Nonlinear prediction using neural networks occasionally produces statistically significant prediction results. A Federal Reserve working paper regarding support and resistance levels in short-term foreign exchange rates “offers strong evidence that the levels help to predict intraday trend interruptions”, although the “predictive power” of those levels was “found to vary across the exchange rates and firms examined”.
Technical trading strategies were found to be effective in the Chinese marketplace by a recent study that states, “Finally, we find significant positive returns on buy trades generated by the contrarian version of the moving-average crossover rule, the channel breakout rule, and the Bollinger band trading rule, after accounting for transaction costs of 0.50 percent.”
An influential 1992 study by Brock et al., which appeared to find support for technical trading rules, was tested for data snooping and other problems in 1999; the sample covered by Brock et al. was found to be robust to data snooping.
Subsequently, a comprehensive study of the question by Amsterdam economist Gerwin Griffioen concludes that: “for the U.S., Japanese and most Western European stock market indices the recursive out-of-sample forecasting procedure does not show to be profitable, after implementing little transaction costs. Moreover, for sufficiently high transaction costs it is found, by estimating CAPMs, that technical trading shows no statistically significant risk-corrected out-of-sample forecasting power for almost all of the stock market indices.” Transaction costs are particularly applicable to “momentum strategies”; a comprehensive 1996 review of the data and studies concluded that even small transaction costs would lead to an inability to capture any excess from such strategies.
In a paper published in the Journal of Finance, Dr. Andrew W. Lo, director of the MIT Laboratory for Financial Engineering, working with Harry Mamaysky and Jiang Wang, found that:
Technical analysis, also known as “charting”, has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis – the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution – conditioned on specific technical indicators such as head-and-shoulders or double-bottoms – we find that over the 31-year sample period, several technical indicators do provide incremental information and may have some practical value.
In that same paper Dr. Lo wrote that “several academic studies suggest that … technical analysis may well be an effective means for extracting useful information from market prices.” Some techniques such as Drummond Geometry attempt to overcome the past data bias by projecting support and resistance levels from differing time frames into the near-term future and combining that with reversion to the mean techniques.
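The "systematic and automatic" pattern recognition that Lo, Mamaysky and Wang describe starts by smoothing the price series with nonparametric kernel regression and then reading patterns off the smoothed curve's local extrema. A minimal Nadaraya-Watson smoother, broadly of the kind they describe, might look like this; the Gaussian kernel and the bandwidth value are illustrative choices, not the paper's calibrated settings.

```python
import math

def kernel_smooth(prices, bandwidth=2.0):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Each smoothed value is a weighted average of all prices, with
    weights decaying with distance in time from the point t.
    """
    n = len(prices)
    smoothed = []
    for t in range(n):
        weights = [math.exp(-0.5 * ((t - s) / bandwidth) ** 2)
                   for s in range(n)]
        total = sum(weights)
        smoothed.append(sum(w * p for w, p in zip(weights, prices)) / total)
    return smoothed
```

Patterns such as head-and-shoulders are then defined as orderings of consecutive extrema of the smoothed curve, which removes the "eyes of the beholder" subjectivity the quotation complains about.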
The efficient-market hypothesis (EMH) contradicts the basic tenets of technical analysis by stating that past prices cannot be used to profitably predict future prices. Thus it holds that technical analysis cannot be effective. Economist Eugene Fama published the seminal paper on the EMH in the Journal of Finance in 1970, and said “In short, the evidence in support of the efficient markets model is extensive, and (somewhat uniquely in economics) contradictory evidence is sparse.”
Technicians say that EMH ignores the way markets work, in that many investors base their expectations on past earnings or track record, for example. Because future stock prices can be strongly influenced by investor expectations, technicians claim it only follows that past prices influence future prices. They also point to research in the field of behavioral finance, specifically that people are not the rational participants EMH makes them out to be. Technicians have long said that irrational human behavior influences stock prices, and that this behavior leads to predictable outcomes. Author David Aronson says that the theory of behavioral finance blends with the practice of technical analysis:
By considering the impact of emotions, cognitive errors, irrational preferences, and the dynamics of group behavior, behavioral finance offers succinct explanations of excess market volatility as well as the excess returns earned by stale information strategies…. cognitive errors may also explain the existence of market inefficiencies that spawn the systematic price movements that allow objective TA [technical analysis] methods to work.
EMH advocates reply that while individual market participants do not always act rationally (or have complete information), their aggregate decisions balance each other, resulting in a rational outcome (optimists who buy stock and bid the price higher are countered by pessimists who sell their stock, which keeps the price in equilibrium). Likewise, complete information is reflected in the price because all market participants bring their own individual, but incomplete, knowledge together in the market.
Random walk hypothesis
The random walk hypothesis may be derived from the weak-form efficient markets hypothesis, which is based on the assumption that market participants take full account of any information contained in past price movements (but not necessarily other public information). In his book A Random Walk Down Wall Street, Princeton economist Burton Malkiel said that technical forecasting tools such as pattern analysis must ultimately be self-defeating: “The problem is that once such a regularity is known to market participants, people will act in such a way that prevents it from happening in the future.” Malkiel has stated that while momentum may explain some stock price movements, there is not enough momentum to make excess profits. Malkiel has compared technical analysis to “astrology”.
In the late 1980s, professors Andrew Lo and A. Craig MacKinlay published a paper which cast doubt on the random walk hypothesis. In a 1999 response to Malkiel, Lo and MacKinlay collected empirical papers that questioned the hypothesis' applicability and suggested a non-random and possibly predictive component to stock price movement, though they were careful to point out that rejecting random walk does not necessarily invalidate EMH, which is an entirely separate concept from RWH. In a 2000 paper, Andrew Lo back-analyzed data from the U.S. from 1962 to 1996 and found that "several technical indicators do provide incremental information and may have some practical value". Burton Malkiel dismissed the irregularities mentioned by Lo and MacKinlay as being too small to profit from.
Technicians say that the EMH and random walk theories both ignore the realities of markets, in that participants are not completely rational and that current price moves are not independent of previous moves. Some signal processing researchers reject the random walk hypothesis that stock market prices resemble Wiener processes, because the statistical moments of such processes and real stock data vary significantly with respect to window size and similarity measure. They argue that feature transformations used for the description of audio and biosignals can also be used to predict stock market prices successfully, which would contradict the random walk hypothesis.
The random walk index (RWI) is a technical indicator that attempts to determine whether a stock's price movement is random or the result of a statistically significant trend. It does so by measuring the price range over N periods and comparing it with what a random walk (randomly moving up or down) would be expected to produce; a greater range suggests a stronger trend.
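A close-only simplification of this idea compares the observed n-period range with the displacement a random walk of the same one-step volatility would be expected to produce, which grows like the square root of n. This sketch is an interpretation of the indicator's logic, not the exact published high/low formula.

```python
import math
import statistics

def random_walk_index(prices, n):
    """Ratio of the actual n-period price range to the expected
    random-walk displacement (one-step volatility times sqrt(n)).

    Values well above 1 suggest a trend; near or below 1 suggests
    the move is within random-walk expectations.
    """
    window = prices[-(n + 1):]
    steps = [b - a for a, b in zip(window, window[1:])]
    sigma = statistics.pstdev(steps) or 1e-12  # guard against zero volatility
    actual_range = max(window) - min(window)
    return actual_range / (sigma * math.sqrt(n))
```

A steadily trending series produces a large ratio, while a choppy series that merely oscillates scores below 1, matching the interpretation above.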
Applying Daniel Kahneman's Prospect Theory to price movements, Paul V. Azzopardi provided a possible explanation why fear makes prices fall sharply while greed pushes up prices gradually. This commonly observed behaviour of securities prices is sharply at odds with random walk. By gauging greed and fear in the market, investors can better formulate long and short portfolio stances.
Scientific technical analysis
Caginalp and Balenovich in 1994 used their asset-flow differential equations model to show that the major patterns of technical analysis could be generated with some basic assumptions. Some of the patterns, such as a triangle continuation or reversal pattern, can be generated with the assumption of two distinct groups of investors with different assessments of valuation. The major assumptions of the models are the finiteness of assets and the use of trend as well as valuation in decision making. Many of the patterns follow as mathematically logical consequences of these assumptions.
One of the problems with conventional technical analysis has been the difficulty of specifying the patterns in a manner that permits objective testing.
Japanese candlestick patterns involve patterns of a few days that are within an uptrend or downtrend. Caginalp and Laurent were the first to perform a successful large scale test of patterns. A mathematically precise set of criteria were tested by first using a definition of a short-term trend by smoothing the data and allowing for one deviation in the smoothed trend. They then considered eight major three-day candlestick reversal patterns in a non-parametric manner and defined the patterns as a set of inequalities. The results were positive with an overwhelming statistical confidence for each of the patterns using the data set of all S&P 500 stocks daily for the five-year period 1992–1996.
Among the most basic ideas of conventional technical analysis is that a trend, once established, tends to continue. However, testing for this trend has often led researchers to conclude that stocks are a random walk. One study, performed by Poterba and Summers, found a small trend effect that was too small to be of trading value. As Fisher Black noted, “noise” in trading price data makes it difficult to test hypotheses.
One method for avoiding this noise was discovered in 1995 by Caginalp and Constantine who used a ratio of two essentially identical closed-end funds to eliminate any changes in valuation. A closed-end fund (unlike an open-end fund) trades independently of its net asset value and its shares cannot be redeemed, but only traded among investors as any other stock on the exchanges. In this study, the authors found that the best estimate of tomorrow’s price is not yesterday’s price (as the efficient-market hypothesis would indicate), nor is it the pure momentum price (namely, the same relative price change from yesterday to today continues from today to tomorrow). But rather it is almost exactly halfway between the two.
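Under one reading of that result, tomorrow's estimate sits midway between a no-change forecast and a pure-momentum forecast that repeats the latest relative change. The sketch below encodes that reading; it is an interpretation of the text above, not the paper's exact specification.

```python
def halfway_estimate(p_prev, p_today):
    """Midpoint between the no-change (efficient-market) forecast and
    the pure-momentum forecast that repeats the latest relative change.

    With p_prev = 100 and p_today = 110, the momentum forecast is 121
    (another +10% day) and the halfway estimate is 115.5.
    """
    emh_forecast = p_today                              # no change expected
    momentum_forecast = p_today * (p_today / p_prev)    # same relative change again
    return 0.5 * (emh_forecast + momentum_forecast)
```

When the price has not moved, both anchor forecasts coincide and the estimate is simply today's price, as one would expect.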
Starting from the characterization of the past time evolution of market prices in terms of price velocity and price acceleration, an attempt has been made towards a general framework for technical analysis, with the goal of establishing a principled classification of the possible patterns characterizing the deviation or defects from the random walk market state and its time-translational invariant properties. The classification relies on two dimensionless parameters: the Froude number, characterizing the relative strength of the acceleration with respect to the velocity, and the forecast time horizon, nondimensionalized by the training period. Trend-following and contrarian patterns are found to coexist and depend on the dimensionless time horizon. Using a renormalisation group approach, the probabilistic scenario approach exhibits statistically significant predictive power in essentially all tested market phases.
A survey of modern studies by Park and Irwin showed that most found a positive result from technical analysis.
In 2011, Caginalp and DeSantis used large data sets of closed-end funds, where comparison with valuation is possible, in order to determine quantitatively whether key aspects of technical analysis such as trend and resistance have scientific validity. Using data sets of over 100,000 points, they demonstrate that trend has an effect that is at least half as important as valuation. The effects of volume and volatility, which are smaller, are also evident and statistically significant. An important aspect of their work involves the nonlinear effect of trend. Positive trends that occur within approximately 3.7 standard deviations have a positive effect. For stronger uptrends, there is a negative effect on returns, suggesting that profit taking occurs as the magnitude of the uptrend increases. For downtrends the situation is similar except that "buying on dips" does not take place until the downtrend is a 4.6 standard deviation event. These methods can be used to examine investor behavior and compare the underlying strategies among different asset classes.
In 2013, Kim Man Lui and T Chong pointed out that past findings on technical analysis mostly reported the profitability of specific trading rules for a given set of historical data. These studies had not taken the human trader into consideration, as no real-world trader would mechanically adopt signals from any technical analysis method. They argued that, to evaluate technical analysis properly, one should compare the performance of experienced and novice traders: if the market really walks randomly, there will be no difference between the two kinds of traders. However, it was found by experiment that traders who are more knowledgeable on technical analysis significantly outperform those who are less knowledgeable.
Main article: Ticker tape
Until the mid-1960s, tape reading was a popular form of technical analysis. It consisted of reading market information such as price, volume, order size, and so on from a paper strip which ran through a machine called a stock ticker. Market data was sent to brokerage houses and to the homes and offices of the most active speculators. This system fell into disuse with the advent of electronic information panels in the late 1960s, and later computers, which allow for the easy preparation of charts.
Jesse Livermore, one of the most successful stock market operators of all time, was primarily concerned with ticker tape reading from a young age. He followed his own (mechanical) trading system (he called it the 'market key'), which did not need charts but relied solely on price data. He described his market key in detail in his 1940s book 'How to Trade in Stocks'. Livermore's system determined market phases (trend, correction, etc.) via past price data. He also made use of volume data, which he estimated from how stocks behaved and via 'market testing', a process of probing market liquidity by sending in small market orders, as described in the same book.
Another early form of technical analysis was the interpretation of stock market data contained in quotation boards, which, in the times before electronic screens, were huge chalkboards located in the stock exchanges, listing data on the main financial assets for analysis of their movements. The boards were updated manually with chalk, and some of the data was transmitted to environments outside the exchanges (such as brokerage houses, bucket shops, etc.) via the aforementioned tape, telegraph, telephone and later telex.
This analysis tool was used on the spot, mainly by market professionals for day trading and scalping, as well as by the general public through the printed versions in newspapers, which showed the previous day's trading data, for swing and position trades.
Charting terms and indicators
Average true range – averaged daily trading range, adjusted for price gaps.
Breakout – the concept whereby prices forcefully penetrate an area of prior support or resistance, usually, but not always, accompanied by an increase in volume.
Chart pattern – distinctive pattern created by the movement of security or commodity prices on a chart
Cycles – time targets for potential change in price action (price only moves up, down, or sideways)
Dead cat bounce – the phenomenon whereby a spectacular decline in the price of a stock is immediately followed by a moderate and temporary rise before resuming its downward movement
Elliott wave principle and the golden ratio to calculate successive price movements and retracements
Fibonacci ratios – used as a guide to determine support and resistance
Momentum – the rate of price change
Point and figure analysis – a price-based analytical approach employing numerical filters which may incorporate time references, though it ignores time entirely in its construction
Resistance – a price level that may prompt a net increase of selling activity
Support – a price level that may prompt a net increase of buying activity
Trending – the phenomenon by which price movement tends to persist in one direction for an extended period of time
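Several of the terms above reduce to simple arithmetic over price bars. As an illustration, here is a minimal pure-Python sketch of the average true range; the function names and the use of a plain mean (rather than Wilder's smoothing) are choices made for this example, not a canonical definition.

```python
def true_range(high, low, prev_close):
    """True range: the largest of the day's range and the gaps
    between today's high/low and the previous close."""
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def average_true_range(bars, period=14):
    """Plain mean of the true range over the last `period` bars.
    `bars` is a list of (high, low, close) tuples."""
    trs = [true_range(h, l, bars[i - 1][2])
           for i, (h, l, c) in enumerate(bars) if i > 0]
    window = trs[-period:]
    return sum(window) / len(window)
```

Because the true range includes gaps from the prior close, ATR captures overnight moves that a bare high-minus-low range would miss.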
Types of charts
Candlestick chart – Of Japanese origin and similar to OHLC, candlesticks widen and fill the interval between the open and close prices to emphasize the open/close relationship. In the West, often black or red candle bodies represent a close lower than the open, while white, green or blue candles represent a close higher than the open price.
Line chart – Connects the closing price values with line segments. The line may also be drawn using the open, high or low price.
Open-high-low-close chart – OHLC charts, also known as bar charts, plot the span between the high and low prices of a trading period as a vertical line segment at the trading time, and the open and close prices with horizontal tick marks on the range line, usually a tick to the left for the open price and a tick to the right for the closing price.
Point and figure chart – a chart type employing numerical filters with only passing references to time, and which ignores time entirely in its construction.
Overlays are generally superimposed over the main price chart.
Bollinger bands – a range of price volatility
Channel – a pair of parallel trend lines
Ichimoku kinko hyo – a moving average-based system that factors in time and the average point between a candle’s high and low
Moving average – an average over a window of time before and after a given time point that is repeated at each time point in the given chart. A moving average can be thought of as a kind of dynamic trend-line.
Parabolic SAR – Wilder’s trailing stop based on prices tending to stay within a parabolic curve during a strong trend
Pivot point – derived by calculating the numerical average of a particular currency’s or stock’s high, low and closing prices
Resistance – a price level that may act as a ceiling above price
Support – a price level that may act as a floor below price
Trend line – a sloping line described by at least two peaks or two troughs
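As an example of how an overlay is computed, here is a minimal sketch of Bollinger bands around a simple moving average. The function name, the default parameters and the use of the population standard deviation are illustrative conventions, not a single canonical definition.

```python
import statistics

def bollinger_bands(prices, n=20, k=2.0):
    """Middle band: n-period simple moving average of the closes.
    Upper/lower bands: k standard deviations above/below the middle."""
    window = prices[-n:]
    mid = sum(window) / len(window)
    sd = statistics.pstdev(window)  # population std dev, a common convention
    return mid - k * sd, mid, mid + k * sd
```

When prices are flat the standard deviation is zero and all three bands collapse onto the price; the bands widen as volatility rises, which is exactly the "range of price volatility" described above.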
Zig Zag – a chart overlay that shows filtered price movements that are greater than a given percentage.
These indicators are based on statistics derived from the broad market.
Advance–decline line – a popular indicator of market breadth.
McClellan Oscillator – a popular closed-form indicator of breadth.
McClellan Summation Index – a popular open-form indicator of breadth.
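The advance–decline line is simply a running total of net advancing issues per session; a minimal sketch (the function name is illustrative):

```python
from itertools import accumulate

def advance_decline_line(advances, declines):
    """Cumulative sum of (advancing issues - declining issues) per session."""
    return list(accumulate(a - d for a, d in zip(advances, declines)))
```

Breadth analysts watch for divergences: an index making new highs while this line falls suggests the advance is carried by fewer and fewer stocks.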
These indicators are generally shown below or above the main price chart.
Average directional index – a widely used indicator of trend strength.
Commodity channel index – identifies cyclical trends.
MACD – moving average convergence/divergence.
Momentum – the rate of price change.
Relative strength index (RSI) – oscillator showing price strength.
Relative Vigor Index (RVI) – oscillator measures the conviction of a recent price action and the likelihood that it will continue.
Stochastic oscillator – close position within recent trading range.
Trix – an oscillator showing the slope of a triple-smoothed exponential moving average.
Vortex Indicator – an indicator used to identify the existence, continuation, initiation or termination of trends.
Accumulation/distribution index – based on the close within the day’s range.
Money flow index – the amount of stock traded on days the price went up.
On-balance volume – the momentum of buying and selling stocks.
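MACD, listed above, is just the difference of two exponential moving averages plus a signal line that is itself an EMA. A minimal sketch using the conventional 12/26/9 periods; seeding each EMA on the first value is an illustrative simplification.

```python
def ema(prices, n):
    """Exponential moving average with smoothing 2/(n+1), seeded on the
    first price (a common simplification for short sketches)."""
    alpha = 2 / (n + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """MACD line = fast EMA - slow EMA; signal line = EMA of the MACD line."""
    macd_line = [f - s for f, s in zip(ema(prices, fast), ema(prices, slow))]
    signal_line = ema(macd_line, signal)
    return macd_line, signal_line
```

On flat prices both lines are zero; in a sustained uptrend the fast EMA sits above the slow one, so the MACD line turns positive.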
Trading with mixed indicators
MACD & Average directional index
MACD & Super Trend
MACD & Moving average
MACD & RSI
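Combinations like these usually pair a trend indicator with a confirmation filter, so that a trade is taken only when both readings agree. A toy sketch of the MACD & RSI pairing; the thresholds and function name are illustrative, not a standard rule.

```python
def combined_signal(macd_line, signal_line, rsi, rsi_floor=30, rsi_ceiling=70):
    """Agreeing-indicator rule: long only when MACD is above its signal
    line AND RSI is not overbought; short on the mirror-image conditions."""
    if macd_line > signal_line and rsi < rsi_ceiling:
        return "long"
    if macd_line < signal_line and rsi > rsi_floor:
        return "short"
    return "flat"
```

The point of the filter is to suppress entries where the trend signal fires but momentum is already stretched.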
Key words to search
Certified Financial Technician
Chartered Market Technician
Financial signal processing
Multimedia information retrieval
Multiple comparisons problem
Price action trading
Texas sharpshooter fallacy
Systematic trading (also known as mechanical trading) is a way of defining trade goals, risk controls and rules that can make investment and trading decisions in a methodical way.
Systematic trading includes both manual trading of systems, and full or partial automation using computers. Although technical systematic systems are more common, there are also systems using fundamental data, such as those in equity long/short hedge funds and GTAA funds. Systematic trading includes both high frequency trading (HFT, sometimes called algorithmic trading) and slower types of investment such as systematic trend following. It also includes passive index tracking.
The opposite of systematic trading is discretionary trading. The disadvantages of discretionary trading are that it may be influenced by emotions, is not easily backtested, and has less rigorous risk control.
Systematic trading is related to quantitative trading. Quantitative trading includes all trading that uses quantitative techniques; most quantitative trading involves using such techniques to value market assets like derivatives, but the trading decision may be systematic or discretionary.
Suppose we need to replicate an index using futures and stocks from other markets with a higher liquidity level. An example of a systematic approach would be:
Identify, using fundamental analysis, which stocks and futures should be used for replication.
Analyze correlations between the targeted index and the selected stocks and futures, looking for the strategy which provides the best approximation to the index.
Define a coherent strategy to dynamically combine stocks and futures according to market data.
Simulate the strategy including transaction costs, rollovers, stop-loss orders and all other desired risk controls.
Apply the strategy in the real world using algorithmic trading for signal generation, trying to optimize the P&L while continuously controlling the risks.
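Steps 2 and 3 above can be caricatured in a few lines: screen candidate instruments by return correlation with the target index, then compute a least-squares hedge ratio for each survivor. Everything here (names, the 0.8 threshold, the single-asset ratio) is an illustrative simplification of the real multi-asset problem.

```python
def _cov(xs, ys):
    """Sample covariance of two equal-length return series."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

def correlation(xs, ys):
    """Pearson correlation built from the sample covariances."""
    return _cov(xs, ys) / (_cov(xs, xs) * _cov(ys, ys)) ** 0.5

def pick_replicators(index_rets, candidates, min_corr=0.8):
    """Keep only instruments whose returns correlate strongly with the index."""
    return {name: rets for name, rets in candidates.items()
            if correlation(index_rets, rets) >= min_corr}

def hedge_ratio(index_rets, asset_rets):
    """Least-squares weight of the index on one asset: cov / var."""
    return _cov(index_rets, asset_rets) / _cov(asset_rets, asset_rets)
```

In practice the weights for several instruments would be fitted jointly (multivariate regression) and re-estimated as correlations drift.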
Following the ideas of Irene Aldridge, who describes a specific HFT system, a more general systematic trading system should include these elements:
Data management (in real time and for backtesting purposes)
A signal generation system (to create buy and sell signals according to predefined strategies using quantitative methods)
A portfolio and P&L tracking system
A quantitative risk management system (defining exposure per market, group, or portfolio)
A routing and execution subsystem (usually containing execution trading algorithms, like TWAP, VWAP…)
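Two of those elements, signal generation and quantitative risk limits, can be sketched as small composable pieces. All class, method and parameter names here are hypothetical, chosen only for this example.

```python
class MovingAverageSignal:
    """Signal generation: +1 (long) when the short SMA exceeds the long SMA,
    -1 (short) otherwise, 0 while there is not yet enough price history."""
    def __init__(self, short=5, long=20):
        self.short, self.long = short, long

    def on_prices(self, prices):
        if len(prices) < self.long:
            return 0
        fast = sum(prices[-self.short:]) / self.short
        slow = sum(prices[-self.long:]) / self.long
        return 1 if fast > slow else -1


class RiskManager:
    """Risk management: cap absolute exposure per market and translate a
    signal into the order needed to reach the capped target position."""
    def __init__(self, max_position=100):
        self.max_position = max_position

    def order_for(self, signal, current_position):
        target = signal * self.max_position
        return target - current_position
```

In a full system these would sit between the data-management layer (feeding prices in) and the routing/execution subsystem (receiving the sized orders).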
The key point in systematic trading is the use of backtests to verify (at least partially) strategies and alternatives; easy and robust access to trading data is therefore essential for backtesting.
Systematic trading should take into account the importance of risk management, using a systematic approach to quantify risk, consistent limits and techniques to define how to close excessively risky positions.
Systematic trading, in fact, lends itself to control risk precisely because it allows money managers to define profit targets, loss points, trade size, and system shutdown points objectively and in advance of entering each trade.
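The "objectively and in advance" point can be made concrete with a toy walk-forward for a single long trade: the loss point and profit target are fixed before entry, and the exit is purely mechanical. Cost-free fills and the parameter names are simplifying assumptions for this sketch.

```python
def backtest(prices, entry_index=0, stop_loss=0.05, profit_target=0.10):
    """Walk forward from a long entry at prices[entry_index] and exit at the
    first bar breaching the predefined loss point or profit target; return
    the realized return of the trade (mark to market if it never exits)."""
    entry = prices[entry_index]
    for p in prices[entry_index + 1:]:
        r = p / entry - 1
        if r <= -stop_loss or r >= profit_target:
            return r
    return prices[-1] / entry - 1  # still open at the end of the data
```

A real simulation would also model transaction costs, rollovers and slippage, as listed in the replication example above.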
Key words to search
Stock selection criterion
Algorithmic Traders Association