Research
Lo, Andrew W., Harry Mamaysky, and Jiang Wang (2000), Foundations of Technical Analysis: Computational Algorithms, Statistical Inference, and Empirical Implementation, Journal of Finance 55 (4), 1705–1765.
Technical analysis, also known as "charting," has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis—the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution—conditioned on specific technical indicators such as head-and-shoulders or double-bottoms—we find that over the 31-year sample period, several technical indicators do provide incremental information and may have some practical value.
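The automated approach rests on a kernel-regression smoothing of the price series followed by a scan for local extrema, from which patterns such as head-and-shoulders are then classified. The sketch below illustrates only that smoothing-and-extrema step on simulated prices; the Gaussian bandwidth and the omission of any pattern-classification rules are simplifying assumptions, not the paper's exact specification.

```python
# Minimal sketch: Nadaraya-Watson kernel regression of price on time,
# followed by a scan for local extrema of the smoothed curve.
import numpy as np

def kernel_smooth(prices, bandwidth=3.0):
    """Gaussian-kernel (Nadaraya-Watson) regression of price on the time index."""
    t = np.arange(len(prices), dtype=float)
    # pairwise kernel weights K((t_i - t_j) / h)
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
    return (w @ prices) / w.sum(axis=1)

def local_extrema(smoothed):
    """Indices where the smoothed series changes direction (local max or min)."""
    d = np.sign(np.diff(smoothed))
    return np.where(d[:-1] != d[1:])[0] + 1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(250)))
    extrema = local_extrema(kernel_smooth(prices, bandwidth=5.0))
    print(f"found {len(extrema)} extrema; a pattern rule (e.g. head-and-shoulders)"
          " would then be tested on consecutive extrema")
```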
Lo, Andrew W. (2001), Finance: A Selective Survey, In Statistics in the 21st Century, edited by Adrian E. Raftery, Martin A. Tanner, and Martin T. Wells, 102–114.
Ever since the publication in 1565 of Girolamo Cardano's treatise on gambling, Liber de Ludo Aleae (The Book of Games of Chance), statistics and financial markets have become inextricably linked. Over the past few decades many of these links have become part of the canon of modern finance, and it is now impossible to fully appreciate the workings of financial markets without them. This selective survey covers three of the most important ideas of finance—efficient markets, the random walk hypothesis, and derivative pricing models—that illustrate the enormous research opportunities that lie at the intersection of finance and statistics.
Bertsimas, Dimitris, Leonid Kogan, and Andrew W. Lo (2000), When Is Time Continuous?, Journal of Financial Economics 55 (2), 173–204.
In this paper we study the tracking error resulting from the discrete-time application of continuous-time delta-hedging procedures for European options. We characterize the asymptotic distribution of the tracking error as the number of discrete time periods increases, and its joint distribution with other assets. We introduce the notion of temporal granularity of the continuous-time stochastic model, which enables us to characterize the degree to which discrete-time approximations of continuous-time models track the payoff of the option. We derive closed-form expressions for the granularity of put and call options on a stock that follows either a geometric Brownian motion or a mean-reverting process. These expressions offer insight into the tracking error involved in applying continuous-time delta-hedging in discrete time. We also introduce alternative measures of the tracking error and analyze their properties.
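A minimal Monte Carlo sketch of the phenomenon being characterized: delta-hedging a European call under geometric Brownian motion at N discrete dates and measuring the standard deviation of the terminal tracking error, which shrinks roughly like 1/sqrt(N). The parameters and the simulation itself are illustrative assumptions; the paper's contribution is the asymptotic and closed-form analysis, not this simulation.

```python
# Discrete-time delta hedging of a European call under GBM (illustrative parameters).
import numpy as np
from math import log, sqrt, exp
from scipy.stats import norm

def bs_call_delta(S, K, r, sigma, tau):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    return norm.cdf(d1)

def tracking_error_std(N, n_paths=20000, S0=100.0, K=100.0, r=0.05, sigma=0.2, T=0.25, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / N
    S = np.full(n_paths, S0)
    # the Black-Scholes premium funds the hedge portfolio at time 0
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    cash = S0 * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)
    shares = np.zeros(n_paths)
    for i in range(N):
        tau = T - i * dt
        new_shares = bs_call_delta(S, K, r, sigma, tau)
        cash -= (new_shares - shares) * S          # rebalance at current price
        shares = new_shares
        cash *= exp(r * dt)                        # cash earns the riskless rate
        z = rng.standard_normal(n_paths)
        S = S * np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
    error = cash + shares * S - np.maximum(S - K, 0.0)
    return float(error.std())

for N in (10, 50, 250):
    print(N, round(tracking_error_std(N), 4))
```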
Lo, Andrew W., and Jiang Wang (2000), Trading Volume: Definitions, Data Analysis, and Implications of Portfolio Theory, Review of Financial Studies 13 (2), 257–300.
We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. We begin by showing that a two-fund separation theorem suggests a natural definition for trading volume: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that share turnover satisfies an approximate linear K-factor structure. These implications are empirically tested using weekly turnover data for NYSE and AMEX securities from 1962 to 1996. We find strong evidence against two-fund separation, and an eigenvalue decomposition suggests that volume is driven by a two-factor linear model. A companion guide, Trading Volume and the MiniCRSP Database: An Introduction and User’s Guide, describes how to create a MiniCRSP database.
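The two computations described in the abstract, share turnover as the volume measure and an eigenvalue decomposition of its cross-section, can be sketched on a simulated panel with a single common volume factor. The data, loadings, and noise levels below are assumptions standing in for the weekly NYSE/AMEX sample, used only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)
n_weeks, n_stocks = 260, 50

# simulated volume panel with one common factor (stand-in for real data)
market_factor = rng.standard_normal(n_weeks)
loadings = rng.uniform(0.5, 1.5, size=n_stocks)
log_volume = 12.0 + np.outer(market_factor, loadings) + 0.5 * rng.standard_normal((n_weeks, n_stocks))
shares_out = rng.uniform(1e7, 1e8, size=n_stocks)

# turnover_jt = shares traded / shares outstanding
turnover = np.exp(log_volume) / shares_out

# eigenvalue decomposition of the turnover covariance matrix: under an
# (approximate) K-factor structure the first K eigenvalues dominate
eigvals = np.sort(np.linalg.eigvalsh(np.cov(turnover, rowvar=False)))[::-1]
print("fraction of cross-sectional variance from the first eigenvalue:",
      round(float(eigvals[0] / eigvals.sum()), 3))
```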
Bertsimas, Dimitris, Andrew W. Lo, and Paul Hummel (2000), Optimal Control of Execution Costs for Portfolios, Computing in Science & Engineering 1, 40–53.
The dramatic growth in institutionally managed assets, coupled with the advent of internet trading and electronic brokerage for retail investors, has led to a surge in the size and volume of trading. At the same time, competition in the asset management industry has increased to the point where fractions of a percent in performance can separate the top funds from those in the next tier. In this environment, portfolio managers have begun to explore active management of trading costs as a means of boosting returns. Controlling execution cost can be viewed as a stochastic dynamic optimization problem because trading takes time, stock prices exhibit random fluctuations, and execution prices depend on trade size, order flow, and market conditions. In this paper, we apply stochastic dynamic programming to derive trading strategies that minimize the expected cost of executing a portfolio of securities over a fixed period of time, i.e., we derive the optimal sequence of trades as a function of prices, quantities, and other market conditions. To illustrate the practical relevance of our methods, we apply them to a hypothetical portfolio of 25 stocks by estimating their price-impact functions using historical trade data from 1996 and deriving the optimal execution strategies. We also perform several Monte Carlo simulation experiments to compare the performance of the optimal strategy to several alternatives.
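For intuition, consider the single-stock special case under an assumed linear, permanent price-impact model P_t = P_{t-1} + theta*n_t + noise: the expected cost of a deterministic trade schedule has a simple closed form, and splitting the order evenly across periods minimizes it. The paper's dynamic-programming treatment generalizes this to portfolios with cross-impact and richer price and information dynamics; the sketch below only illustrates the assumed single-stock model.

```python
import numpy as np

def expected_cost(schedule, p0=50.0, theta=1e-5):
    """E[sum_t n_t * P_t] with P_t = P_{t-1} + theta * n_t plus zero-mean noise."""
    n = np.asarray(schedule, dtype=float)
    cum = np.cumsum(n)                      # shares executed through period t
    return float(np.sum(n * (p0 + theta * cum)))

total_shares, periods = 100_000, 20
even = np.full(periods, total_shares / periods)
front_loaded = np.zeros(periods)
front_loaded[0] = total_shares

print("even split  :", round(expected_cost(even), 2))
print("front-loaded:", round(expected_cost(front_loaded), 2))
```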
Aït-Sahalia, Yacine, and Andrew W. Lo (2000), Nonparametric Risk Management and Implied Risk Aversion, Journal of Econometrics 94 (1–2), 9–51.
Typical value-at-risk (VAR) calculations involve the probabilities of extreme dollar losses, based on the statistical distributions of market prices. Such quantities do not account for the fact that the same dollar loss can have two very different economic valuations, depending on business conditions. We propose a nonparametric VAR measure that incorporates economic valuation according to the state-price density associated with the underlying price processes. The state-price density yields VAR values that are adjusted for risk aversion, time preferences, and other variations in economic valuation. In the context of a representative agent equilibrium model, we construct an estimator of the risk-aversion coefficient that is implied by the joint observations on option prices and underlying asset value.
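In a representative-agent setting, the implied absolute risk-aversion coefficient at terminal price S_T can be read off the two densities as A(S_T) = f'(S_T)/f(S_T) - f*'(S_T)/f*(S_T), where f is the physical density and f* the state-price (risk-neutral) density, up to discounting. The sketch below checks this numerically in the Black-Scholes case, where the implied relative risk aversion should equal the constant (mu - r)/sigma^2; the densities here are known in closed form rather than estimated nonparametrically as in the paper, and all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import lognorm

S0, mu, r, sigma, T = 100.0, 0.10, 0.03, 0.2, 1.0
grid = np.linspace(60, 160, 2001)

def lognormal_pdf(x, drift):
    # density of S_T under GBM with the given drift
    scale = S0 * np.exp((drift - 0.5 * sigma**2) * T)
    return lognorm.pdf(x, s=sigma * np.sqrt(T), scale=scale)

f_p = lognormal_pdf(grid, mu)   # physical density
f_q = lognormal_pdf(grid, r)    # risk-neutral / state-price density (normalized)

# A(S_T) = d log f / dS  -  d log f* / dS, via finite differences
A = np.gradient(np.log(f_p), grid) - np.gradient(np.log(f_q), grid)
rra = grid * A                  # implied relative risk aversion
print("implied RRA at mid-grid:", round(rra[len(grid) // 2], 3),
      " theory:", round((mu - r) / sigma**2, 3))
```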
Farmer, J. Doyne, and Andrew W. Lo (1999), Frontiers of Finance: Evolution and Efficient Markets, Proceedings of the National Academy of Sciences 96, 9991–9992.
In this review article, we explore several recent advances in the quantitative modeling of financial markets. We begin with the Efficient Markets Hypothesis and describe how this controversial idea has stimulated a number of new directions of research, some focusing on more elaborate mathematical models that are capable of rationalizing the empirical facts, others taking a completely different tack in rejecting rationality altogether. One of the most promising directions is to view financial markets from a biological perspective and, specifically, within an evolutionary framework in which markets, instruments, institutions, and investors interact and evolve dynamically according to the "law" of economic selection. Under this view, financial agents compete and adapt, but they do not necessarily do so in an optimal fashion. Evolutionary and ecological models of financial markets are truly a new frontier whose exploration has just begun.
Lo, Andrew W. (1999), The Three P’s of Total Risk Management, Financial Analysts Journal 55 (1), 13–26.
Current risk-management practices are based on probabilities of extreme dollar losses (e.g., measures like Value at Risk), but these measures capture only part of the story. Any complete risk-management system must address two other important factors: prices and preferences. Together with probabilities, these comprise the three P's of Total Risk Management. This article describes how the three P's interact to determine sensible risk profiles for corporations and for individuals, that is, guidelines for how much risk to bear and how much to hedge. By synthesizing existing research in economics, psychology, and decision sciences, and through an ambitious research agenda to extend this synthesis into other disciplines, a complete and systematic approach to rational decision-making in an uncertain world is within reach.
Khandani, Amir E., and Andrew W. Lo (2011), Illiquidity Premia in Asset Returns: An Empirical Analysis of Hedge Funds, Mutual Funds, and US Equity Portfolios, Quarterly Journal of Finance 1 (2), 205–264.
We establish a link between illiquidity and positive autocorrelation in asset returns among a sample of hedge funds, mutual funds, and various equity portfolios. For hedge funds, this link can be confirmed by comparing the return autocorrelations of funds with shorter vs. longer redemption-notice periods. We also document significant positive return-autocorrelation in portfolios of securities that are generally considered less liquid, e.g., small-cap stocks, corporate bonds, mortgage-backed securities, and emerging-market investments. Using a sample of 2,927 hedge funds, 15,654 mutual funds, and 100 size- and book-to-market-sorted portfolios of U.S. common stocks, we construct autocorrelation-sorted long/short portfolios and conclude that illiquidity premia are generally positive and significant, ranging from 2.74% to 9.91% per year among the various hedge funds and fixed-income mutual funds. We do not find evidence for this premium among equity and asset-allocation mutual funds, or among the 100 U.S. equity portfolios. The time variation in our aggregated illiquidity premium shows that while 1998 was a difficult year for most funds with large illiquidity exposure, the following four years yielded significantly higher illiquidity premia that led to greater competition in credit markets, contributing to much lower illiquidity premia in the years leading up to the Financial Crisis of 2007–2008.
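A sketch of the sorting step on simulated fund returns: compute each fund's first-order return autocorrelation, then form an equal-weighted portfolio long the top quintile and short the bottom quintile. The simulated data below have no illiquidity premium built in, so the resulting spread is just noise; the AR(1) return process, window, and quintile breakpoints are assumptions used only to show the construction, not the paper's sample or exact methodology.

```python
import numpy as np

rng = np.random.default_rng(2)
n_months, n_funds = 120, 200

# simulate fund returns with heterogeneous AR(1) smoothing to mimic illiquidity
phi = rng.uniform(0.0, 0.5, size=n_funds)            # return persistence per fund
shocks = 0.02 * rng.standard_normal((n_months, n_funds))
returns = np.zeros_like(shocks)
for t in range(1, n_months):
    returns[t] = phi * returns[t - 1] + shocks[t]

def rho1(x):
    """First-order sample autocorrelation."""
    x = x - x.mean()
    return float(np.dot(x[1:], x[:-1]) / np.dot(x, x))

ac = np.array([rho1(returns[:, j]) for j in range(n_funds)])
q_hi, q_lo = np.quantile(ac, [0.8, 0.2])
long_short = returns[:, ac >= q_hi].mean(axis=1) - returns[:, ac <= q_lo].mean(axis=1)
print("annualized long/short spread: %.2f%%" % (12 * long_short.mean() * 100))
```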
Brennan, Thomas J., and Andrew W. Lo (2011), The Origin of Behavior, Quarterly Journal of Finance 1 (1), 55–108.
We propose a single evolutionary explanation for the origin of several behaviors that have been observed in organisms ranging from ants to human subjects, including risk-sensitive foraging, risk aversion, loss aversion, probability matching, randomization, and diversification. Given an initial population of individuals, each assigned a purely arbitrary behavior with respect to a binary choice problem, and assuming that offspring behave identically to their parents, only those behaviors linked to reproductive success will survive, and less reproductively successful behaviors will disappear at exponential rates. This framework generates a surprisingly rich set of behaviors, and the simplicity and generality of our model suggest that these behaviors are primitive and universal.
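A stylized simulation of the binary-choice framework, with assumed offspring counts and a purely systematic environment rather than the paper's general setup: subpopulations differ only in the probability f of choosing action a, the environment in each generation is common to all individuals, and only individuals whose choice matches the environment reproduce. Under these assumptions the fastest-growing behavior ends up near the probability-matching value f = p rather than at a corner.

```python
import numpy as np

rng = np.random.default_rng(3)
p = 0.7                                    # prob. that action a is the good one
fs = np.linspace(0.05, 0.95, 19)           # candidate behaviors: prob. of choosing a
log_pop = np.zeros_like(fs)                # log population size for each behavior

for generation in range(5000):
    a_good = rng.random() < p              # environment, common to all individuals
    # individuals get 2 offspring only if their chosen action matches the
    # environment; the fraction of each subpopulation that matches is f or 1-f
    match_frac = fs if a_good else (1.0 - fs)
    log_pop += np.log(2.0 * match_frac)

best = fs[np.argmax(log_pop)]
print(f"fastest-growing behavior: f = {best:.2f} (probability matching: f = p = {p})")
```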