Asset Allocation and Derivatives
with Martin Haugh, Quantitative Finance 1 (2001), 45–72.
The fact that derivative securities are equivalent to specific dynamic trading strategies in complete markets suggests the possibility of constructing buy-and-hold portfolios of options that mimic certain dynamic investment policies, e.g., asset-allocation rules. We explore this possibility by solving the following problem: given an optimal dynamic investment policy, find a set of options at the start of the investment horizon which will come closest to the optimal dynamic investment policy. We solve this problem for several combinations of preferences, return dynamics, and optimality criteria, and show that under certain conditions, a portfolio consisting of just a few options is an excellent substitute for considerably more complex dynamic investment policies.
Foundations of Technical Analysis: Computational Algorithms, Statistical Inference, and Empirical Implementation
with Harry Mamaysky and Jiang Wang, Journal of Finance 55 (2000), 1705–1765.
Technical analysis, also known as "charting," has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis—the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution—conditioned on specific technical indicators such as head-and-shoulders or double-bottoms—we find that over the 31-year sample period, several technical indicators do provide incremental information and may have some practical value.
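As a minimal sketch of the approach (with an illustrative fixed bandwidth and simulated prices, not the paper's cross-validated bandwidth or CRSP data), one can smooth a price series by Nadaraya–Watson kernel regression and read off its local extrema, whose sequences are the raw material for patterns such as head-and-shoulders:

```python
import math
import random

def kernel_smooth(prices, h):
    """Nadaraya-Watson kernel regression over the time index with a
    Gaussian kernel: m(x) = sum_t K((x-t)/h) p_t / sum_t K((x-t)/h)."""
    n = len(prices)
    smoothed = []
    for x in range(n):
        weights = [math.exp(-0.5 * ((x - t) / h) ** 2) for t in range(n)]
        total = sum(weights)
        smoothed.append(sum(w * p for w, p in zip(weights, prices)) / total)
    return smoothed

def local_extrema(smoothed):
    """Indices and types of strict local extrema of the smoothed path."""
    ext = []
    for t in range(1, len(smoothed) - 1):
        if smoothed[t] > smoothed[t - 1] and smoothed[t] > smoothed[t + 1]:
            ext.append((t, "max"))
        elif smoothed[t] < smoothed[t - 1] and smoothed[t] < smoothed[t + 1]:
            ext.append((t, "min"))
    return ext

# simulated price path (hypothetical data)
random.seed(1)
prices = [100.0]
for _ in range(99):
    prices.append(prices[-1] * (1 + random.gauss(0, 0.01)))

sm = kernel_smooth(prices, h=3.0)
print(local_extrema(sm)[:5])
```

A pattern detector would then scan the sequence of alternating maxima and minima for configurations such as three peaks with the middle one highest (head-and-shoulders).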
Finance: A Selective Survey
Journal of the American Statistical Association 95 (2000), 629–635.
Ever since the publication in 1565 of Girolamo Cardano's treatise on gambling, Liber de Ludo Aleae (The Book of Games of Chance), statistics and financial markets have become inextricably linked. Over the past few decades many of these links have become part of the canon of modern finance, and it is now impossible to fully appreciate the workings of financial markets without them. This selective survey covers three of the most important ideas of finance—efficient markets, the random walk hypothesis, and derivative pricing models—that illustrate the enormous research opportunities that lie at the intersection of finance and statistics.
When Is Time Continuous?
with Dimitris Bertsimas and Leonid Kogan, Journal of Financial Economics 55 (2000), 173–204.
In this paper we study the tracking error resulting from the discrete-time application of continuous-time delta-hedging procedures for European options. We characterize the asymptotic distribution of the tracking error as the number of discrete time periods increases, and its joint distribution with other assets. We introduce the notion of temporal granularity of the continuous-time stochastic model that enables us to characterize the degree to which discrete-time approximations of continuous-time models track the payoff of the option. We derive closed-form expressions for the granularity for a put and call option on a stock that follows a geometric Brownian motion and a mean-reverting process. These expressions offer insight into the tracking error involved in applying continuous-time delta-hedging in discrete time. We also introduce alternative measures of the tracking error and analyze their properties.
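A Monte Carlo sketch in the spirit of the paper (hypothetical parameters; this is a simulation, not the paper's asymptotic analysis) illustrates how the tracking error of a discretely rebalanced Black–Scholes delta hedge shrinks as the number of rebalancing dates grows:

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, tau):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * math.sqrt(tau))
    d2 = d1 - sigma * math.sqrt(tau)
    return S * norm_cdf(d1) - K * math.exp(-r * tau) * norm_cdf(d2)

def bs_delta(S, K, r, sigma, tau):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * math.sqrt(tau))
    return norm_cdf(d1)

def tracking_error(n_steps, n_paths=2000, S0=100.0, K=100.0,
                   r=0.0, sigma=0.2, T=0.25, seed=0):
    """RMS terminal hedging error of a short call, delta-hedged
    at n_steps equally spaced dates under geometric Brownian motion."""
    rng = random.Random(seed)
    dt = T / n_steps
    errs = []
    for _ in range(n_paths):
        S = S0
        cash = bs_call(S0, K, r, sigma, T)   # premium received for the call
        shares = 0.0
        for i in range(n_steps):
            tau = T - i * dt
            d = bs_delta(S, K, r, sigma, tau)
            cash -= (d - shares) * S          # rebalance to the new delta
            shares = d
            cash *= math.exp(r * dt)
            S *= math.exp((r - 0.5 * sigma**2) * dt
                          + sigma * math.sqrt(dt) * rng.gauss(0, 1))
        errs.append(cash + shares * S - max(S - K, 0.0))
    return math.sqrt(sum(e * e for e in errs) / n_paths)

print(tracking_error(10), tracking_error(40))
```

Quadrupling the number of rebalancing dates roughly halves the RMS tracking error, consistent with the familiar 1/sqrt(N) rate for diffusions.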
Trading Volume: Definitions, Data Analysis, and Implications of Portfolio Theory
with Jiang Wang, Review of Financial Studies 13 (2000), 257–300.
We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. We begin by showing that a two-fund separation theorem suggests a natural definition for trading volume: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that share turnover satisfies an approximate linear K-factor structure. These implications are empirically tested using weekly turnover data for NYSE and AMEX securities from 1962 to 1996. We find strong evidence against two-fund separation, and an eigenvalue decomposition suggests that volume is driven by a two-factor linear model.
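The turnover definition, and the cross-sectional restriction implied by two-fund separation, can be illustrated with a toy calculation (the securities and share counts below are hypothetical):

```python
def turnover(shares_traded, shares_outstanding):
    """Share turnover: shares traded as a fraction of shares outstanding."""
    return shares_traded / shares_outstanding

# hypothetical weekly data: (shares traded, shares outstanding)
data = {
    "A": (1_200_000, 50_000_000),
    "B": (300_000, 10_000_000),
    "C": (4_000_000, 80_000_000),
}
tau = {name: turnover(v, n) for name, (v, n) in data.items()}

# under exact two-fund separation, all securities would have the same
# turnover, so this cross-sectional spread would be zero
spread = max(tau.values()) - min(tau.values())
print(tau, spread)
```

A test of two-fund separation thus amounts to asking whether the cross-sectional dispersion of turnover is statistically distinguishable from zero.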
Optimal Control of Execution Costs for Portfolios
with Dimitris Bertsimas and Paul Hummel, Computing in Science & Engineering 1 (2000), 40–53.
The dramatic growth in institutionally managed assets, coupled with the advent of internet trading and electronic brokerage for retail investors, has led to a surge in the size and volume of trading. At the same time, competition in the asset management industry has increased to the point where fractions of a percent in performance can separate the top funds from those in the next tier. In this environment, portfolio managers have begun to explore active management of trading costs as a means of boosting returns. Controlling execution cost can be viewed as a stochastic dynamic optimization problem because trading takes time, stock prices exhibit random fluctuations, and execution prices depend on trade size, order flow, and market conditions. In this paper, we apply stochastic dynamic programming to derive trading strategies that minimize the expected cost of executing a portfolio of securities over a fixed period of time, i.e., we derive the optimal sequence of trades as a function of prices, quantities, and other market conditions. To illustrate the practical relevance of our methods, we apply them to a hypothetical portfolio of 25 stocks by estimating their price-impact functions using historical trade data from 1996 and deriving the optimal execution strategies. We also perform several Monte Carlo simulation experiments to compare the performance of the optimal strategy to several alternatives.
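The structure of the problem can be conveyed with a brute-force sketch (hypothetical numbers; the paper solves the general problem by stochastic dynamic programming): under a purely linear permanent price-impact rule with no information in prices, splitting the order evenly across periods minimizes expected cost:

```python
import itertools

def expected_cost(schedule, p0=50.0, theta=0.01):
    """Expected cost of buying the schedule under a linear permanent
    price-impact law: each trade of size s moves the price by theta*s
    (a simplifying assumption in the spirit of the linear model)."""
    price, cost = p0, 0.0
    for s in schedule:
        price += theta * s    # permanent impact of this trade
        cost += price * s     # pay the post-impact price
    return cost

total, periods = 12, 4
best = min(
    (sched for sched in itertools.product(range(total + 1), repeat=periods)
     if sum(sched) == total),
    key=expected_cost,
)
print(best, expected_cost(best))
```

The brute-force search recovers the equal-split schedule, because the impact term reduces to minimizing the sum of squared trade sizes subject to a fixed total; realistic impact functions and information variables break this symmetry, which is where dynamic programming earns its keep.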
An Econometric Model of Serial Correlation and Illiquidity in Hedge-Fund Returns
with Mila Getmansky and Igor Makarov, Journal of Financial Economics 74 (2004), 529–609.
The returns to hedge funds and other alternative investments are often highly serially correlated in sharp contrast to the returns of more traditional investment vehicles such as long-only equity portfolios and mutual funds. In this paper, we explore several sources of such serial correlation and show that the most likely explanation is illiquidity exposure, i.e., investments in securities that are not actively traded and for which market prices are not always readily available. For portfolios of illiquid securities, reported returns will tend to be smoother than true economic returns, which will understate volatility and increase risk-adjusted performance measures such as the Sharpe ratio. We propose an econometric model of illiquidity exposure and develop estimators for the smoothing profile as well as a smoothing-adjusted Sharpe ratio. For a sample of 908 hedge funds drawn from the TASS database, we show that our estimated smoothing coefficients vary considerably across hedge-fund style categories and may be a useful proxy for quantifying illiquidity exposure.
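The smoothing mechanism can be sketched as follows (the smoothing profile and return parameters below are hypothetical; the paper estimates the profile by maximum likelihood):

```python
import math
import random
import statistics

def smooth(returns, thetas):
    """Observed return R^o_t = theta_0 R_t + ... + theta_k R_{t-k},
    with the thetas summing to one (the smoothing profile)."""
    k = len(thetas) - 1
    return [sum(th * returns[t - j] for j, th in enumerate(thetas))
            for t in range(k, len(returns))]

random.seed(7)
true_r = [random.gauss(0.01, 0.04) for _ in range(5000)]
thetas = [0.5, 0.3, 0.2]          # hypothetical smoothing profile
obs_r = smooth(true_r, thetas)

sd_true, sd_obs = statistics.stdev(true_r), statistics.stdev(obs_r)
sharpe_true = statistics.mean(true_r) / sd_true
sharpe_obs = statistics.mean(obs_r) / sd_obs

# smoothing shrinks volatility by sqrt(sum theta_j^2) in population,
# so the observed Sharpe ratio is inflated by roughly the reciprocal;
# multiplying by c gives a smoothing-adjusted Sharpe ratio
c = math.sqrt(sum(th * th for th in thetas))
print(sd_obs / sd_true, c, sharpe_obs * c)
```

The sketch shows the mean is preserved while the reported volatility falls, so the naive Sharpe ratio overstates risk-adjusted performance until the smoothing adjustment is applied.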
Nonparametric Risk Management and Implied Risk Aversion
with Yacine Ait-Sahalia, Journal of Econometrics 94 (2000), 9–51.
Typical value-at-risk (VAR) calculations involve the probabilities of extreme dollar losses, based on the statistical distributions of market prices. Such quantities do not account for the fact that the same dollar loss can have two very different economic valuations, depending on business conditions. We propose a nonparametric VAR measure that incorporates economic valuation according to the state-price density associated with the underlying price processes. The state-price density yields VAR values that are adjusted for risk aversion, time preferences, and other variations in economic valuation. In the context of a representative agent equilibrium model, we construct an estimator of the risk-aversion coefficient that is implied by the joint observations on option prices and underlying asset value.
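A toy sketch of the idea (the exponential state-price weights below are a hypothetical stand-in for an estimated state-price density): weighting loss states by their economic value shifts the VAR quantile relative to the purely statistical one:

```python
import math
import random

def var_quantile(losses, alpha, weights=None):
    """alpha-quantile of the loss distribution; with weights, states
    carrying more economic weight (higher state prices) count more."""
    if weights is None:
        weights = [1.0] * len(losses)
    pairs = sorted(zip(losses, weights))
    total = sum(weights)
    cum = 0.0
    for loss, w in pairs:
        cum += w
        if cum / total >= alpha:
            return loss
    return pairs[-1][0]

random.seed(3)
losses = [random.gauss(0, 1) for _ in range(10_000)]

# hypothetical state-price weights: bad states (large losses) are priced
# more dearly, as they would be by a risk-averse representative agent
weights = [math.exp(0.5 * loss) for loss in losses]

print(var_quantile(losses, 0.95), var_quantile(losses, 0.95, weights))
```

Because the weights load on the loss tail, the economically weighted VAR exceeds the statistical quantile, which is the direction of adjustment the state-price density delivers.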
The National Transportation Safety Board: A Model for Systemic Risk Management
with Eric Fielding and Jian Helen Yang, Journal of Investment Management 9 (2011), 17–49.
We propose the National Transportation Safety Board (NTSB) as a model organization for addressing systemic risk in industries and contexts other than transportation. When adopted by regulatory agencies and the transportation industry, the safety recommendations of the NTSB have been remarkably effective in reducing the number of fatalities in various modes of transportation since the NTSB's inception in 1967 as an independent agency. Formerly part of the Civil Aeronautics Board (now the Federal Aviation Administration), the NTSB has no regulatory authority and is solely focused on conducting forensic investigations of transportation accidents and proposing safety recommendations. Although it has only 400 full-time employees, the NTSB draws on a much larger network of experts from other government agencies and the private sector who are on call to assist in accident investigations on an as-needed basis. By allowing and encouraging the participation of all interested parties in its investigations, the NTSB is able to produce definitive analyses of even the most complex accidents and provide genuinely actionable measures for reducing the chances of future accidents. We believe it is possible to create more efficient and effective systemic-risk management processes in many other industries, including the financial services industry, by studying the organizational structure and functions of the NTSB.
What Happened To The Quants In August 2007?: Evidence from Factors and Transactions Data
with Amir Khandani, Journal of Financial Markets 14 (2011), 1–46.
During the week of August 6, 2007, a number of quantitative long/short equity hedge funds experienced unprecedented losses. It has been hypothesized that a coordinated deleveraging of similarly constructed portfolios caused this temporary dislocation in the market. Using the simulated returns of long/short equity portfolios based on five specific valuation factors, we find evidence that the unwinding of these portfolios began in July 2007 and continued until the end of 2007. Using transactions data, we find that the simulated returns of a simple market-making strategy were significantly negative during the week of August 6, 2007, but positive before and after, suggesting that the Quant Meltdown of August 2007 was the combined effect of portfolio deleveraging throughout July and the first week of August, and a temporary withdrawal of market-making risk capital starting August 8th. Our simulations point to two unwinds—a mini-unwind on August 1st starting at 10:45am and ending at 11:30am, and a more sustained unwind starting at the open on August 6th and ending at 1:00pm—that began with stocks in the financial sector and long Book-to-Market and short Earnings Momentum. These conjectures have significant implications for the systemic risks posed by the hedge-fund industry.