with Dimitris Bertsimas and Paul Hummel, Computing in Science & Engineering 1 (2000), 40–53.
The dramatic growth in institutionally managed assets, coupled with the advent of internet trading and electronic brokerage for retail investors, has led to a surge in the size and volume of trading. At the same time, competition in the asset management industry has increased to the point where fractions of a percent in performance can separate the top funds from those in the next tier. In this environment, portfolio managers have begun to explore active management of trading costs as a means of boosting returns. Controlling execution cost can be viewed as a stochastic dynamic optimization problem because trading takes time, stock prices exhibit random fluctuations, and execution prices depend on trade size, order flow, and market conditions. In this paper, we apply stochastic dynamic programming to derive trading strategies that minimize the expected cost of executing a portfolio of securities over a fixed period of time, i.e., we derive the optimal sequence of trades as a function of prices, quantities, and other market conditions. To illustrate the practical relevance of our methods, we apply them to a hypothetical portfolio of 25 stocks by estimating their price-impact functions using historical trade data from 1996 and deriving the optimal execution strategies. We also perform several Monte Carlo simulation experiments to compare the performance of the optimal strategy to several alternatives.
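The kind of Monte Carlo comparison described above can be sketched in a few lines. This is a minimal illustration, not the paper's model: it assumes a random-walk price with a linear per-share impact, and all parameter values are hypothetical.

```python
import random

def simulate_cost(schedule, p0=100.0, theta=0.01, sigma=0.5,
                  n_paths=2000, seed=7):
    """Average cost of executing a fixed share schedule when each trade
    moves a random-walk price by theta per share (parameters hypothetical)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        p, cost = p0, 0.0
        for s in schedule:
            p += theta * s + rng.gauss(0.0, sigma)  # linear impact plus noise
            cost += p * s                           # pay the post-impact price
        total += cost
    return total / n_paths

S, T = 10_000, 20
equal_split = [S / T] * T          # trade S/T shares every period
immediate = [S] + [0.0] * (T - 1)  # execute the whole block at once
print(simulate_cost(equal_split), simulate_cost(immediate))
```

Spreading the block out incurs a much smaller expected impact cost than immediate execution, which is the effect the optimal strategies exploit.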
with Jiang Wang, Review of Financial Studies 13 (2000), 257–300.
We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. We begin by showing that a two-fund separation theorem suggests a natural definition for trading volume: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that share turnover satisfies an approximate linear K-factor structure. These implications are empirically tested using weekly turnover data for NYSE and AMEX securities from 1962 to 1996. We find strong evidence against two-fund separation, and an eigenvalue decomposition suggests that volume is driven by a two-factor linear model.
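The turnover definition and the two-fund-separation implication can be made concrete. The figures below are hypothetical, chosen only to show the cross-sectional restriction:

```python
def share_turnover(volume, shares_outstanding):
    """Share turnover: number of shares traded divided by shares outstanding."""
    return volume / shares_outstanding

# Under two-fund separation all investors trade the market portfolio, so
# turnover must be identical across stocks (the figures here are hypothetical):
stocks = {"A": (1_200_000, 60_000_000), "B": (450_000, 22_500_000)}
turnovers = {name: share_turnover(v, n) for name, (v, n) in stocks.items()}
print(turnovers)
```

A cross-section in which these ratios differ across securities is direct evidence against two-fund separation, which is what the empirical tests find.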
with Dimitris Bertsimas and Leonid Kogan, Journal of Financial Economics 55 (2000), 173–204.
In this paper we study the tracking error resulting from the discrete-time application of continuous-time delta-hedging procedures for European options. We characterize the asymptotic distribution of the tracking error as the number of discrete time periods increases, and its joint distribution with other assets. We introduce the notion of temporal granularity of the continuous-time stochastic model that enables us to characterize the degree to which discrete-time approximations of continuous time models track the payoff of the option. We derive closed form expressions for the granularity for a put and call option on a stock that follows a geometric Brownian motion and a mean-reverting process. These expressions offer insight into the tracking error involved in applying continuous-time delta-hedging in discrete time. We also introduce alternative measures of the tracking error and analyze their properties.
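The tracking error under discussion can be simulated directly. The sketch below, with hypothetical parameters, rebalances a Black-Scholes delta hedge at a finite number of dates along geometric Brownian motion paths and measures the root-mean-square gap between the hedge portfolio and the call payoff; refining the rebalancing grid shrinks the error.

```python
import math, random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call_price(s, k, r, sig, tau):
    d1 = (math.log(s / k) + (r + 0.5 * sig * sig) * tau) / (sig * math.sqrt(tau))
    d2 = d1 - sig * math.sqrt(tau)
    return s * norm_cdf(d1) - k * math.exp(-r * tau) * norm_cdf(d2)

def bs_call_delta(s, k, r, sig, tau):
    d1 = (math.log(s / k) + (r + 0.5 * sig * sig) * tau) / (sig * math.sqrt(tau))
    return norm_cdf(d1)

def tracking_error_rms(n_steps, n_paths=2000, s0=100.0, k=100.0,
                       r=0.0, sig=0.2, T=1.0, seed=1):
    """RMS difference between a discretely rebalanced delta hedge and the
    call payoff under geometric Brownian motion (parameters hypothetical)."""
    rng = random.Random(seed)
    dt = T / n_steps
    sq_err = 0.0
    for _ in range(n_paths):
        s = s0
        delta = bs_call_delta(s, k, r, sig, T)
        cash = bs_call_price(s, k, r, sig, T) - delta * s
        for i in range(1, n_steps):
            s *= math.exp((r - 0.5 * sig * sig) * dt
                          + sig * math.sqrt(dt) * rng.gauss(0.0, 1.0))
            cash *= math.exp(r * dt)
            new_delta = bs_call_delta(s, k, r, sig, T - i * dt)
            cash -= (new_delta - delta) * s  # self-financing rebalance
            delta = new_delta
        # final period: one more price move, no rebalancing at expiry
        s *= math.exp((r - 0.5 * sig * sig) * dt
                      + sig * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        cash *= math.exp(r * dt)
        sq_err += (delta * s + cash - max(s - k, 0.0)) ** 2
    return math.sqrt(sq_err / n_paths)

print(tracking_error_rms(8), tracking_error_rms(64))
```

The asymptotic results in the paper characterize exactly how this error behaves as the number of rebalancing periods grows.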
Journal of the American Statistical Association 95 (2000), 629–635.
Ever since the publication in 1565 of Girolamo Cardano's treatise on gambling, Liber de Ludo Aleae (The Book of Games of Chance), statistics and financial markets have become inextricably linked. Over the past few decades many of these links have become part of the canon of modern finance, and it is now impossible to fully appreciate the workings of financial markets without them. This selective survey covers three of the most important ideas of finance—efficient markets, the random walk hypothesis, and derivative pricing models—that illustrate the enormous research opportunities that lie at the intersection of finance and statistics.
Foundations of Technical Analysis: Computational Algorithms, Statistical Inference, and Empirical Implementation
with Harry Mamaysky and Jiang Wang, Journal of Finance 55 (2000), 1705–1765.
Technical analysis, also known as "charting," has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis—the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution—conditioned on specific technical indicators such as head-and-shoulders or double-bottoms—we find that over the sample period, several technical indicators do provide incremental information and may have some practical value.
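The smoothing step at the core of this approach can be sketched with a Nadaraya-Watson kernel regression. This is a simplified stand-in for the paper's estimator: the bandwidth choice and the test series below are ad hoc, and the pattern-matching stage (identifying, say, a head-and-shoulders as a particular sequence of extrema) is only hinted at.

```python
import math, random

def kernel_smooth(prices, h):
    """Nadaraya-Watson regression of price on time with a Gaussian
    kernel (bandwidth h is a hypothetical choice)."""
    n = len(prices)
    out = []
    for t in range(n):
        w = [math.exp(-0.5 * ((t - u) / h) ** 2) for u in range(n)]
        out.append(sum(wi * p for wi, p in zip(w, prices)) / sum(w))
    return out

def local_extrema(series):
    """Interior indices where the series changes direction; technical
    patterns are defined as particular sequences of these extrema."""
    return [t for t in range(1, len(series) - 1)
            if (series[t] - series[t - 1]) * (series[t + 1] - series[t]) < 0]

# A noisy hump: smoothing strips the noise-induced wiggles, leaving only
# the extrema that a pattern-recognition rule would then examine.
rng = random.Random(0)
prices = [math.sin(math.pi * t / 50) + 0.05 * rng.gauss(0.0, 1.0)
          for t in range(51)]
smoothed = kernel_smooth(prices, h=3.0)
print(len(local_extrema(prices)), len(local_extrema(smoothed)))
```

Automating the smoothing and extrema-detection steps is what removes the "eyes of the beholder" problem and makes the patterns amenable to statistical testing.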
Financial Analysts Journal 55 (1999), 13–26.
Current risk-management practices are based on probabilities of extreme dollar losses (e.g., measures like Value at Risk), but these measures capture only part of the story. Any complete risk-management system must address two other important factors: prices and preferences. Together with probabilities, these comprise the three P's of Total Risk Management. This article describes how the three P's interact to determine sensible risk profiles for corporations and for individuals: guidelines for how much risk to bear and how much to hedge. By synthesizing existing research in economics, psychology, and decision sciences, and through an ambitious research agenda to extend this synthesis into other disciplines, a complete and systematic approach to rational decision-making in an uncertain world is within reach.
with J. Doyne Farmer, Proceedings of the National Academy of Sciences 96 (1999), 9991–9992.
In this review article, we explore several recent advances in the quantitative modeling of financial markets. We begin with the Efficient Markets Hypothesis and describe how this controversial idea has stimulated a number of new directions of research, some focusing on more elaborate mathematical models that are capable of rationalizing the empirical facts, others taking a completely different tack in rejecting rationality altogether. One of the most promising directions is to view financial markets from a biological perspective and, specifically, within an evolutionary framework in which markets, instruments, institutions, and investors interact and evolve dynamically according to the "law" of economic selection. Under this view, financial agents compete and adapt, but they do not necessarily do so in an optimal fashion. Evolutionary and ecological models of financial markets constitute a truly new frontier whose exploration has just begun.
with Dimitris Bertsimas, Journal of Financial Markets 1 (1998), 1–50.
We derive dynamic optimal trading strategies that minimize the expected cost of trading a large block of equity over a fixed time horizon. Specifically, given a fixed block S of shares to be executed within a fixed finite number of periods T, and given a price-impact function that yields the execution price of an individual trade as a function of the shares traded and market conditions, we obtain the optimal sequence of trades as a function of market conditions—closed-form expressions in some cases—that minimizes the expected cost of executing S within T periods. Our analysis is extended to the portfolio case in which price impact across stocks can have an important effect on the total cost of trading a portfolio.
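For the simplest specification (linear permanent price impact and no information variable), the expected cost of any deterministic schedule has a closed form, and the equal-split schedule of S/T shares per period minimizes it. A minimal sketch with hypothetical parameters:

```python
def expected_cost(schedule, p0=50.0, theta=5e-6):
    """Expected execution cost under linear permanent impact
    p_t = p_{t-1} + theta * s_t + noise (parameters hypothetical)."""
    cost, cum = 0.0, 0.0
    for s in schedule:
        cum += s
        cost += (p0 + theta * cum) * s  # expected post-impact price x shares
    return cost

S, T = 100_000, 20
equal = [S / T] * T
# any deviation from the equal split raises the expected cost:
tilted = [S / T + (1_000 if t < T // 2 else -1_000) for t in range(T)]
print(expected_cost(equal), expected_cost(tilted))
```

The expected cost reduces to p0*S plus theta times a quadratic in the schedule, which is smallest when the trades are equal; richer specifications with market-condition variables require the dynamic-programming treatment in the paper.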
with Yacine Ait-Sahalia, Journal of Finance 52 (1998), 499–548.
Implicit in the prices of traded financial assets are Arrow-Debreu state prices or, in the continuous-state case, the state-price density [SPD]. We construct an estimator for the SPD implicit in option prices and derive an asymptotic sampling theory for this estimator to gauge its accuracy. The SPD estimator provides an arbitrage-free method of pricing new, more complex, or less liquid securities while capturing those features of the data that are most relevant from an asset-pricing perspective, e.g., negative skewness and excess kurtosis for asset returns, volatility "smiles" for option prices. We perform Monte Carlo simulation experiments to show that the SPD estimator can be successfully extracted from option prices and we present an empirical application using S&P 500 index options.
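The construction rests on the Breeden-Litzenberger relation: the SPD is the discounted second derivative of the call-pricing function with respect to the strike. The sketch below uses synthetic Black-Scholes prices in place of the paper's nonparametrically estimated pricing function, with hypothetical parameters:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, r, sig, tau):
    d1 = (math.log(s / k) + (r + 0.5 * sig * sig) * tau) / (sig * math.sqrt(tau))
    d2 = d1 - sig * math.sqrt(tau)
    return s * norm_cdf(d1) - k * math.exp(-r * tau) * norm_cdf(d2)

def spd_from_calls(calls, strikes, r, tau):
    """State-price density via Breeden-Litzenberger: f(K) = e^{r tau} *
    d2C/dK2, approximated with central finite differences."""
    dk = strikes[1] - strikes[0]
    return [math.exp(r * tau) * (calls[i - 1] - 2 * calls[i] + calls[i + 1]) / dk**2
            for i in range(1, len(strikes) - 1)]

# synthetic Black-Scholes quotes stand in for observed option prices
s0, r, sig, tau = 100.0, 0.05, 0.2, 0.5
strikes = [40.0 + 0.5 * i for i in range(281)]  # strikes from 40 to 180
calls = [bs_call(s0, k, r, sig, tau) for k in strikes]
dens = spd_from_calls(calls, strikes, r, tau)
mass = sum(dens) * 0.5  # crude quadrature over the strike grid
print(mass)
```

The recovered density is nonnegative and integrates to roughly one over the strike range, as an SPD should; the paper's contribution is the nonparametric estimator of the pricing function and the sampling theory for the resulting density.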
Research Dialogues 52 (1997), TIAA-CREF.
In the ’50s and ’60s, just as the era of the professional portfolio manager was dawning, financial economists were telling anyone who would listen that active management was probably a big mistake—a waste of time and money. Their research demonstrated that historical prices were of little use in helping to predict where future prices would go. Prices simply took a “random walk.” The better part of wisdom, they advised, was to be a passive investor. At first, not too many of the people who influence the way money is managed (those who select managers of large portfolios) listened. But as time went on, it became apparent that they should have. Because of fees and turnover, the managers they picked typically underperformed the market. And the worse an active manager did relative to a market index, the more attractive seemed the low-cost alternative of buying and holding the index itself. But as luck would have it, just as indexing was gaining ground, a new wave of academic research was being published that weakened some of the results of the earlier research and thereby undercut part of the justification for indexing. It didn’t obviate all the reasons for indexing (indexing was still a low-cost way to create diversification for an entire fund or as part of an active/passive strategy), but it did tend to silence the index-because-you-can’t-do-better school.