Publications
Optimal Control of Execution Costs for Portfolios
2000
The dramatic growth in institutionally managed assets, coupled with the advent of internet trading and electronic brokerage for retail investors, has led to a surge in the size and volume of trading. At the same time, competition in the asset management industry has increased to the point where fractions of a percent in performance can separate the top funds from those in the next tier. In this environment, portfolio managers have begun to explore active management of trading costs as a means of boosting returns. Controlling execution cost can be viewed as a stochastic dynamic optimization problem because trading takes time, stock prices exhibit random fluctuations, and execution prices depend on trade size, order flow, and market conditions. In this paper, we apply stochastic dynamic programming to derive trading strategies that minimize the expected cost of executing a portfolio of securities over a fixed period of time, i.e., we derive the optimal sequence of trades as a function of prices, quantities, and other market conditions. To illustrate the practical relevance of our methods, we apply them to a hypothetical portfolio of 25 stocks by estimating their price-impact functions using historical trade data from 1996 and deriving the optimal execution strategies. We also perform several Monte Carlo simulation experiments to compare the performance of the optimal strategy to several alternatives.
Trading Volume: Definitions, Data Analysis, and Implications of Portfolio Theory
2000
We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. We begin by showing that a two-fund separation theorem suggests a natural definition for trading volume: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that share turnover satisfies an approximate linear K-factor structure. These implications are empirically tested using weekly turnover data for NYSE and AMEX securities from 1962 to 1996. We find strong evidence against two-fund separation, and an eigenvalue decomposition suggests that volume is driven by a two-factor linear model. See Trading Volume and the MiniCRSP Database: An Introduction and User's Guide for instructions on how to create your own MiniCRSP database.
When Is Time Continuous?
2000
In this paper we study the tracking error resulting from the discrete-time application of continuous-time delta-hedging procedures for European options. We characterize the asymptotic distribution of the tracking error as the number of discrete time periods increases, and its joint distribution with other assets. We introduce the notion of temporal granularity of the continuous-time stochastic model that enables us to characterize the degree to which discrete-time approximations of continuous time models track the payoff of the option. We derive closed form expressions for the granularity for a put and call option on a stock that follows a geometric Brownian motion and a mean-reverting process. These expressions offer insight into the tracking error involved in applying continuous-time delta-hedging in discrete time. We also introduce alternative measures of the tracking error and analyze their properties.
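The core phenomenon the paper studies can be illustrated with a minimal simulation. The sketch below is not the paper's analysis; it assumes a Black-Scholes world (geometric Brownian motion, zero interest rate, hypothetical parameter values) and measures the root-mean-square tracking error of a discretely rebalanced delta hedge for a European call, showing that the error shrinks as the number of rebalancing periods grows.

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, sigma):
    """Black-Scholes call price with zero interest rate."""
    sqrt_t = math.sqrt(T)
    d1 = (math.log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt_t)
    return S * norm_cdf(d1) - K * norm_cdf(d1 - sigma * sqrt_t)

def bs_delta(S, K, T, sigma):
    """Black-Scholes call delta (N(d1)) with zero interest rate."""
    d1 = (math.log(S / K) + 0.5 * sigma**2 * T) / (sigma * math.sqrt(T))
    return norm_cdf(d1)

def hedge_rmse(n_steps, n_paths=2000, seed=1):
    """RMS tracking error of a delta hedge rebalanced n_steps times."""
    S0, K, T, sigma = 100.0, 100.0, 0.25, 0.2  # hypothetical parameters
    rng = random.Random(seed)
    dt = T / n_steps
    c0 = bs_call(S0, K, T, sigma)
    sq_err = 0.0
    for _ in range(n_paths):
        S, V = S0, c0                 # V: hedge-portfolio value
        delta = bs_delta(S, K, T, sigma)
        for i in range(n_steps):
            # GBM step (zero drift under the risk-neutral measure, r = 0)
            S_new = S * math.exp(-0.5 * sigma**2 * dt
                                 + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
            V += delta * (S_new - S)  # P&L from the share position
            S = S_new
            tau = T - (i + 1) * dt
            if tau > 0:
                delta = bs_delta(S, K, tau, sigma)  # rebalance
        sq_err += (V - max(S - K, 0.0)) ** 2        # miss vs. option payoff
    return math.sqrt(sq_err / n_paths)
```

Comparing `hedge_rmse(4)` with `hedge_rmse(64)` shows the tracking error falling roughly like one over the square root of the number of rebalancing periods, which is the rate the paper's asymptotic analysis makes precise.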
Foundations of Technical Analysis: Computational Algorithms, Statistical Inference, and Empirical Implementation
2000
Technical analysis, also known as "charting," has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis—the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution—conditioned on specific technical indicators such as head-and-shoulders or double-bottoms—we find that over the 31-year sample period, several technical indicators do provide incremental information and may have some practical value.
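The basic mechanics of the approach can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: it applies a Nadaraya-Watson kernel regression with a Gaussian kernel to smooth a noisy price series, then identifies local extrema of the smoothed curve, which is the raw material from which chart patterns such as head-and-shoulders are defined. The bandwidth and the synthetic series are arbitrary choices for illustration.

```python
import math

def kernel_smooth(prices, h):
    """Nadaraya-Watson kernel regression of price on time, Gaussian kernel."""
    n = len(prices)
    smoothed = []
    for t in range(n):
        weights = [math.exp(-0.5 * ((t - s) / h) ** 2) for s in range(n)]
        total = sum(weights)
        smoothed.append(sum(w * p for w, p in zip(weights, prices)) / total)
    return smoothed

def local_extrema(xs):
    """Indices of interior local maxima and minima."""
    ext = []
    for i in range(1, len(xs) - 1):
        if xs[i] > xs[i - 1] and xs[i] > xs[i + 1]:
            ext.append((i, "max"))
        elif xs[i] < xs[i - 1] and xs[i] < xs[i + 1]:
            ext.append((i, "min"))
    return ext

# Synthetic "price" series: a slow wave plus rapidly alternating noise.
prices = [math.sin(t / 5.0) + 0.2 * (-1) ** t for t in range(60)]
smoothed = kernel_smooth(prices, h=3.0)
```

On this series the raw prices produce an extremum at almost every point, while the smoothed curve retains only the few turning points of the underlying wave; pattern detection then operates on those surviving extrema.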
A Non-Random Walk Down Wall Street
1999
For over half a century, financial experts have regarded the movements of markets as a random walk: unpredictable meanderings akin to a drunkard's unsteady gait. This hypothesis has become a cornerstone of modern financial economics and many investment strategies. Here Andrew W. Lo and A. Craig MacKinlay put the Random Walk Hypothesis to the test. In this volume, which elegantly integrates their most important articles, Lo and MacKinlay find that markets are not completely random after all, and that predictable components do exist in recent stock and bond returns. Their book provides a state-of-the-art account of the techniques for detecting predictabilities and evaluating their statistical and economic significance, and offers a tantalizing glimpse into the financial technologies of the future.
The Three P’s of Total Risk Management
1999
Current risk-management practices are based on probabilities of extreme dollar losses (e.g., measures like Value at Risk), but these measures capture only part of the story. Any complete risk-management system must address two other important factors: prices and preferences. Together with probabilities, these comprise the three P's of Total Risk Management. This article describes how the three P's interact to determine sensible risk profiles for corporations and for individuals, guidelines for how much risk to bear and how much to hedge. By synthesizing existing research in economics, psychology, and decision sciences, and through an ambitious research agenda to extend this synthesis into other disciplines, a complete and systematic approach to rational decision-making in an uncertain world is within reach.
Frontiers of Finance: Evolution and Efficient Markets
1999
In this review article, we explore several recent advances in the quantitative modeling of financial markets. We begin with the Efficient Markets Hypothesis and describe how this controversial idea has stimulated a number of new directions of research, some focusing on more elaborate mathematical models that are capable of rationalizing the empirical facts, others taking a completely different tack in rejecting rationality altogether. One of the most promising directions is to view financial markets from a biological perspective and, specifically, within an evolutionary framework in which markets, instruments, institutions, and investors interact and evolve dynamically according to the "law" of economic selection. Under this view, financial agents compete and adapt, but they do not necessarily do so in an optimal fashion. Evolutionary and ecological models of financial markets are truly a new frontier whose exploration has just begun.
Trading Volume and the MiniCRSP Database: An Introduction and User’s Guide
1998
This guide provides details on how to access the MiniCRSP database and reports the results of some exploratory data analysis of trading volume. MiniCRSP contains daily as well as weekly-aggregated data derived from the CRSP Stocks daily file. MiniCRSP comprises returns, turnover, and other data items of research interest, at daily and weekly frequencies, stored in a format such that storage space and access times are minimized. A set of access routines is provided to enable the data to be read via either sequential or random access methods on almost any machine platform.
Optimal Control of Execution Costs
1998
We derive dynamic optimal trading strategies that minimize the expected cost of trading a large block of equity over a fixed time horizon. Specifically, given a fixed block S of shares to be executed within a fixed finite number of periods T, and given a price-impact function that yields the execution price of an individual trade as a function of the shares traded and market conditions, we obtain the optimal sequence of trades as a function of market conditions—closed-form expressions in some cases—that minimizes the expected cost of executing S within T periods. Our analysis is extended to the portfolio case in which price impact across stocks can have an important effect on the total cost of trading a portfolio.
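The intuition behind the result can be illustrated with a toy cost calculation. The sketch below is an assumption-laden simplification, not the paper's model: it posits a linear permanent price-impact function (each share traded raises the price by a hypothetical constant `theta`) and compares the expected cost of a few execution schedules. Under this particular assumption, the dynamic program's optimum is the schedule that splits the block evenly across periods.

```python
def expected_impact_cost(schedule, p0=100.0, theta=0.01):
    """Expected cost of executing a buy schedule under linear permanent impact.

    Each trade of n shares moves the price up by theta * n, and the trade
    itself executes at the post-impact price. p0 and theta are hypothetical.
    """
    cum_shares = 0.0
    cost = 0.0
    for n in schedule:
        cum_shares += n
        cost += n * (p0 + theta * cum_shares)
    return cost

# Three ways to buy 100 shares over 4 periods.
equal     = expected_impact_cost([25.0, 25.0, 25.0, 25.0])
frontload = expected_impact_cost([40.0, 30.0, 20.0, 10.0])
immediate = expected_impact_cost([100.0, 0.0, 0.0, 0.0])
```

In this toy setting the even split costs the least and immediate execution the most, because impact costs are convex in trade size; the paper's contribution is solving this trade-off optimally when prices are random, impact depends on market conditions, and cross-stock impact matters in the portfolio case.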
Nonparametric Estimation of State-Price Densities Implicit In Financial Asset Prices
1998
Implicit in the prices of traded financial assets are Arrow-Debreu state prices or, in the continuous-state case, the state-price density [SPD]. We construct an estimator for the SPD implicit in option prices and derive an asymptotic sampling theory for this estimator to gauge its accuracy. The SPD estimator provides an arbitrage-free method of pricing new, more complex, or less liquid securities while capturing those features of the data that are most relevant from an asset-pricing perspective, e.g., negative skewness and excess kurtosis for asset returns, volatility "smiles" for option prices. We perform Monte Carlo simulation experiments to show that the SPD estimator can be successfully extracted from option prices and we present an empirical application using S&P 500 index options.
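The mechanical link between option prices and the SPD is the classic Breeden-Litzenberger identity: the SPD is the discounted second derivative of the call price with respect to the strike, f(K) = e^{rT} ∂²C/∂K². The sketch below is a stylized illustration, not the paper's nonparametric estimator: it uses Black-Scholes prices as stand-ins for observed option prices (parameters are hypothetical) and recovers the implied density by a finite-difference second derivative in the strike.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, T, sigma):
    """Black-Scholes call price, standing in for observed market prices."""
    sqrt_t = math.sqrt(T)
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt_t)
    d2 = d1 - sigma * sqrt_t
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def spd(S, r, T, sigma, K, h=0.5):
    """State-price density at strike K: e^{rT} * d2C/dK2 by finite differences."""
    c = lambda k: bs_call(S, k, r, T, sigma)
    return math.exp(r * T) * (c(K + h) - 2.0 * c(K) + c(K - h)) / h**2

# Recover the density on a strike grid (hypothetical parameters).
S, r, T, sigma = 100.0, 0.05, 1.0, 0.2
strikes = [40.0 + i for i in range(211)]           # K = 40 .. 250
density = [spd(S, r, T, sigma, K) for K in strikes]
```

Because the input prices are Black-Scholes, the recovered density is (approximately) the risk-neutral lognormal; applied to real option prices, the same differentiation-in-strike idea reveals the skewed, fat-tailed densities the paper documents.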
The Econometrics of Financial Markets
1997
The past twenty years have seen an extraordinary growth in the use of quantitative methods in financial markets. Finance professionals now routinely use sophisticated statistical techniques in portfolio management, proprietary trading, risk management, financial consulting, and securities regulation. This graduate-level textbook is intended for PhD students, advanced MBA students, and industry professionals interested in the econometrics of financial modeling. The book covers the entire spectrum of empirical finance, including: the predictability of asset returns, tests of the Random Walk Hypothesis, the microstructure of securities markets, event analysis, the Capital Asset Pricing Model and the Arbitrage Pricing Theory, the term structure of interest rates, dynamic models of economic equilibrium, and nonlinear financial models such as ARCH, neural networks, statistical fractals, and chaos theory.
Market Efficiency: Stock Market Behaviour In Theory and Practice, Volumes I & II
1997
The efficient markets hypothesis is one of the most controversial and hotly contested ideas in all the social sciences. It is disarmingly simple to state, has far-reaching consequences for academic pursuits and business practice, and yet is surprisingly resilient to empirical proof or refutation. Even after three decades of research and literally thousands of journal articles, economists have not yet reached a consensus about whether markets, particularly financial markets, are efficient or not.
A Nonrandom Walk Down Wall Street: Recent Advances in Financial Technology
1997
In the ’50s and ’60s, just as the era of the professional portfolio manager was dawning, financial economists were telling anyone who would listen that active management was probably a big mistake—a waste of time and money. Their research demonstrated that historical prices were of little use in helping to predict where future prices would go. Prices simply took a “random walk.” The better part of wisdom, they advised, was to be a passive investor. At first, not too many of the people who influence the way money is managed (those who select managers of large portfolios) listened. But as time went on, it became apparent that they should have. Because of fees and turnover, the managers they picked typically underperformed the market. And the worse an active manager did relative to a market index, the more attractive seemed the low-cost alternative of buying and holding the index itself. But as luck would have it, just as indexing was gaining ground, a new wave of academic research was being published that weakened some of the results of the earlier research and thereby undercut part of the justification for indexing. It didn’t obviate all the reasons for indexing (indexing was still a low-cost way to create diversification for an entire fund or as part of an active/passive strategy), but it did tend to silence the index-because-you-can’t-do-better school.
Fat Tails, Long Memory, and the Stock Market Since the 1960’s
1997
The practice of risk management starts with an understanding of the statistical behavior of financial asset prices over time. Models such as the random walk hypothesis, the martingale model, and geometric Brownian motion are fundamental to any analysis of financial risks and rewards, particularly for longer investment horizons. Recent empirical evidence has cast doubt on some of these models, and this article provides an overview of such evidence. I begin with a review of the random walk hypothesis and related models, including a discussion of why such models perform so poorly, and then turn to some current research on alternative models such as long-term memory models and stable distributions.
A Non-Random Walk Down Wall Street
1997
While financial economics is still in its infancy when compared to the mathematical and natural sciences, it has enjoyed a spectacular period of growth over the past three decades, thanks in part to the mathematical machinery that Wiener, Ito, and others pioneered. In this review article, I shall present a survey of some recent research in this exciting area—more specifically, in empirical finance and financial econometrics—including a discussion of the random walk hypothesis, long-term memory in stock market prices, performance evaluation, and the statistical estimation of diffusion processes. It is my hope that such a survey will serve both as a tribute to the amazing reach of Norbert Wiener's research, and as an enticement to those in the "hard" sciences to take on some of the challenges of modern finance.