Trading Volume: Definitions, Data Analysis, and Implications of Portfolio Theory (2000)
We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. We begin by showing that a two-fund separation theorem suggests a natural definition for trading volume: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that share turnover satisfies an approximate linear K-factor structure. These implications are empirically tested using weekly turnover data for NYSE and AMEX securities from 1962 to 1996. We find strong evidence against two-fund separation, and an eigenvalue decomposition suggests that volume is driven by a two-factor linear model.
For instructions on how to create your own MiniCRSP database, please see Trading Volume and the MiniCRSP Database: An Introduction and User’s Guide.
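As a minimal illustration of the definition above (using made-up share counts, not MiniCRSP data): share turnover is shares traded divided by shares outstanding, and exact two-fund separation would force it to be identical across securities, so its cross-sectional dispersion would be zero.

```python
import numpy as np

# Hypothetical weekly figures for four securities (illustrative numbers only).
shares_traded = np.array([1_200_000, 450_000, 3_000_000, 90_000], dtype=float)
shares_outstanding = np.array([60_000_000, 22_500_000, 150_000_000, 4_500_000], dtype=float)

# Share turnover: the fraction of a firm's outstanding shares traded in the period.
turnover = shares_traded / shares_outstanding

# Under exact two-fund separation, turnover is the same for every security,
# so the cross-sectional standard deviation should be zero.
dispersion = turnover.std()
print(turnover, dispersion)
```

In real data the dispersion is far from zero, which is the sense in which the paper rejects two-fund separation.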
When Is Time Continuous? (2000)
In this paper we study the tracking error resulting from the discrete-time application of continuous-time delta-hedging procedures for European options. We characterize the asymptotic distribution of the tracking error as the number of discrete time periods increases, and its joint distribution with other assets. We introduce the notion of temporal granularity of the continuous-time stochastic model that enables us to characterize the degree to which discrete-time approximations of continuous-time models track the payoff of the option. We derive closed-form expressions for the granularity for a put and call option on a stock that follows a geometric Brownian motion and a mean-reverting process. These expressions offer insight into the tracking error involved in applying continuous-time delta-hedging in discrete time. We also introduce alternative measures of the tracking error and analyze their properties.
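The flavor of the discretization question can be seen in a small Monte Carlo sketch (a simplified setting with zero interest rate and illustrative parameters, not the paper's granularity calculations): a Black-Scholes delta hedge of a call, rebalanced at n discrete dates, produces a terminal tracking error whose dispersion shrinks as n grows.

```python
import numpy as np
from scipy.stats import norm

def bs_call_price(S, K, T, sigma):
    """Black-Scholes price of a European call, zero interest rate."""
    d1 = (np.log(S / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    return S * norm.cdf(d1) - K * norm.cdf(d1 - sigma * np.sqrt(T))

def bs_call_delta(S, K, T, sigma):
    """Black-Scholes hedge ratio (delta) of the call."""
    d1 = (np.log(S / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    return norm.cdf(d1)

def hedge_error(n_steps, n_paths=20000, S0=100.0, K=100.0, T=0.25,
                sigma=0.2, seed=0):
    """Terminal tracking error of a discretely rebalanced delta hedge:
    premium plus trading gains, minus the option payoff."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, S0)
    delta = np.full(n_paths, bs_call_delta(S0, K, T, sigma))
    cash = bs_call_price(S0, K, T, sigma) - delta * S  # premium funds the hedge
    for i in range(1, n_steps + 1):
        z = rng.standard_normal(n_paths)
        S = S * np.exp(-0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * z)
        if i < n_steps:
            new_delta = bs_call_delta(S, K, T - i * dt, sigma)
            cash += (delta - new_delta) * S            # rebalance at the new price
            delta = new_delta
    return cash + delta * S - np.maximum(S - K, 0.0)
```

Comparing, say, 13 versus 52 rebalancing dates over the quarter shows the error is centered near zero but noticeably tighter with finer rebalancing, which is the effect the asymptotic theory makes precise.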
Foundations of Technical Analysis: Computational Algorithms, Statistical Inference, and Empirical Implementation (2000)
Technical analysis, also known as "charting," has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis—the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution—conditioned on specific technical indicators such as head-and-shoulders or double-bottoms—we find that over the 31-year sample period, several technical indicators do provide incremental information and may have some practical value.
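The smoothing step behind such automated pattern detection can be sketched as follows (a generic Nadaraya-Watson smoother with an arbitrary bandwidth, not the paper's data-driven bandwidth selection): patterns such as head-and-shoulders are defined by the sequence of local extrema of the smoothed price curve.

```python
import numpy as np

def kernel_smooth(prices, bandwidth):
    """Nadaraya-Watson kernel regression of price on time, Gaussian kernel."""
    t = np.arange(len(prices), dtype=float)
    u = (t[:, None] - t[None, :]) / bandwidth
    w = np.exp(-0.5 * u**2)                 # Gaussian kernel weights
    return (w @ np.asarray(prices, dtype=float)) / w.sum(axis=1)

def local_extrema(smooth):
    """Indices where the smoothed curve changes direction (pattern vertices)."""
    d = np.sign(np.diff(smooth))
    return np.where(d[:-1] != d[1:])[0] + 1
```

On a smooth two-cycle price path the procedure recovers the four turning points; a pattern-recognition rule then classifies sequences of such extrema into named chart patterns.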
A Non-Random Walk Down Wall Street (1999)
For over half a century, financial experts have regarded the movements of markets as a random walk--unpredictable meanderings akin to a drunkard's unsteady gait--and this hypothesis has become a cornerstone of modern financial economics and many investment strategies. Here Andrew W. Lo and A. Craig MacKinlay put the Random Walk Hypothesis to the test. In this volume, which elegantly integrates their most important articles, Lo and MacKinlay find that markets are not completely random after all, and that predictable components do exist in recent stock and bond returns. Their book provides a state-of-the-art account of the techniques for detecting predictabilities and evaluating their statistical and economic significance, and offers a tantalizing glimpse into the financial technologies of the future.
The Three P’s of Total Risk Management (1999)
Current risk-management practices are based on probabilities of extreme dollar losses (e.g., measures like Value at Risk), but these measures capture only part of the story. Any complete risk-management system must address two other important factors: prices and preferences. Together with probabilities, these comprise the three P's of Total Risk Management. This article describes how the three P's interact to determine sensible risk profiles for corporations and for individuals, guidelines for how much risk to bear and how much to hedge. By synthesizing existing research in economics, psychology, and decision sciences, and through an ambitious research agenda to extend this synthesis into other disciplines, a complete and systematic approach to rational decision-making in an uncertain world is within reach.
Frontiers of Finance: Evolution and Efficient Markets (1999)
In this review article, we explore several recent advances in the quantitative modeling of financial markets. We begin with the Efficient Markets Hypothesis and describe how this controversial idea has stimulated a number of new directions of research, some focusing on more elaborate mathematical models that are capable of rationalizing the empirical facts, others taking a completely different tack in rejecting rationality altogether. One of the most promising directions is to view financial markets from a biological perspective and, specifically, within an evolutionary framework in which markets, instruments, institutions, and investors interact and evolve dynamically according to the "law" of economic selection. Under this view, financial agents compete and adapt, but they do not necessarily do so in an optimal fashion. Evolutionary and ecological models of financial markets constitute a truly new frontier whose exploration has just begun.
Optimal Control of Execution Costs (1998)
We derive dynamic optimal trading strategies that minimize the expected cost of trading a large block of equity over a fixed time horizon. Specifically, given a fixed block S of shares to be executed within a fixed finite number of periods T, and given a price-impact function that yields the execution price of an individual trade as a function of the shares traded and market conditions, we obtain the optimal sequence of trades as a function of market conditions—closed-form expressions in some cases—that minimizes the expected cost of executing S within T periods. Our analysis is extended to the portfolio case in which price impact across stocks can have an important effect on the total cost of trading a portfolio.
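The structure of the problem can be illustrated under the simplest price-impact assumption (linear permanent impact with zero-mean noise, and made-up parameter values): expected cost depends on the schedule only through a quadratic form, and the equal-split schedule minimizes it.

```python
import numpy as np

def expected_cost(schedule, theta=5e-6, p0=50.0):
    """Expected cost of a trade schedule when each trade of s shares moves the
    price permanently by theta * s (linear impact; noise has zero mean):
    E[cost] = p0 * sum(s) + theta * sum_t s_t * (cumulative shares through t)."""
    s = np.asarray(schedule, dtype=float)
    return p0 * s.sum() + theta * np.dot(s, np.cumsum(s))

equal = [20_000] * 5                              # equal split of S = 100,000 over T = 5
front = [60_000, 10_000, 10_000, 10_000, 10_000]  # front-loaded alternative
print(expected_cost(equal), expected_cost(front))
```

In this special case the naive equal split S/T is optimal; the paper's contribution is the dynamic-programming solution for richer impact functions and serially correlated market conditions, where the optimal schedule departs from the equal split.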
Nonparametric Estimation of State-Price Densities Implicit in Financial Asset Prices (1998)
Implicit in the prices of traded financial assets are Arrow-Debreu state prices or, in the continuous-state case, the state-price density (SPD). We construct an estimator for the SPD implicit in option prices and derive an asymptotic sampling theory for this estimator to gauge its accuracy. The SPD estimator provides an arbitrage-free method of pricing new, more complex, or less liquid securities while capturing those features of the data that are most relevant from an asset-pricing perspective, e.g., negative skewness and excess kurtosis for asset returns, volatility "smiles" for option prices. We perform Monte Carlo simulation experiments to show that the SPD estimator can be successfully extracted from option prices and we present an empirical application using S&P 500 index options.
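The relation underlying such estimators is the Breeden-Litzenberger result: the SPD is the discounted second derivative of the call-pricing function with respect to the strike. A finite-difference sketch on Black-Scholes prices (illustrative parameters; the paper instead estimates the call function nonparametrically before differentiating):

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call prices across an array of strikes K."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Illustrative market parameters (not fitted to any data).
S0, T, r, sigma = 100.0, 0.5, 0.02, 0.2
K = np.linspace(60.0, 160.0, 401)
C = bs_call(S0, K, T, r, sigma)

# SPD f(K) = e^{rT} * d^2 C / dK^2, taken here by finite differences.
dK = K[1] - K[0]
spd = np.exp(r * T) * np.gradient(np.gradient(C, dK), dK)
```

The recovered density is nonnegative and integrates to one over the strike grid, which is what makes SPD-based pricing of other payoffs arbitrage-free.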
The Econometrics of Financial Markets (1997)
The past twenty years have seen an extraordinary growth in the use of quantitative methods in financial markets. Finance professionals now routinely use sophisticated statistical techniques in portfolio management, proprietary trading, risk management, financial consulting, and securities regulation. This graduate-level textbook is intended for PhD students, advanced MBA students, and industry professionals interested in the econometrics of financial modeling. The book covers the entire spectrum of empirical finance, including: the predictability of asset returns, tests of the Random Walk Hypothesis, the microstructure of securities markets, event analysis, the Capital Asset Pricing Model and the Arbitrage Pricing Theory, the term structure of interest rates, dynamic models of economic equilibrium, and nonlinear financial models such as ARCH, neural networks, statistical fractals, and chaos theory.
Market Efficiency: Stock Market Behaviour in Theory and Practice, Volumes I & II (1997)
The efficient markets hypothesis is one of the most controversial and hotly contested ideas in all the social sciences. It is disarmingly simple to state, has far-reaching consequences for academic pursuits and business practice, and yet is surprisingly resilient to empirical proof or refutation. Even after three decades of research and literally thousands of journal articles, economists have not yet reached a consensus about whether markets - particularly financial markets - are efficient or not.
A Non-Random Walk Down Wall Street: Recent Advances in Financial Technology (1997)
In the ’50s and ’60s, just as the era of the professional portfolio manager was dawning, financial economists were telling anyone who would listen that active management was probably a big mistake—a waste of time and money. Their research demonstrated that historical prices were of little use in helping to predict where future prices would go. Prices simply took a “random walk.” The better part of wisdom, they advised, was to be a passive investor. At first, not too many of the people who influence the way money is managed (those who select managers of large portfolios) listened. But as time went on, it became apparent that they should have. Because of fees and turnover, the managers they picked typically underperformed the market. And the worse an active manager did relative to a market index, the more attractive seemed the low-cost alternative of buying and holding the index itself. But as luck would have it, just as indexing was gaining ground, a new wave of academic research was being published that weakened some of the results of the earlier research and thereby undercut part of the justification for indexing. It didn’t obviate all the reasons for indexing (indexing was still a low-cost way to create diversification for an entire fund or as part of an active/passive strategy), but it did tend to silence the index-because-you-can’t-do-better school.
Fat Tails, Long Memory, and the Stock Market Since the 1960’s (1997)
The practice of risk management starts with an understanding of the statistical behavior of financial asset prices over time. Models such as the random walk hypothesis, the martingale model, and geometric Brownian motion are fundamental to any analysis of financial risks and rewards, particularly for longer investment horizons. Recent empirical evidence has cast doubt on some of these models, and this article provides an overview of such evidence. I begin with a review of the random walk hypothesis and related models, including a discussion of why such models perform so poorly, and then turn to some current research on alternative models such as long-term memory models and stable distributions.
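One classical diagnostic in the long-memory literature is the rescaled-range (R/S) statistic, whose growth rate across sample sizes estimates the Hurst exponent — about 0.5 for short-memory (random-walk-like) series, above 0.5 under long-term memory. A simplified sketch (the classical statistic, not the modified versions developed in later research):

```python
import numpy as np

def rescaled_range(x):
    """Classical R/S statistic: range of cumulative deviations, scaled by the
    sample standard deviation."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())          # partial sums of deviations from the mean
    return (y.max() - y.min()) / x.std(ddof=0)

def hurst_exponent(x, min_chunk=16):
    """Slope of log mean R/S against log chunk size, over dyadic chunk sizes."""
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        chunks = [x[i:i + size] for i in range(0, n - size + 1, size)]
        rs.append(np.mean([rescaled_range(c) for c in chunks]))
        sizes.append(size)
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```

For an i.i.d. Gaussian series the estimated exponent sits near one-half (with a known small-sample upward bias), and it is departures from this benchmark that motivate long-memory models of returns.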
A Non-Random Walk Down Wall Street (1997)
While financial economics is still in its infancy when compared to the mathematical and natural sciences, it has enjoyed a spectacular period of growth over the past three decades, thanks in part to the mathematical machinery that Wiener, Ito, and others pioneered. In this review article, I shall present a survey of some recent research in this exciting area—more specifically, in empirical finance and financial econometrics—including a discussion of the random walk hypothesis, long-term memory in stock market prices, performance evaluation, and the statistical estimation of diffusion processes. It is my hope that such a survey will serve both as a tribute to the amazing reach of Norbert Wiener's research, and as an enticement to those in the "hard" sciences to take on some of the challenges of modern finance.
Maximizing Predictability in the Stock and Bond Markets (1997)
We construct portfolios of stocks and of bonds that are maximally predictable with respect to a set of ex ante observable economic variables, and show that these levels of predictability are statistically significant, even after controlling for data-snooping biases. We disaggregate the sources for predictability by using several asset groups—sector portfolios, market-capitalization portfolios, and stock/bond/utility portfolios—and find that the sources of maximal predictability shift considerably across asset classes and sectors as the return-horizon changes. Using three out-of-sample measures of predictability—forecast errors, Merton's market-timing measure, and the profitability of asset allocation strategies based on maximizing predictability—we show that the predictability of the maximally predictable portfolio is genuine and economically significant.
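The construction can be sketched on synthetic data (a single hypothetical predictor and made-up factor loadings): regress returns on the lagged predictor, then solve a generalized eigenvalue problem in which the top eigenvalue is the maximally predictable portfolio's in-sample R-squared.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
T_obs, N = 1000, 5
Z = rng.standard_normal((T_obs, 1))            # hypothetical lagged predictor values
B = np.array([[0.4, 0.1, 0.0, -0.2, 0.05]])    # assumed loadings (illustrative)
R = Z @ B + rng.standard_normal((T_obs, N))    # next-period returns of N assets

coef, *_ = np.linalg.lstsq(Z, R, rcond=None)   # fit each asset's return on Z
R_hat = Z @ coef                               # predictable component of returns

A = np.cov(R_hat.T)                            # covariance of fitted values
C = np.cov(R.T)                                # total return covariance
# Maximize w'Aw / w'Cw: generalized eigenproblem A w = lambda C w, where the
# top eigenvalue lambda is the portfolio's R-squared.
vals, vecs = eigh(A, C)
w = vecs[:, -1]                                # maximally predictable portfolio
r2 = vals[-1]
```

By construction the resulting R-squared is at least as large as that of any individual asset, which is why a portfolio-level search can uncover predictability that single-asset regressions understate.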
The Industrial Organization and Regulation of the Securities Industry (1996)
The regulation of financial markets has for years been the domain of lawyers, legislators, and lobbyists. In this unique volume, experts in industrial organization, finance, and law, as well as members of regulatory agencies and the securities industry, examine the securities industry from an economic viewpoint.
Ten original essays address topics including electronic trading and the "virtual" stock exchange; trading costs and liquidity on the London and Tokyo Stock Exchanges and in the German and Japanese government bond markets; international coordination among regulatory agencies; and the impact of changing margin requirements on stock prices, volatility, and liquidity.
This clear presentation of groundbreaking research will appeal to economists, lawyers, and legislators who seek a refreshingly new perspective on policy issues in the securities industry.