Publications
Computational Challenges in Portfolio Management
2001
The financial industry is one of the fastest-growing areas of scientific computing. Two decades ago, terms such as financial engineering, computational finance, and financial mathematics did not exist in common usage. Today, these areas are distinct and enormously popular academic disciplines with their own journals, conferences, and professional societies. One explanation for this area’s remarkable growth and the impressive array of mathematicians, computer scientists, physicists, and economists who are drawn to it is the formidable intellectual challenges intrinsic to financial markets. Many of the most basic problems in financial analysis are unsolved and surprisingly resilient to the onslaught of researchers from diverse disciplines. In this article, we hope to give a sense of these challenges by describing a relatively simple problem that all investors face when managing a portfolio of financial securities over time. Such a problem becomes more complex once real-world considerations factor into its formulation. We present the basic dynamic portfolio optimization problem and then consider three aspects of it: taxes, investor preferences, and portfolio constraints. These three issues are by no means exhaustive—they merely illustrate examples of the kinds of challenges financial engineers face today. Examples of other computational issues in portfolio optimization appear elsewhere.
Finance: A Selective Survey
2001
Ever since the publication in 1565 of Girolamo Cardano's treatise on gambling, Liber de Ludo Aleae (The Book of Games of Chance), statistics and financial markets have become inextricably linked. Over the past few decades many of these links have become part of the canon of modern finance, and it is now impossible to fully appreciate the workings of financial markets without them. This selective survey covers three of the most important ideas of finance—efficient markets, the random walk hypothesis, and derivative pricing models—that illustrate the enormous research opportunities that lie at the intersection of finance and statistics.
Computational Finance 1999
2000
Computational finance, an exciting new cross-disciplinary research area, draws extensively on the tools and techniques of computer science, statistics, information systems, and financial economics. This book covers the techniques of data mining, knowledge discovery, genetic algorithms, neural networks, bootstrapping, machine learning, and Monte Carlo simulation. These methods are applied to a wide range of problems in finance, including risk management, asset allocation, style analysis, dynamic trading and hedging, forecasting, and option pricing. The book is based on the sixth annual international conference Computational Finance 1999, held at New York University's Stern School of Business.
Nonparametric Risk Management and Implied Risk Aversion
2000
Typical value-at-risk (VAR) calculations involve the probabilities of extreme dollar losses, based on the statistical distributions of market prices. Such quantities do not account for the fact that the same dollar loss can have two very different economic valuations, depending on business conditions. We propose a nonparametric VAR measure that incorporates economic valuation according to the state-price density associated with the underlying price processes. The state-price density yields VAR values that are adjusted for risk aversion, time preferences, and other variations in economic valuation. In the context of a representative agent equilibrium model, we construct an estimator of the risk-aversion coefficient that is implied by the joint observations on option prices and underlying asset value.
Optimal Control of Execution Costs for Portfolios
2000
The dramatic growth in institutionally managed assets, coupled with the advent of internet trading and electronic brokerage for retail investors, has led to a surge in the size and volume of trading. At the same time, competition in the asset management industry has increased to the point where fractions of a percent in performance can separate the top funds from those in the next tier. In this environment, portfolio managers have begun to explore active management of trading costs as a means of boosting returns. Controlling execution cost can be viewed as a stochastic dynamic optimization problem because trading takes time, stock prices exhibit random fluctuations, and execution prices depend on trade size, order flow, and market conditions. In this paper, we apply stochastic dynamic programming to derive trading strategies that minimize the expected cost of executing a portfolio of securities over a fixed period of time, i.e., we derive the optimal sequence of trades as a function of prices, quantities, and other market conditions. To illustrate the practical relevance of our methods, we apply them to a hypothetical portfolio of 25 stocks by estimating their price-impact functions using historical trade data from 1996 and deriving the optimal execution strategies. We also perform several Monte Carlo simulation experiments to compare the performance of the optimal strategy to several alternatives.
Trading Volume: Definitions, Data Analysis, and Implications of Portfolio Theory
2000
We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. We begin by showing that a two-fund separation theorem suggests a natural definition for trading volume: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that share turnover satisfies an approximate linear K-factor structure. These implications are empirically tested using weekly turnover data for NYSE and AMEX securities from 1962 to 1996. We find strong evidence against two-fund separation, and an eigenvalue decomposition suggests that volume is driven by a two-factor linear model. See Trading Volume and the MiniCRSP Database: An Introduction and User’s Guide for instructions on how to create your own MiniCRSP database.
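The (K+1)-fund implication can be illustrated numerically: if turnover is driven by K common factors, the cross-sectional covariance matrix of turnover should have K dominant eigenvalues. A minimal sketch with simulated data (all dimensions and parameter values are hypothetical, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weekly turnover panel for N stocks over T weeks, driven by
# two common factors plus a little idiosyncratic noise (illustrative only).
T, N = 104, 5
factors = rng.normal(size=(T, 2))
loadings = rng.uniform(0.5, 1.5, size=(2, N))
turnover = 0.01 + 0.002 * (factors @ loadings) + 0.0002 * rng.normal(size=(T, N))

# An approximate K-factor structure shows up as K dominant eigenvalues of
# the cross-sectional covariance matrix of turnover.
cov = np.cov(turnover, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]          # largest first
explained = eigvals / eigvals.sum()
print("variance explained by top 2 eigenvalues:", round(float(explained[:2].sum()), 3))
```

With real turnover data the picture is noisier, but the same eigenvalue diagnostic is what motivates the two-factor finding above.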
When Is Time Continuous?
2000
In this paper we study the tracking error resulting from the discrete-time application of continuous-time delta-hedging procedures for European options. We characterize the asymptotic distribution of the tracking error as the number of discrete time periods increases, and its joint distribution with other assets. We introduce the notion of temporal granularity of the continuous-time stochastic model that enables us to characterize the degree to which discrete-time approximations of continuous-time models track the payoff of the option. We derive closed-form expressions for the granularity for a put and call option on a stock that follows a geometric Brownian motion and a mean-reverting process. These expressions offer insight into the tracking error involved in applying continuous-time delta-hedging in discrete time. We also introduce alternative measures of the tracking error and analyze their properties.
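The tracking error in question is easy to reproduce by Monte Carlo. The sketch below assumes Black-Scholes dynamics, simulates under the risk-neutral measure for simplicity, and uses illustrative parameters; it rebalances a delta hedge at a finite number of dates and measures the dispersion of the terminal hedging error:

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, tau):
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * sqrt(tau))
    d2 = d1 - sigma * sqrt(tau)
    return S * norm_cdf(d1) - K * exp(-r * tau) * norm_cdf(d2)

def bs_delta(S, K, r, sigma, tau):
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * sqrt(tau))
    return norm_cdf(d1)

def tracking_error_std(n_steps, n_paths=1000, S0=100.0, K=100.0,
                       r=0.05, sigma=0.2, T=0.25, seed=0):
    """Std. dev. of (hedge portfolio value - option payoff) at expiry when
    the Black-Scholes delta hedge is rebalanced only n_steps times."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, S0)
    cash = np.full(n_paths, bs_call(S0, K, r, sigma, T))  # start with premium
    delta = np.zeros(n_paths)
    for i in range(n_steps):
        tau = T - i * dt
        new_delta = np.array([bs_delta(s, K, r, sigma, tau) for s in S])
        cash -= (new_delta - delta) * S          # rebalance the stock position
        delta = new_delta
        cash *= exp(r * dt)                      # cash accrues interest
        S = S * np.exp((r - 0.5 * sigma ** 2) * dt +
                       sigma * sqrt(dt) * rng.normal(size=n_paths))
    payoff = np.maximum(S - K, 0.0)
    return float(np.std(delta * S + cash - payoff))
```

Comparing, say, `tracking_error_std(8)` with `tracking_error_std(64)` shows the error shrinking roughly like 1/√N in the number of rebalancing dates, in line with the asymptotics the paper characterizes.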
Foundations of Technical Analysis: Computational Algorithms, Statistical Inference, and Empirical Implementation
2000
Technical analysis, also known as "charting," has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis—the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution—conditioned on specific technical indicators such as head-and-shoulders or double-bottoms—we find that over the 31-year sample period, several technical indicators do provide incremental information and may have some practical value.
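The kernel-regression step can be sketched in a few lines: smooth the price series with a Nadaraya-Watson estimator and read candidate pattern vertices off the local extrema of the smoothed curve. The data below are simulated and the bandwidth is fixed by hand for illustration (the paper selects it from the data):

```python
import numpy as np

def kernel_smooth(t, y, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel:
    m(t_i) = sum_j K((t_i - t_j)/h) * y_j / sum_j K((t_i - t_j)/h)."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

# Illustrative noisy price series; patterns are read off the smoothed
# curve rather than the raw prices.
rng = np.random.default_rng(1)
t = np.arange(60.0)
prices = 100.0 + 5.0 * np.sin(t / 8.0) + rng.normal(0.0, 1.0, size=60)
smooth = kernel_smooth(t, prices, h=3.0)

# Local extrema of the smooth are the candidate vertices of chart
# patterns (e.g., the five alternating extrema of a head-and-shoulders).
extrema = [i for i in range(1, 59)
           if (smooth[i] - smooth[i - 1]) * (smooth[i + 1] - smooth[i]) < 0]
print("number of smoothed extrema:", len(extrema))
```

The smoothing turns the subjective "shape in the chart" into an objective sequence of extrema that can be matched against pattern templates.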
A Non-Random Walk Down Wall Street
1999
For over half a century, financial experts have regarded the movements of markets as a random walk--unpredictable meanderings akin to a drunkard's unsteady gait--and this hypothesis has become a cornerstone of modern financial economics and many investment strategies. Here Andrew W. Lo and A. Craig MacKinlay put the Random Walk Hypothesis to the test. In this volume, which elegantly integrates their most important articles, Lo and MacKinlay find that markets are not completely random after all, and that predictable components do exist in recent stock and bond returns. Their book provides a state-of-the-art account of the techniques for detecting predictabilities and evaluating their statistical and economic significance, and offers a tantalizing glimpse into the financial technologies of the future.
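The workhorse technique behind these tests is the variance ratio: under a random walk, the variance of q-period returns is q times the variance of one-period returns. A minimal version of the statistic (without the bias corrections and heteroskedasticity-robust standard errors developed in the book) looks like this:

```python
import numpy as np

def variance_ratio(returns, q):
    """Variance ratio VR(q): variance of overlapping q-period returns
    divided by q times the one-period variance. Close to 1 for a random
    walk; > 1 indicates positive serial correlation, < 1 mean reversion."""
    r = np.asarray(returns, dtype=float)
    mu = r.mean()
    var1 = np.mean((r - mu) ** 2)
    rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-period sums
    varq = np.mean((rq - q * mu) ** 2)
    return float(varq / (q * var1))

rng = np.random.default_rng(2)
iid = rng.normal(0.0, 0.01, size=5000)              # random-walk increments
persistent = np.empty(5000)                         # AR(1) increments, phi = 0.3
persistent[0] = 0.0
for i in range(1, 5000):
    persistent[i] = 0.3 * persistent[i - 1] + rng.normal(0.0, 0.01)

print("VR(4), i.i.d.:    ", round(variance_ratio(iid, 4), 2))
print("VR(4), persistent:", round(variance_ratio(persistent, 4), 2))
```

On simulated random-walk increments the ratio hovers near 1; on the persistent series it is well above 1, which is the kind of signature Lo and MacKinlay detect in weekly stock returns.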
The Three P’s of Total Risk Management
1999
Current risk-management practices are based on probabilities of extreme dollar losses (e.g., measures like Value at Risk), but these measures capture only part of the story. Any complete risk-management system must address two other important factors: prices and preferences. Together with probabilities, these comprise the three P's of Total Risk Management. This article describes how the three P's interact to determine sensible risk profiles for corporations and for individuals, guidelines for how much risk to bear and how much to hedge. By synthesizing existing research in economics, psychology, and decision sciences, and through an ambitious research agenda to extend this synthesis into other disciplines, a complete and systematic approach to rational decision-making in an uncertain world is within reach.
Frontiers of Finance: Evolution and Efficient Markets
1999
In this review article, we explore several recent advances in the quantitative modeling of financial markets. We begin with the Efficient Markets Hypothesis and describe how this controversial idea has stimulated a number of new directions of research, some focusing on more elaborate mathematical models that are capable of rationalizing the empirical facts, others taking a completely different tack in rejecting rationality altogether. One of the most promising directions is to view financial markets from a biological perspective and, specifically, within an evolutionary framework in which markets, instruments, institutions, and investors interact and evolve dynamically according to the "law" of economic selection. Under this view, financial agents compete and adapt, but they do not necessarily do so in an optimal fashion. Evolutionary and ecological models of financial markets are truly a new frontier whose exploration has just begun.
Trading Volume and the MiniCRSP Database: An Introduction and User’s Guide
1998
This guide provides details on how to access the MiniCRSP database and reports the results of some exploratory data analysis of trading volume. MiniCRSP contains daily as well as weekly-aggregated data derived from the CRSP Stocks daily file. MiniCRSP comprises returns, turnover, and other data items of research interest, at daily and weekly frequencies, stored in a format such that storage space and access times are minimized. A set of access routines is provided to enable the data to be read via either sequential or random access methods on almost any machine platform.
Optimal Control of Execution Costs
1998
We derive dynamic optimal trading strategies that minimize the expected cost of trading a large block of equity over a fixed time horizon. Specifically, given a fixed block S of shares to be executed within a fixed finite number of periods T, and given a price-impact function that yields the execution price of an individual trade as a function of the shares traded and market conditions, we obtain the optimal sequence of trades as a function of market conditions—closed-form expressions in some cases—that minimizes the expected cost of executing S within T periods. Our analysis is extended to the portfolio case in which price impact across stocks can have an important effect on the total cost of trading a portfolio.
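The closed-form results come from stochastic dynamic programming. As a deliberately simplified illustration, the toy dynamic program below works on an integer share grid with a hypothetical linear permanent-impact model and no randomness in the impact (so expected cost equals realized cost); in this special case the well-known benchmark result is that the naive equal split of S across the T periods is optimal, and it is information variables such as predictable price moves that push the optimum away from equal splitting:

```python
def optimal_schedule(S, T, theta, p0):
    """Toy backward dynamic program for best execution: buy S shares in T
    periods when each trade of s shares permanently moves the price by
    theta * s and executes at the post-impact price. cost_to_go[w] is the
    minimal cost of acquiring the remaining w shares."""
    INF = float("inf")
    cost_to_go = [0.0] + [INF] * S            # after the last period
    choice = [[0] * (S + 1) for _ in range(T)]
    for t in range(T - 1, -1, -1):
        new = [0.0] + [INF] * S
        for w in range(1, S + 1):             # shares still to buy
            for s in range(w + 1):            # shares to buy this period
                price = p0 + theta * (S - w + s)   # impact of all buys so far
                c = s * price + cost_to_go[w - s]
                if c < new[w]:
                    new[w], choice[t][w] = c, s
        cost_to_go = new
    schedule, w = [], S                       # recover the optimal trades
    for t in range(T):
        s = choice[t][w]
        schedule.append(s)
        w -= s
    return schedule, cost_to_go[S]

schedule, cost = optimal_schedule(S=10, T=5, theta=0.1, p0=100.0)
print(schedule, cost)   # purely permanent linear impact: equal split wins
```

The names `theta` and `p0` are illustrative, and the O(T·S²) grid search stands in for the paper's analytical solution; it is only meant to make the structure of the optimization concrete.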
Nonparametric Estimation of State-Price Densities Implicit In Financial Asset Prices
1998
Implicit in the prices of traded financial assets are Arrow-Debreu state prices or, in the continuous-state case, the state-price density [SPD]. We construct an estimator for the SPD implicit in option prices and derive an asymptotic sampling theory for this estimator to gauge its accuracy. The SPD estimator provides an arbitrage-free method of pricing new, more complex, or less liquid securities while capturing those features of the data that are most relevant from an asset-pricing perspective, e.g., negative skewness and excess kurtosis for asset returns, volatility "smiles" for option prices. We perform Monte Carlo simulation experiments to show that the SPD estimator can be successfully extracted from option prices and we present an empirical application using S&P 500 index options.
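The link between option prices and the SPD is the Breeden-Litzenberger relation, SPD(K) = e^{rT} ∂²C/∂K², where C(K) is the call-price curve. The sketch below applies the second-derivative step to synthetic Black-Scholes prices; the paper's estimator instead fits C(K) nonparametrically to observed market prices, but the differencing step is the same:

```python
import numpy as np
from math import log, sqrt, exp, erf

def ncdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes call price, used here only to generate a smooth
    'observed' call-price curve on a grid of strikes."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * ncdf(d1) - K * exp(-r * T) * ncdf(d2)

# Breeden-Litzenberger: SPD(K) = e^{rT} * d^2 C / dK^2, approximated by
# central second differences on a strike grid (parameters illustrative).
S0, r, sigma, T = 100.0, 0.05, 0.2, 0.5
dK = 1.0
K = np.arange(60.0, 160.0 + dK, dK)
C = np.array([bs_call(S0, k, r, sigma, T) for k in K])
spd = exp(r * T) * (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dK ** 2  # density at K[1:-1]

print("probability mass on the grid:", round(float(spd.sum() * dK), 3))
```

The recovered curve is a genuine density (nonnegative, integrating to roughly one over the strike range), which is what makes SPD-based pricing of other payoffs arbitrage-free.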
The Econometrics of Financial Markets
1997
The past twenty years have seen an extraordinary growth in the use of quantitative methods in financial markets. Finance professionals now routinely use sophisticated statistical techniques in portfolio management, proprietary trading, risk management, financial consulting, and securities regulation. This graduate-level textbook is intended for PhD students, advanced MBA students, and industry professionals interested in the econometrics of financial modeling. The book covers the entire spectrum of empirical finance, including: the predictability of asset returns, tests of the Random Walk Hypothesis, the microstructure of securities markets, event analysis, the Capital Asset Pricing Model and the Arbitrage Pricing Theory, the term structure of interest rates, dynamic models of economic equilibrium, and nonlinear financial models such as ARCH, neural networks, statistical fractals, and chaos theory.