It’s 11pm—Do You Know Where Your Liquidity Is? The Mean-Variance-Liquidity Frontier
with Constantin Petrov and Martin Wierzbicki, Journal of Investment Management 1 (2003), 55–93.
We introduce liquidity into the standard mean-variance portfolio optimization framework by defining several measures of liquidity and then constructing three-dimensional mean-variance-liquidity frontiers in three ways—liquidity filtering, liquidity constraints, and a mean-variance-liquidity objective function. We show that portfolios close to each other on the traditional mean-variance efficient frontier can differ substantially in their liquidity characteristics. In a simple empirical example, the liquidity exposure of mean-variance efficient portfolios changes dramatically from month to month, and even simple forms of liquidity optimization can yield significant benefits in reducing a portfolio's liquidity-risk exposure without sacrificing a great deal of expected return per unit risk.
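The "liquidity constraints" construction can be sketched in a few lines: maximize the usual mean-variance objective subject to a floor on the portfolio's weighted-average liquidity score. All numbers below (expected returns, covariances, liquidity scores) are hypothetical, and the paper's own liquidity measures and data are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data for three assets: expected returns, covariance matrix,
# and a normalized liquidity score in [0, 1] (e.g., turnover-based).
mu = np.array([0.08, 0.12, 0.10])
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])
liq = np.array([0.9, 0.3, 0.6])

def mv_with_liquidity(gamma, ell_min):
    """Maximize mu'w - (gamma/2) w'Sigma w  subject to  liq'w >= ell_min,
    sum(w) = 1, w >= 0 -- the 'liquidity constraint' approach."""
    obj = lambda w: -(mu @ w - 0.5 * gamma * w @ sigma @ w)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: liq @ w - ell_min}]
    res = minimize(obj, np.ones(3) / 3, constraints=cons,
                   bounds=[(0.0, 1.0)] * 3)
    return res.x

# Tightening the liquidity floor trades expected return for liquidity,
# tracing out one slice of the mean-variance-liquidity frontier.
for ell in (0.0, 0.5, 0.8):
    w = mv_with_liquidity(gamma=4.0, ell_min=ell)
    print(ell, w.round(3), round(float(liq @ w), 3))
```

Sweeping the floor `ell_min` (and the risk-aversion parameter `gamma`) traces the three-dimensional frontier described in the abstract.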
Bubble, Rubble, Finance In Trouble?
Journal of Psychology and Financial Markets 3 (2002), 76–86.
In this talk, I review the implications of the recent rise and fall of the technology sector for traditional financial theories and their behavioral alternatives. Although critics of the Efficient Markets Hypothesis argue that markets are driven by fear and greed, not fundamentals, recent research in the cognitive neurosciences suggests that these two perspectives are opposite sides of the same coin. I propose a new paradigm for financial economics that focuses more on the evolutionary biology and ecology of markets than on the traditional physicist's view. By marrying the principles of evolution to Herbert Simon's notion of "satisficing," I argue that much of what behavioralists cite as counter-examples to economic rationality—loss aversion, overconfidence, overreaction, mental accounting, and other behavioral biases—is, in fact, consistent with an evolutionary model of rational agents learning to adapt to their environment via satisficing heuristics.
The Psychophysiology of Real-Time Financial Risk Processing
with Dmitry V. Repin, Journal of Cognitive Neuroscience 14 (2002), 323–339.
A longstanding controversy in economics and finance is whether financial markets are governed by rational forces or by emotional responses. We study the importance of emotion in the decision-making process of professional securities traders by measuring their physiological characteristics, such as skin conductance and blood volume pulse, during live trading sessions while simultaneously capturing real-time prices from which market events can be detected. In a sample of 10 traders, we find significant correlations between electrodermal responses and transient market events, and between changes in cardiovascular variables and market volatility. We also observe differences in these correlations among the 10 traders, which may be systematically related to the traders' levels of experience.
Econometric Models of Limit-Order Executions
with Craig MacKinlay and June Zhang, Journal of Financial Economics 65 (2002), 31–71.
Limit orders incur no price impact; their execution times, however, are uncertain. We develop an econometric model of limit-order execution times using survival analysis, and estimate it with actual limit-order data. We estimate versions for time-to-first-fill and time-to-completion, and for limit-sells and limit-buys, and incorporate the effects of explanatory variables such as the limit price, the limit size, the bid/offer spread, and market volatility. We find that execution times are very sensitive to limit price and several other explanatory variables, but not sensitive to limit size. We also show that hypothetical limit-order executions, constructed either theoretically from first-passage times or empirically from transactions data, are very poor proxies for actual limit-order executions.
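The survival-analysis viewpoint can be illustrated with a Kaplan-Meier product-limit estimator on simulated fill times, treating cancelled orders as right-censored observations. This is only a sketch: the paper estimates richer parametric models with covariates (limit price, size, spread, volatility), and all data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated limit-order data (hypothetical): true time-to-first-fill is
# exponential; orders still unfilled at cancellation are right-censored.
n = 500
fill_time = rng.exponential(scale=30.0, size=n)    # minutes
cancel_time = rng.uniform(5.0, 60.0, size=n)
time = np.minimum(fill_time, cancel_time)          # observed duration
filled = fill_time <= cancel_time                  # event indicator

def kaplan_meier(time, event):
    """Product-limit estimate of S(t) = P(execution time > t); at each
    observed fill the survival curve drops by the factor (1 - 1/at_risk)."""
    order = np.argsort(time)
    t, e = time[order], event[order]
    at_risk = np.arange(len(t), 0, -1)
    surv, s = [], 1.0
    for i in range(len(t)):
        if e[i]:
            s *= 1.0 - 1.0 / at_risk[i]
        surv.append(s)
    return t, np.array(surv)

t, s = kaplan_meier(time, filled)
# Median time-to-fill: first time the survival curve drops to 0.5 or below.
idx = int(np.argmax(s <= 0.5))
print("estimated median time-to-fill ~", round(float(t[idx]), 1), "min")
```

Conditioning this estimator on covariates (or replacing it with a parametric hazard model) is the step the paper takes to measure how execution times respond to the limit price and other variables.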
The Sources and Nature of Long-Term Dependence in the Business Cycle
with Joseph Haubrich, Federal Reserve Bank of Cleveland Economic Review 37 (2001), 15–30.
This paper examines the stochastic properties of aggregate macroeconomic time series from the standpoint of fractionally integrated models, and focuses on the persistence of economic shocks. We develop a simple macroeconomic model that exhibits long-term dependence, a consequence of aggregation in the presence of real business cycles. We derive the relation between properties of fractionally integrated macroeconomic time series and those of microeconomic data, and discuss how fiscal policy may alter their stochastic behavior. To implement these results empirically, we employ a test for fractionally integrated time series based on the Hurst-Mandelbrot rescaled range. This test is robust to short-term dependence, and is applied to quarterly and annual real GNP to determine the sources and nature of long-term dependence in the business cycle.
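The classical Hurst-Mandelbrot rescaled-range (R/S) statistic underlying the test can be sketched as follows. Note this is the unmodified version: the paper's test adjusts the denominator so that the statistic is robust to short-term dependence, a refinement not reproduced here.

```python
import numpy as np

def rescaled_range(x):
    """Classical R/S statistic: range of the cumulative demeaned sums
    of a series, divided by its sample standard deviation."""
    y = np.cumsum(x - x.mean())
    return (y.max() - y.min()) / x.std(ddof=0)

def hurst_exponent(x, ns=(32, 64, 128, 256)):
    """Estimate H from the scaling law R/S(n) ~ c * n^H by regressing
    log mean R/S over non-overlapping blocks of size n on log n.
    H near 0.5 indicates no long-term dependence; H > 0.5, persistence."""
    logs_n, logs_rs = [], []
    for n in ns:
        blocks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        rs = np.mean([rescaled_range(b) for b in blocks])
        logs_n.append(np.log(n))
        logs_rs.append(np.log(rs))
    slope, _ = np.polyfit(logs_n, logs_rs, 1)
    return slope

rng = np.random.default_rng(1)
iid = rng.standard_normal(4096)
# For white noise H is 0.5 in theory (classical R/S has a known
# upward small-sample bias, which motivates the modified statistic).
print("H for white noise ~", round(hurst_exponent(iid), 2))
```

Applying the same estimator to quarterly or annual real GNP growth, as the paper does with the modified statistic, is how the degree of long-term dependence in the business cycle is measured.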
Hedging Derivative Securities and Incomplete Markets: An Epsilon-Arbitrage Approach
with Dimitris Bertsimas and Leonid Kogan, Operations Research 49 (2001), 372–397.
Given a European derivative security with an arbitrary payoff function and a corresponding set of underlying securities on which the derivative security is based, we solve the dynamic replication problem: find a self-financing dynamic portfolio strategy—involving only the underlying securities—that most closely approximates the payoff function at maturity. By applying stochastic dynamic programming to the minimization of a mean-squared-error loss function under Markov state-dynamics, we derive recursive expressions for the optimal-replication strategy that are readily implemented in practice. The approximation error or "epsilon" of the optimal-replication strategy is also given recursively and may be used to quantify the "degree" of market incompleteness. To investigate the practical significance of these epsilon-arbitrage strategies, we consider several numerical examples including path-dependent options and options on assets with stochastic volatility and jumps.
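In a single period, the terminal step of the mean-squared-error recursion reduces to an ordinary least-squares hedge: the optimal stock holding is the regression coefficient of the payoff on the stock-price change, and the root-mean-squared residual is the "epsilon." A Monte Carlo sketch under hypothetical lognormal dynamics (one period only; the paper's multi-period recursion is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)

# One-period replication of a European call: choose stock holding theta
# and cash a to minimize E[(payoff - a - theta * dS)^2].
s0, strike, n = 100.0, 100.0, 100_000
s1 = s0 * np.exp(rng.normal(0.0, 0.2, n))      # terminal stock price
payoff = np.maximum(s1 - strike, 0.0)          # call payoff at maturity

ds = s1 - s0
theta = np.cov(ds, payoff, ddof=0)[0, 1] / ds.var(ddof=0)  # LS hedge ratio
a = payoff.mean() - theta * ds.mean()                      # cash position
epsilon = np.sqrt(np.mean((payoff - a - theta * ds) ** 2))

print("hedge ratio:", round(float(theta), 3))   # resembles a call delta
print("replication error (epsilon):", round(float(epsilon), 3))
```

Because a single stock position cannot span a lognormal terminal distribution, epsilon is strictly positive here; in the paper's dynamic setting, iterating this least-squares step backward through time shrinks the error, and its limiting value quantifies the market's incompleteness.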
Computational Challenges in Portfolio Management
with Martin Haugh, Computing in Science & Engineering 3 (2001), 54–59.
The financial industry is one of the fastest-growing areas of scientific computing. Two decades ago, terms such as financial engineering, computational finance, and financial mathematics did not exist in common usage. Today, these areas are distinct and enormously popular academic disciplines with their own journals, conferences, and professional societies. One explanation for this area's remarkable growth and the impressive array of mathematicians, computer scientists, physicists, and economists who are drawn to it is the formidable intellectual challenges intrinsic to financial markets. Many of the most basic problems in financial analysis are unsolved and surprisingly resilient to the onslaught of researchers from diverse disciplines. In this article, we hope to give a sense of these challenges by describing a relatively simple problem that all investors face when managing a portfolio of financial securities over time. Such a problem becomes more complex once real-world considerations factor into its formulation. We present the basic dynamic portfolio optimization problem and then consider three aspects of it: taxes, investor preferences, and portfolio constraints. These three issues are by no means exhaustive—they merely illustrate examples of the kinds of challenges financial engineers face today. Examples of other computational issues in portfolio optimization appear elsewhere.
Asset Allocation and Derivatives
with Martin Haugh, Quantitative Finance 1 (2001), 45–72.
The fact that derivative securities are equivalent to specific dynamic trading strategies in complete markets suggests the possibility of constructing buy-and-hold portfolios of options that mimic certain dynamic investment policies, e.g., asset-allocation rules. We explore this possibility by solving the following problem: given an optimal dynamic investment policy, find a set of options at the start of the investment horizon which will come closest to the optimal dynamic investment policy. We solve this problem for several combinations of preferences, return dynamics, and optimality criteria, and show that under certain conditions, a portfolio consisting of just a few options is an excellent substitute for considerably more complex dynamic investment policies.
Foundations of Technical Analysis: Computational Algorithms, Statistical Inference, and Empirical Implementation
with Harry Mamaysky and Jiang Wang, Journal of Finance 55 (2000), 1705–1765.
Technical analysis, also known as "charting," has been a part of financial practice for many decades, but this discipline has not received the same level of academic scrutiny and acceptance as more traditional approaches such as fundamental analysis. One of the main obstacles is the highly subjective nature of technical analysis—the presence of geometric shapes in historical price charts is often in the eyes of the beholder. In this paper, we propose a systematic and automatic approach to technical pattern recognition using nonparametric kernel regression, and apply this method to a large number of U.S. stocks from 1962 to 1996 to evaluate the effectiveness of technical analysis. By comparing the unconditional empirical distribution of daily stock returns to the conditional distribution—conditioned on specific technical indicators such as head-and-shoulders or double-bottoms—we find that over the 31-year sample period, several technical indicators do provide incremental information and may have some practical value.
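The smoothing step of the pattern-recognition approach can be sketched with a Nadaraya-Watson kernel regression: smooth the price path, then extract its local maxima and minima, whose alternating sequences feed the pattern definitions (e.g., head-and-shoulders as a max-min-max-min-max sequence with the middle maximum highest). The bandwidth and data below are hypothetical; the paper's data-driven bandwidth selection is not reproduced.

```python
import numpy as np

def kernel_smooth(prices, h):
    """Nadaraya-Watson smoother with a Gaussian kernel: each smoothed
    value is a weighted average of prices, weights exp(-(t-s)^2 / 2h^2)."""
    t = np.arange(len(prices), dtype=float)
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    return (w @ prices) / w.sum(axis=1)

def local_extrema(m):
    """Indices of interior local maxima/minima of the smoothed path,
    detected as sign changes in the first difference."""
    d = np.sign(np.diff(m))
    return np.where(d[:-1] != d[1:])[0] + 1

rng = np.random.default_rng(3)
prices = np.cumsum(rng.normal(0.0, 1.0, 250)) + 100.0  # simulated path
m = kernel_smooth(prices, h=5.0)                       # hypothetical bandwidth
ext = local_extrema(m)
print("number of smoothed extrema:", len(ext))
```

Smoothing is what makes the geometric definitions objective: the same extrema sequence, and hence the same pattern call, is produced for every analyst rather than being "in the eyes of the beholder."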
Finance: A Selective Survey
Journal of the American Statistical Association 95 (2000), 629–635.
Ever since the publication in 1565 of Girolamo Cardano's treatise on gambling, Liber de Ludo Aleae (The Book of Games of Chance), statistics and financial markets have become inextricably linked. Over the past few decades many of these links have become part of the canon of modern finance, and it is now impossible to fully appreciate the workings of financial markets without them. This selective survey covers three of the most important ideas of finance—efficient markets, the random walk hypothesis, and derivative pricing models—that illustrate the enormous research opportunities that lie at the intersection of finance and statistics.