A Survey of Systemic Risk Analytics (2012)
We provide a survey of 31 quantitative measures of systemic risk in the economics and finance literature, chosen to span key themes and issues in systemic risk measurement and management. We motivate these measures from the supervisory, research, and data perspectives in the main text, and present concise definitions of each risk measure--including required inputs, expected outputs, and data requirements--in an extensive appendix. To encourage experimentation and innovation among as broad an audience as possible, we have developed open-source Matlab code for most of the analytics surveyed, available for download above.
Commercializing Biomedical Research through Securitization Techniques (2012)
Biomedical innovation has become riskier, more expensive, and more difficult to finance with traditional sources such as private and public equity. Here we propose a financial structure in which a large number of biomedical programs at various stages of development are funded by a single entity to substantially reduce the portfolio's risk. The portfolio entity can finance its activities by issuing debt, a critical advantage because a much larger pool of capital is available for investment in debt versus equity. By employing financial engineering techniques such as securitization, it can raise even greater amounts of more-patient capital. In a simulation using historical data for new molecular entities in oncology from 1990 to 2011, we find that megafunds of $5-15 billion may yield average investment returns of 8.9-11.4% for equity holders and 5-8% for 'research-backed obligation' holders, which are lower than typical venture-capital hurdle rates but attractive to pension funds, insurance companies, and other large institutional investors. Open-source software is available for download at the link above.
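The diversification mechanism at the heart of the megafund idea can be sketched with a few lines of Monte Carlo. The sketch below is only illustrative: the success probability, program cost, payoff, debt fraction, and coupon are made-up round numbers, not the paper's calibrated oncology figures.

```python
import random

def simulate_megafund(n_programs=150, p_success=0.05, cost=200.0,
                      payoff=12_000.0, debt_frac=0.25, coupon=0.07,
                      n_trials=10_000, seed=42):
    """Monte Carlo sketch: estimate the probability that a portfolio of
    independent drug-development programs generates enough proceeds to
    repay its debt.  All figures are in $M and purely illustrative."""
    rng = random.Random(seed)
    capital = n_programs * cost          # total invested
    debt_due = capital * debt_frac * (1 + coupon)  # principal + coupon owed
    repaid = 0
    for _ in range(n_trials):
        successes = sum(rng.random() < p_success for _ in range(n_programs))
        if successes * payoff >= debt_due:
            repaid += 1
    return repaid / n_trials
```

With 150 independent programs, the chance that none succeeds is vanishingly small, so the debt is serviced with near certainty; a single stand-alone program, by contrast, repays its debt only if that one program succeeds. This is the sense in which pooling makes the portfolio debt-financeable.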
Do Labyrinthine Legal Limits on Leverage Lessen the Likelihood of Losses? An Analytical Framework (2012)
A common theme in the regulation of financial institutions and transactions is leverage constraints. Although such constraints are implemented in various ways—from minimum net capital rules to margin requirements to credit limits—the basic motivation is the same: to limit the potential losses of certain counterparties. However, the emergence of dynamic trading strategies, derivative securities, and other financial innovations poses new challenges to these constraints. We propose a simple analytical framework for specifying leverage constraints that addresses this challenge by explicitly linking the likelihood of financial loss to the behavior of the financial entity under supervision and prevailing market conditions. An immediate implication of this framework is that not all leverage is created equal, and any fixed numerical limit can lead to dramatically different loss probabilities over time and across assets and investment styles. This framework can also be used to investigate the macroprudential policy implications of microprudential regulations through the general-equilibrium impact of leverage constraints on market parameters such as volatility and tail probabilities.
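To make the "not all leverage is created equal" point concrete, here is a minimal sketch (not the paper's actual framework) that inverts a Gaussian loss-probability constraint into a volatility-dependent leverage cap: holding the target loss probability fixed implies very different numerical limits for assets of different volatility.

```python
from statistics import NormalDist

def max_leverage(sigma_daily, horizon_days=10, loss_limit=0.5, alpha=0.01):
    """Illustrative sketch: the largest leverage L such that, under
    i.i.d. Gaussian returns, the probability that the levered position
    loses more than `loss_limit` of its capital over the horizon stays
    below `alpha`.  Solves loss_limit = L * z_alpha * sigma * sqrt(T)."""
    z = NormalDist().inv_cdf(1 - alpha)              # one-sided quantile
    worst_move = z * sigma_daily * horizon_days ** 0.5
    return loss_limit / worst_move
```

Doubling the asset's volatility halves the permissible leverage under this constraint, which is precisely why any fixed numerical limit produces drifting loss probabilities as market conditions change.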
Estimating the NIH Efficient Frontier (2012)
The National Institutes of Health (NIH) is among the world’s largest investors in biomedical research, with a mandate to: “…lengthen life, and reduce the burdens of illness and disability.” Its funding decisions have been criticized as insufficiently focused on disease burden. We hypothesize that modern portfolio theory can create a closer link between basic research and outcomes, and offer insight into basic-science-related improvements in public health. We propose portfolio theory as a systematic framework for making biomedical funding allocation decisions–one that is directly tied to the risk/reward trade-off of burden-of-disease outcomes.
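As a toy version of the portfolio-theoretic framing, the textbook two-asset minimum-variance allocation can be read as splitting a research budget between two hypothetical disease areas whose "returns" are burden-of-disease reductions. The variances and covariance in the usage below are invented for illustration, not NIH estimates.

```python
def min_variance_weights(var1, var2, cov12):
    """Textbook two-asset minimum-variance portfolio: the budget split
    (w1, 1-w1) that minimizes the variance of the combined return."""
    w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12)
    return w1, 1.0 - w1

def portfolio_variance(w1, var1, var2, cov12):
    """Variance of the two-asset portfolio at weights (w1, 1-w1)."""
    w2 = 1.0 - w1
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2 * w1 * w2 * cov12
```

The point of the exercise: as long as the two areas are imperfectly correlated, the blended allocation has lower variance than funding either area alone, which is the diversification logic the paper proposes to bring to funding decisions.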
Finance is in Need of a Technological Revolution (2012)
The financial system has reached a level of complexity that only “power users” – highly trained experts with domain-specific knowledge – are able to manage. But because technological advances have come so quickly and are often adopted so broadly, there are not enough power users to go around. The interconnectedness of financial markets and institutions has created a new form of financial accident: a systemic event that extends beyond the borders of any single organisation.
Adaptive Markets and the New World Order (2012)
In the Adaptive Markets Hypothesis (AMH), intelligent but fallible investors learn from and adapt to changing economic environments. This implies that markets are not always efficient, but are usually competitive and adaptive, varying in their degree of efficiency as the environment and investor population change over time. The AMH has several implications, including the possibility of negative risk premia, alpha converging to beta, and the importance of macro factors and risk budgeting in asset-allocation policies.
An Evolutionary Model of Bounded Rationality and Intelligence (2012)
Most economic theories are based on the premise that individuals maximize their own self-interest and correctly incorporate the structure of their environment into all decisions, thanks to human intelligence. The influence of this paradigm goes far beyond academia–it underlies current macroeconomic and monetary policies, and is also an integral part of existing financial regulations. However, there is mounting empirical and experimental evidence, including the recent financial crisis, suggesting that humans do not always behave rationally, but often make seemingly random and suboptimal decisions.
Reading About the Financial Crisis: A Twenty-One-Book Review (2012)
The recent financial crisis has generated many distinct perspectives from various quarters. In this article, I review a diverse set of 21 books on the crisis, 11 written by academics, and 10 written by journalists and one former Treasury Secretary. No single narrative emerges from this broad and often contradictory collection of interpretations, but the sheer variety of conclusions is informative, and underscores the desperate need for the economics profession to establish a single set of facts from which more accurate inferences and narratives can be constructed.
Econometric Measures of Connectedness and Systemic Risk in the Finance and Insurance Sectors (2012)
A significant contributing factor to the Financial Crisis of 2007–2009 was the apparent interconnectedness among hedge funds, banks, brokers, and insurance companies, which amplified shocks into systemic events. In this paper, we propose five measures of systemic risk based on statistical relations among the market returns of these four types of financial institutions. Using correlations, cross-autocorrelations, principal components analysis, regime-switching models, and Granger causality tests, we find that all four sectors have become highly interrelated and less liquid over the past decade, increasing the level of systemic risk in the finance and insurance industries. These measures can also identify and quantify financial crisis periods. Our results suggest that while hedge funds can provide early indications of market dislocation, their contributions to systemic risk may not be as significant as those of banks, insurance companies, and brokers who take on risks more appropriate for hedge funds.
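Of the five measures, the Granger-causality one is the easiest to sketch. The bivariate lag-1 version below (a simplification, not the paper's full specification) asks whether last period's return of institution x helps predict this period's return of institution y beyond y's own history:

```python
import numpy as np

def granger_f_stat(x, y, lag=1):
    """Bivariate lag-1 Granger-causality sketch: F-statistic for adding
    x_{t-1} to an AR(1) model of y.  A large value suggests x's returns
    lead y's, one possible sign of directional connectedness."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    y_t, y_lag, x_lag = y[lag:], y[:-lag], x[:-lag]
    n = len(y_t)
    # restricted model: y_t ~ const + y_{t-1}
    Xr = np.column_stack([np.ones(n), y_lag])
    ssr_r = np.sum((y_t - Xr @ np.linalg.lstsq(Xr, y_t, rcond=None)[0]) ** 2)
    # unrestricted model: y_t ~ const + y_{t-1} + x_{t-1}
    Xu = np.column_stack([np.ones(n), y_lag, x_lag])
    ssr_u = np.sum((y_t - Xu @ np.linalg.lstsq(Xu, y_t, rcond=None)[0]) ** 2)
    # F-test for the single restriction, with n - 3 residual d.o.f.
    return (ssr_r - ssr_u) / (ssr_u / (n - 3))
```

Running such tests across all pairs of institutions and counting significant directional links is one way to turn return data into a network-style connectedness measure of the kind the paper studies.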
Privacy-Preserving Methods for Sharing Financial Risk Exposures (2012)
Unlike other industries in which intellectual property is patentable, the financial industry relies on trade secrecy to protect its business processes and methods, which can obscure critical financial risk exposures from regulators and the public. We develop methods for sharing and aggregating such risk exposures that protect the privacy of all parties involved, without the need for a trusted third party. Our approach employs secure multi-party computation techniques from cryptography, in which multiple parties are able to compute joint functions without revealing their individual inputs. In our framework, individual financial institutions evaluate a protocol on their proprietary data which cannot be inverted, leading to secure computations of real-valued statistics such as concentration indexes, pairwise correlations, and other single- and multi-point statistics. The proposed protocols are computationally tractable on realistic sample sizes. Potential financial applications include: the construction of privacy-preserving real-time indexes of bank capital and leverage ratios; the monitoring of delegated portfolio investments; financial audits; and the publication of new indexes of proprietary trading strategies.
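The core cryptographic ingredient, secure multi-party computation, is easiest to see in its simplest form: additive secret sharing, with which institutions can jointly compute an aggregate exposure (say, a sum feeding a capital or leverage index) while no party, and no aggregator, ever sees an individual input. The sketch below is a bare-bones illustration, not the paper's full protocol suite.

```python
import random

MODULUS = 2 ** 61 - 1  # a Mersenne prime; shares live in this field

def share(value, n_parties, rng, modulus=MODULUS):
    """Split an integer exposure into n random additive shares that sum
    to it mod `modulus`.  Any n-1 shares are uniformly random, so they
    reveal nothing about the input."""
    shares = [rng.randrange(modulus) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % modulus)
    return shares

def aggregate(all_shares, modulus=MODULUS):
    """Each party sums the one share it received from every institution;
    combining the per-party sums recovers the total exposure without any
    party learning another's raw input."""
    per_party = [sum(col) % modulus for col in zip(*all_shares)]
    return sum(per_party) % modulus
```

This only computes sums; the correlations and concentration indexes in the paper require richer protocols, but the trust model, where privacy follows from each party holding only a random-looking share, is the same.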
The FTSE StableRisk Indices (2011)
Implicit in most asset-allocation policies is the statistical assumption of “stationarity,” which means that the means, variances, and covariances of asset returns are assumed to be constant over time. This assumption is a reasonable approximation during normal market conditions but fails dramatically during periods of market turmoil and dislocation. In such periods, market volatility is highly dynamic, correlations can jump to 100% in a matter of days, and risk premia can become negative for months at a time. FTSE and AlphaSimplex Group have developed a family of rule-driven (passive), transparent, and high-capacity indices whose volatilities are rescaled as often as daily with the goal of maintaining more stable risk levels. By stabilizing the risk of each asset class over time, the FTSE StableRisk Indices have the potential to capture the long-term risk premia of asset classes and simple strategies with less severe maximum drawdowns than those of traditional indices, which have no risk controls.
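The rescaling idea can be sketched in a few lines: estimate volatility from a trailing window and scale exposure so the position's annualized volatility matches a target. The target level, leverage cap, and window below are illustrative placeholders, not FTSE's actual methodology.

```python
def stable_risk_weight(recent_returns, target_vol=0.10, max_leverage=3.0,
                       periods_per_year=252):
    """Volatility-targeting sketch: return the exposure weight that
    scales a position's trailing annualized volatility to `target_vol`,
    capped at `max_leverage`."""
    n = len(recent_returns)
    mean = sum(recent_returns) / n
    var = sum((r - mean) ** 2 for r in recent_returns) / (n - 1)
    realized_vol = (var * periods_per_year) ** 0.5   # annualized
    if realized_vol == 0.0:
        return max_leverage
    return min(target_vol / realized_vol, max_leverage)
```

When trailing volatility spikes, the weight falls and exposure is cut before drawdowns compound; in calm markets the weight rises toward the cap. That is the mechanism behind the more stable risk levels described above.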
Managing Real-Time Risks and Returns: The Thomson Reuters NewsScope Event Indices (2011)
As financial markets grow in size and complexity, risk management protocols must also evolve to address more challenging demands. One of the most difficult of these challenges is managing event risk, the risk posed by unanticipated news that causes major market moves over short time intervals. Often cited but rarely managed, event risk has been relegated to the domain of qualitative judgment and discretion because of its heterogeneity and velocity. In this chapter, we describe one initiative aimed at solving this problem. The Thomson Reuters NewsScope Event Indices Project is an integrated framework for incorporating real-time news from the Thomson Reuters NewsScope subscription service into systematic investment and risk management protocols. The framework consists of a set of real-time event indices—each one taking on numerical values between 0 and 100—designed to capture the occurrence of unusual events of a particular kind. Each index is constructed by applying disciplined pattern recognition algorithms to real-time news feeds, and validated using econometric methods applied to historical data.
A Computational View of Market Efficiency (2011)
We propose to study market efficiency from a computational viewpoint. Borrowing from theoretical computer science, we define a market to be efficient with respect to resources S (e.g., time, memory) if no strategy using resources S can make a profit. As a first step, we consider memory-m strategies whose action at time t depends only on the m previous observations at times t - m,...,t - 1. We introduce and study a simple model of market evolution, where strategies impact the market by their decision to buy or sell. We show that the effect of optimal strategies using memory m can lead to "market conditions" that were not present initially, such as (1) market bubbles and (2) the possibility for a strategy using memory m' > m to make a bigger profit than was initially possible. We suggest ours as a framework to rationalize the technological arms race of quantitative trading firms.
What Happened To The Quants In August 2007?: Evidence from Factors and Transactions Data (2011)
During the week of August 6, 2007, a number of quantitative long/short equity hedge funds experienced unprecedented losses. It has been hypothesized that a coordinated deleveraging of similarly constructed portfolios caused this temporary dislocation in the market. Using the simulated returns of long/short equity portfolios based on five specific valuation factors, we find evidence that the unwinding of these portfolios began in July 2007 and continued until the end of 2007. Using transactions data, we find that the simulated returns of a simple market-making strategy were significantly negative during the week of August 6, 2007, but positive before and after, suggesting that the Quant Meltdown of August 2007 was the combined effects of portfolio deleveraging throughout July and the first week of August, and a temporary withdrawal of market-making risk capital starting August 8th. Our simulations point to two unwinds—a mini-unwind on August 1st starting at 10:45am and ending at 11:30am, and a more sustained unwind starting at the open on August 6th and ending at 1:00pm—that began with stocks in the financial sector and long Book-to-Market and short Earnings Momentum. These conjectures have significant implications for the systemic risks posed by the hedge-fund industry.
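The "simple market-making strategy" referred to above is in the spirit of a classic contrarian rule; one common textbook form (used here purely for illustration, and not necessarily the paper's exact weights) buys yesterday's losers and sells yesterday's winners in proportion to their distance from the cross-sectional mean return:

```python
def contrarian_weights(returns):
    """Contrarian (liquidity-providing) portfolio sketch: weight each
    stock by the negative of its return's deviation from the
    cross-sectional mean, scaled by the number of stocks.  Weights sum
    to zero, so the portfolio is dollar-neutral."""
    n = len(returns)
    mean = sum(returns) / n
    return [-(r - mean) / n for r in returns]
```

A strategy of this form profits when prices mean-revert, i.e. when it is effectively compensated for providing liquidity; sharply negative simulated returns on such a rule are therefore read in the paper as evidence of a temporary withdrawal of market-making capital.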
The National Transportation Safety Board: A Model for Systemic Risk Management (2011)
We propose the National Transportation Safety Board (NTSB) as a model organization for addressing systemic risk in industries and contexts other than transportation. When adopted by regulatory agencies and the transportation industry, the safety recommendations of the NTSB have been remarkably effective in reducing the number of fatalities in various modes of transportation since the NTSB’s inception in 1967 as an independent agency. The NTSB has no regulatory authority and is solely focused on conducting forensic investigations of transportation accidents and proposing safety recommendations. With only 400 full-time employees, the NTSB has a much larger network of experts drawn from other government agencies and the private sector who are on call to assist in accident investigations on an as-needed basis. By allowing the participation in its investigations of all interested parties who can provide technical assistance to the investigations, the NTSB produces definitive analyses of even the most complex accidents and provides actionable measures for reducing the chances of future accidents. It is possible to create more efficient and effective systemic-risk management processes in many other industries, including financial services, by studying the organizational structure and functions of the NTSB.