Spectral Analysis of Stock-Return Volatility, Correlation, and Beta
with Shomesh Chaudhuri, Signal Processing and Signal Processing Education Workshop (SP/SPE), 2015 IEEE, 232–236.
We apply spectral techniques to analyze the volatility and correlation of U.S. common-stock returns across multiple time horizons at the aggregate-market and individual-firm levels. Using the cross-periodogram to construct frequency-bandlimited measures of variance, correlation, and beta, we find that volatilities and correlations change not only in magnitude over time, but also in frequency. Factors that may be responsible for these trends are proposed and their implications for portfolio construction are explored.
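The core construction can be illustrated with a minimal sketch: by Parseval's relation, the sample variance of a demeaned return series equals the sum of its periodogram ordinates, so summing over a frequency band yields the band-limited variance. The function below is a simplified illustration of this idea, not the paper's estimator (which extends it to correlation and beta via the cross-periodogram); the function name and scaling conventions are my own.

```python
import numpy as np

def bandlimited_variance(returns, f_lo, f_hi):
    """Variance of a return series attributable to cycles with frequency
    in [f_lo, f_hi), in cycles per period, via the periodogram.
    Summing over the full band [0, 0.5] recovers the sample variance
    (Parseval's relation)."""
    x = np.asarray(returns, dtype=float)
    x = x - x.mean()                      # demean so the DC term is zero
    n = len(x)
    X = np.fft.fft(x)
    pgram = np.abs(X) ** 2 / n**2         # ordinates sum to the sample variance
    freqs = np.abs(np.fft.fftfreq(n))     # fold negative frequencies onto positive
    band = (freqs >= f_lo) & (freqs < f_hi)
    return pgram[band].sum()
```

Splitting the spectrum into disjoint bands (e.g., business-cycle versus high-frequency) then decomposes total variance additively, which is what allows volatility to be tracked by horizon as well as by magnitude.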
Health, Wealth, and the 21st Century Cures Act
with Tomas Philipson and Andrew von Eschenbach, JAMA Oncology 2(2016), 17–18.
Americans are increasingly apprehensive about our future, so it is inspiring when Congress produces legislation intended to both enhance our health and expand our economy. The 21st Century Cures Act, recently passed by the House with an impressive bipartisan majority vote of 344 to 77, intends to accelerate the many-step process of drug discovery and development, from basic scientific research to clinical development to delivery, distribution, and ongoing monitoring. Among other things, the legislation boosts National Institutes of Health funding, dramatically speeds up the US Food and Drug Administration (FDA) approval process, and aims to make use of new information technology to better monitor the performance of medical products after they reach the market. This landmark bill now awaits a comparable piece of legislation being developed by the Senate Health, Education, Labor, and Pensions Committee. Together, they will transform the biomedical ecosystem and provide the foundation for the next several decades of innovative life-saving and health-enhancing solutions for our nation and the world.
TRC Networks and Systemic Risk
with Roger Stein, Journal of Alternative Investments 18(2016), 52–67.
The authors introduce a new approach to identifying and monitoring systemic risk that combines network analysis and tail risk contribution (TRC). Network analysis provides great flexibility in representing and exploring linkages between institutions, but it can be overly general in describing the risk exposures of one entity to another. TRC provides a more focused view of key systemic risks and richer financial intuition, but it may miss important linkages between financial institutions. Integrating these two methods can provide information on key relationships between institutions that may become relevant during periods of systemic stress. The authors demonstrate this approach using the exposures of money market funds to major financial institutions during July 2011. The results for their example suggest that TRC networks can highlight both institutions and funds that may become distressed during a financial crisis.
The Wisdom of Twitter Crowds: Predicting Stock Market Reactions to FOMC Meetings via Twitter Feeds
with Pablo D. Azar, Journal of Portfolio Management 42(2016), 123–134.
With the rise of social media, investors have a new tool for measuring sentiment in real time. However, the nature of these data sources raises serious questions about their quality. Because anyone on social media can participate in a conversation about markets—whether the individual is informed or not—these data may have very little information about future asset prices. In this article, the authors show that this is not the case. They analyze a recurring event that has a high impact on asset prices—Federal Open Market Committee (FOMC) meetings—and exploit a new dataset of tweets referencing the Federal Reserve. The authors show that the content of tweets can be used to predict future returns, even after controlling for common asset pricing factors. To gauge the economic magnitude of these predictions, the authors construct a simple hypothetical trading strategy based on these data. They find that a tweet-based asset allocation strategy outperforms several benchmarks—including a strategy that buys and holds a market index, as well as a comparable dynamic asset allocation strategy that does not use Twitter information.
Q Group Panel Discussion: Looking to the Future
with Martin Leibowitz, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel, Financial Analysts Journal
Moderator Martin Leibowitz asked a panel of industry experts—Andrew W. Lo, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel—what they saw as the most important issues in finance, especially as those issues relate to practitioners. Drawing on their vast knowledge, these panelists addressed topics such as regulation, technology, and financing society’s challenges; opacity and trust; the social value of finance; and future expected returns.
Buying Cures Versus Renting Health: Financing Health Care with Consumer Loans
with Vahid Montazerhodjat and David M. Weinstock, Science Translational Medicine 8(2016), 327ps6.
A crisis is building over the prices of new transformative therapies for cancer, hepatitis C virus infection, and rare diseases. The clinical imperative is to offer these therapies as broadly and rapidly as possible. We propose a practical way to increase drug affordability through health care loans (HCLs)—the equivalent of mortgages for large health care expenses. HCLs allow patients in both multipayer and single-payer markets to access a broader set of therapeutics, including expensive short-duration treatments that are curative. HCLs also link payment to clinical benefit and should help lower per-patient cost while incentivizing the development of transformative therapies rather than those that offer small incremental advances. Moreover, we propose the use of securitization—a well-known financial engineering method—to finance a large diversified pool of HCLs through both debt and equity. Numerical simulations suggest that securitization is viable for a wide range of economic environments and cost parameters, allowing a much broader patient population to access transformative therapies while also aligning the interests of patients, payers, and the pharmaceutical industry.
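The diversification logic behind securitizing a loan pool can be conveyed with a toy Monte Carlo, which is not the paper's simulation model: under the strong simplifying assumptions of independent defaults and zero recovery, the probability that pool losses breach a senior tranche's attachment point falls sharply as the pool grows. All parameter values and the function name below are illustrative.

```python
import numpy as np

def senior_impairment_prob(n_loans, p_default, attachment=0.15,
                           n_sims=20_000, seed=0):
    """Monte Carlo probability that the loss fraction of an equal-sized
    loan pool exceeds the senior tranche's attachment point, assuming
    independent defaults and zero recovery (a deliberately crude model)."""
    rng = np.random.default_rng(seed)
    defaults = rng.random((n_sims, n_loans)) < p_default   # default indicators
    loss_frac = defaults.mean(axis=1)                      # per-scenario pool loss
    return (loss_frac > attachment).mean()
```

With a 10% default rate and a 15% attachment point, a 10-loan pool leaves the senior tranche impaired in roughly a quarter of scenarios, while a 1,000-loan pool makes impairment rare — the basic reason a large diversified pool of HCLs can support investment-grade debt alongside equity.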
Financing Drug Discovery via Dynamic Leverage
with Vahid Montazerhodjat and John J. Frishkopf, Drug Discovery Today 21(2016), 410–414.
We extend the megafund concept for funding drug discovery to enable dynamic leverage in which the portfolio of candidate therapeutic assets is predominantly financed initially by equity, and debt is introduced gradually as assets mature and begin generating cash flows. Leverage is adjusted so as to maintain an approximately constant level of default risk throughout the life of the fund. Numerical simulations show that applying dynamic leverage to a small portfolio of orphan drug candidates can boost the return on equity almost twofold compared with securitization with a static capital structure. Dynamic leverage can also add significant value to comparable all-equity-financed portfolios, enhancing the return on equity without jeopardizing debt performance or increasing risk to equity investors.
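The mechanism can be sketched in a few lines under a stylized model that is mine, not the paper's: if fund assets are lognormal over the horizon, the face value of debt consistent with a fixed default probability can be solved in closed form, and it rises as asset risk falls. As candidate therapies mature and volatility declines, debt capacity at constant default risk grows, which is the sense in which leverage can be increased "dynamically." The function name, the zero-drift assumption, and all parameters are illustrative.

```python
import math
from statistics import NormalDist

def debt_capacity(asset_value, sigma, p_default, horizon=1.0):
    """Face value D of zero-coupon debt such that P(V_T < D) = p_default,
    where V_T is lognormal with zero drift:
    ln V_T ~ N(ln V_0 - 0.5*sigma^2*T, sigma^2*T)."""
    z = NormalDist().inv_cdf(p_default)   # negative for p_default < 0.5
    return asset_value * math.exp(-0.5 * sigma**2 * horizon
                                  + sigma * math.sqrt(horizon) * z)
```

For example, at a 1% default probability, a fund with asset volatility 80% (early-stage candidates) supports far less debt than the same fund at volatility 30% (assets near approval), so releveraging as the portfolio matures keeps default risk roughly constant while boosting the return on equity.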
What Is An Index?
Journal of Portfolio Management 42(2016), 21–36.
Technological advances in telecommunications, securities exchanges, and algorithmic trading have facilitated a host of new investment products that resemble theme-based passive indexes yet depart from traditional market-cap-weighted portfolios. I propose broadening the definition of an index using a functional perspective—any portfolio strategy that satisfies three properties should be considered an index: (1) it is completely transparent; (2) it is investable; and (3) it is systematic, i.e., it is entirely rules-based and contains no judgment or unique investment skill. Portfolios satisfying these properties that are not market-cap-weighted are given a new name: “dynamic indexes.” This functional definition widens the universe of possibilities and, most importantly, decouples risk management from alpha generation. Passive strategies can and should be actively risk managed, and I provide a simple example of how this can be achieved. Dynamic indexes also create new challenges, the most significant of which is backtest bias, and I conclude with a proposal for managing this risk.
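A hypothetical strategy in the spirit of a "dynamic index" — transparent, investable, and entirely rules-based, hence systematic — is a volatility-managed allocation to a market portfolio. This is my own illustrative example, not the one given in the article; the function name, window, target, and cap are all assumptions.

```python
import numpy as np

def vol_managed_weights(returns, target_vol=0.10, window=21, cap=1.5):
    """Rules-based exposure to a market portfolio: scale the weight so
    trailing annualized volatility matches a target, capped at `cap`.
    Exposure defaults to 1.0 until a full trailing window is available."""
    r = np.asarray(returns, dtype=float)
    w = np.ones_like(r)
    for t in range(window, len(r)):
        realized = r[t - window:t].std(ddof=1) * np.sqrt(252)  # annualized
        w[t] = min(cap, target_vol / max(realized, 1e-8))
    return w
```

Every weight is computable from public data by a fixed rule, so the strategy meets the three index properties while still actively managing risk: when trailing volatility doubles, exposure is halved (until the cap binds).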
Law Is Code: A Software Engineering Approach to Analyzing the United States Code
with William Li, Pablo Azar, David Larochelle, Phil Hill, Journal of Business & Technology Law 10(2015), 297–374.
The agglomeration of rules and regulations over time has produced a body of legal code that no single individual can fully comprehend. This complexity produces inefficiencies, makes the processes of understanding and changing the law difficult, and frustrates the fundamental principle that the law should provide fair notice to the governed. In this Article, we take a quantitative, unbiased, and software-engineering approach to analyze the evolution of the United States Code from 1926 to today. Software engineers frequently face the challenge of understanding and managing large, structured collections of instructions, directives, and conditional statements, and we adapt and apply their techniques to the U.S. Code over time. Our work produces insights into the structure of the U.S. Code as a whole, its strengths and vulnerabilities, and new ways of thinking about individual laws. For example, we identify the first appearance and spread of important terms in the U.S. Code like “whistleblower” and “privacy.” We also analyze and visualize the network structure of certain substantial reforms, including the Patient Protection and Affordable Care Act and the Dodd-Frank Wall Street Reform and Consumer Protection Act, and show how the interconnections of references can increase complexity and create the potential for unintended consequences. Our work is a timely illustration of computational approaches to law as the legal profession embraces technology for scholarship in order to increase efficiency and to improve access to justice.
Reply to “(Im)Possible Frontiers: A Comment”
with Thomas J. Brennan, Critical Finance Review 4(2015), 157–171.
In Brennan and Lo (2010), a mean-variance efficient frontier is defined as “impossible” if every portfolio on that frontier has negative weights, which is incompatible with the Capital Asset Pricing Model (CAPM) requirement that the market portfolio is mean-variance efficient. We prove that as the number of assets n grows, the probability that a randomly chosen frontier is impossible tends to one at a geometric rate, implying that the set of parameters for which the CAPM holds is extremely rare. Levy and Roll (2014) argue that while such “possible” frontiers are rare, they are ubiquitous. In this reply, we show that this is not the case; possible frontiers are not only rare, but they occupy an isolated region of mean-variance parameter space that becomes increasingly remote as n increases. Ingersoll (2014) observes that parameter restrictions can rule out impossible frontiers, but in many cases these restrictions contradict empirical fact and must be questioned rather than blindly imposed.
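The "possible versus impossible" distinction is easy to check computationally. By two-fund separation, frontier weights are affine in the target mean m, namely w(m) = g + h·m for vectors g and h built from Σ⁻¹1 and Σ⁻¹μ, so a frontier is possible exactly when the n one-dimensional constraints gᵢ + hᵢ·m ≥ 0 have a common solution. The sketch below implements that feasibility check under standard assumptions (invertible Σ, μ not proportional to 1); it is an illustration of the definition, not the paper's proof machinery, and the function name is my own.

```python
import numpy as np

def frontier_is_possible(mu, Sigma):
    """True iff some portfolio on the mean-variance frontier has all
    nonnegative weights. By two-fund separation, frontier weights are
    affine in the target mean m: w(m) = g + h*m."""
    ones = np.ones(len(mu))
    Si = np.linalg.inv(Sigma)
    A = ones @ Si @ mu
    B = mu @ Si @ mu
    C = ones @ Si @ ones
    D = B * C - A * A                     # > 0 when mu is not proportional to 1
    g = (B * (Si @ ones) - A * (Si @ mu)) / D
    h = (C * (Si @ mu) - A * (Si @ ones)) / D
    # each constraint g_i + h_i*m >= 0 restricts m to an interval or ray;
    # the frontier is possible iff all n of them intersect
    lo, hi = -np.inf, np.inf
    for g_i, h_i in zip(g, h):
        if h_i > 0:
            lo = max(lo, -g_i / h_i)
        elif h_i < 0:
            hi = min(hi, -g_i / h_i)
        elif g_i < 0:
            return False
    return lo <= hi
```

Running this check over randomly drawn (μ, Σ) pairs for increasing n is one way to see the geometric decay of the possible-frontier probability that the paper establishes analytically.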