Research
Kim, Esther S., and Andrew W. Lo (2016), Business Models to Cure Rare Disease: A Case Study of Solid Biosciences, Journal of Investment Management 14 (4), 87–101.
Duchenne muscular dystrophy (DMD) is a rare genetic disorder affecting thousands of individuals, mainly young males, worldwide. Currently, the disease has no cure, and is fatal in all cases. Advances in our understanding of the disease and innovations in basic science have recently allowed biotechnology companies to pursue promising treatment candidates for the disease, but so far, only one drug with limited application has achieved FDA approval. In this case study, we profile the work of an early-stage life sciences company, Solid Biosciences, founded by a father of a young boy with DMD. In particular, we discuss Solid’s one-disease focus and its strategy to treat the disease with a diversified portfolio of approaches. The company is currently building a product pipeline consisting of genetic interventions, small molecules and biologics, and assistive devices, each aimed at addressing a different aspect of DMD. We highlight the potential for Solid’s business model and portfolio to achieve breakthrough treatments for the DMD patient community.
Cao, Charles, Bing Liang, Andrew W. Lo, and Lubomir Petrasek (2018), Hedge Fund Holdings and Stock Market Efficiency, Review of Asset Pricing Studies 8 (1), 77–116.
We examine the relation between changes in hedge fund equity holdings and measures of informational efficiency of stock prices derived from intraday transactions as well as daily data. On average, hedge fund ownership of stocks leads to greater improvements in price efficiency than mutual fund or bank ownership, especially for stocks held by hedge funds with high portfolio turnover and superior security selection skills. However, stocks held by hedge funds experienced large declines in price efficiency in the last quarter of 2008, particularly if the funds were connected to Lehman Brothers as a prime broker and used leverage in combination with lenient redemption terms.
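For readers who want a concrete sense of what a price-efficiency measure can look like, the sketch below computes two standard random-walk proxies (the absolute first-order return autocorrelation and a variance-ratio deviation) from a simulated daily price series. These are illustrative stand-ins, not the intraday measures used in the paper.

```python
# A minimal sketch of two common price-efficiency proxies; the price series
# is simulated purely for illustration and the paper's intraday measures
# are not reproduced here.
import numpy as np

def efficiency_proxies(prices, q=5):
    """Return |AR(1)| of returns and |1 - VR(q)|; smaller values indicate
    prices closer to a random walk (i.e., more informationally efficient)."""
    r = np.diff(np.log(prices))
    ar1 = np.corrcoef(r[:-1], r[1:])[0, 1]
    # Variance ratio: variance of q-period returns over q times the
    # variance of 1-period returns (overlapping q-period sums).
    rq = np.convolve(r, np.ones(q), mode="valid")
    vr = rq.var(ddof=1) / (q * r.var(ddof=1))
    return abs(ar1), abs(1.0 - vr)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(1000)))
    print(efficiency_proxies(prices))
```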
Chaudhuri, Shomesh E., and Andrew W. Lo (2015), Spectral Analysis of Stock-Return Volatility, Correlation, and Beta, 2015 IEEE Signal Processing and Signal Processing Education Workshop (SP/SPE), 232–236.
We apply spectral techniques to analyze the volatility and correlation of U.S. common-stock returns across multiple time horizons at the aggregate-market and individual-firm level. Using the cross-periodogram to construct frequency bandlimited measures of variance, correlation and beta, we find that volatilities and correlations change not only in magnitude over time, but also in frequency. Factors that may be responsible for these trends are proposed and their implications for portfolio construction are explored.
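The core construction is easy to illustrate: take the discrete Fourier transforms of two return series, form the periodogram and cross-periodogram, and integrate them over a chosen frequency band to obtain band-limited variance, correlation, and beta. The sketch below does this for simulated returns; the band edges and scaling conventions are illustrative assumptions rather than the paper's exact estimator.

```python
# A minimal sketch of frequency-band-limited variance, correlation, and beta
# via the (cross-)periodogram; the simulated returns and band choice are
# illustrative assumptions.
import numpy as np

def band_limited_moments(x, y, f_lo, f_hi, fs=1.0):
    """Integrate the periodogram and cross-periodogram of demeaned series
    x and y over [f_lo, f_hi) to obtain band-limited variance (up to a
    one-sided-spectrum scale factor), correlation, and beta; the scale
    factor cancels in the correlation and beta ratios."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    X, Y = np.fft.rfft(x), np.fft.rfft(y)
    band = (freqs >= f_lo) & (freqs < f_hi)
    sxx = np.sum(np.abs(X[band]) ** 2) / n**2
    syy = np.sum(np.abs(Y[band]) ** 2) / n**2
    sxy = np.sum(np.real(X[band] * np.conj(Y[band]))) / n**2
    corr = sxy / np.sqrt(sxx * syy)
    beta = sxy / syy            # beta of x on y within this frequency band
    return sxx, corr, beta

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    mkt = rng.standard_normal(1024)                 # simulated market returns
    stock = 0.8 * mkt + rng.standard_normal(1024)   # simulated stock returns
    # low-frequency band: cycles longer than ten periods
    print(band_limited_moments(stock, mkt, f_lo=0.0, f_hi=0.1))
```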
Lo, Andrew W., and Roger M. Stein (2016), TRC Networks and Systemic Risk, Journal of Alternative Investments 18 (4), 52–67.
The authors introduce a new approach to identifying and monitoring systemic risk that combines network analysis and tail risk contribution (TRC). Network analysis provides great flexibility in representing and exploring linkages between institutions, but it can be overly general in describing the risk exposures of one entity to another. TRC provides a more focused view of key systemic risks and richer financial intuition, but it may miss important linkages between financial institutions. Integrating these two methods can provide information on key relationships between institutions that may become relevant during periods of systemic stress. The authors demonstrate this approach using the exposures of money market funds to major financial institutions during July 2011. The results for their example suggest that TRC networks can highlight both institutions and funds that may become distressed during a financial crisis.
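As a rough illustration of how the two ingredients combine, the sketch below encodes a toy bipartite network of fund exposures to institutions and computes one standard notion of tail risk contribution: each institution's expected loss contribution conditional on the fund's total loss exceeding its 95% VaR. The exposure matrix and loss model are invented for illustration and are not the paper's data or exact methodology.

```python
# A minimal sketch of tail risk contribution (TRC) on a small bipartite
# fund-to-institution exposure network; all numbers below are invented
# for illustration.
import numpy as np

rng = np.random.default_rng(2)

funds = ["FundA", "FundB"]
banks = ["Bank1", "Bank2", "Bank3"]
# exposures[i, j] = exposure of fund i to institution j (the network edges)
exposures = np.array([[3.0, 1.0, 0.0],
                      [0.5, 2.0, 2.5]])

# Simulate correlated fractional losses for the institutions.
n_sims = 100_000
common = rng.standard_normal(n_sims)[:, None]
idio = rng.standard_normal((n_sims, len(banks)))
bank_losses = np.clip(0.02 + 0.05 * (0.6 * common + 0.8 * idio), 0, None)

for i, fund in enumerate(funds):
    contrib = bank_losses * exposures[i]   # loss contribution per institution
    total = contrib.sum(axis=1)            # fund's total loss in each scenario
    var95 = np.quantile(total, 0.95)
    tail = total >= var95
    # TRC: expected contribution of each institution, given the fund is in its tail
    trc = contrib[tail].mean(axis=0)
    print(fund, dict(zip(banks, trc.round(4))))
```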
Azar, Pablo D., and Andrew W. Lo (2016), The Wisdom of Twitter Crowds: Predicting Stock Market Reactions to FOMC Meetings via Twitter Feeds, Journal of Portfolio Management 42 (5), 123–134.
With the rise of social media, investors have a new tool for measuring sentiment in real time. However, the nature of these data sources raises serious questions about their quality. Because anyone on social media can participate in a conversation about markets—whether the individual is informed or not—these data may have very little information about future asset prices. In this article, the authors show that this is not the case. They analyze a recurring event that has a high impact on asset prices—Federal Open Market Committee (FOMC) meetings—and exploit a new dataset of tweets referencing the Federal Reserve. The authors show that the content of tweets can be used to predict future returns, even after controlling for common asset pricing factors. To gauge the economic magnitude of these predictions, the authors construct a simple hypothetical trading strategy based on these data. They find that a tweet-based asset allocation strategy outperforms several benchmarks—including a strategy that buys and holds a market index, as well as a comparable dynamic asset allocation strategy that does not use Twitter information.
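As a purely illustrative example of how tweet content can be mapped to an allocation rule, the toy sketch below scores tweets against a tiny hand-made word list and scales equity exposure accordingly. The word lists, example tweets, and mapping are hypothetical and bear no relation to the paper's dataset or statistical model.

```python
# A toy sketch of a sentiment-scaled allocation ahead of an FOMC meeting;
# the word lists, tweets, and weighting rule are invented for illustration.
hawkish = {"hike", "tighten", "inflation", "raise"}
dovish = {"cut", "ease", "stimulus", "dovish"}

def tweet_score(text):
    """+1 for each dovish keyword, -1 for each hawkish keyword."""
    words = set(text.lower().split())
    return len(words & dovish) - len(words & hawkish)

def allocation(tweets):
    """Map aggregate tweet sentiment to an equity weight in [0, 1]."""
    s = sum(tweet_score(t) for t in tweets)
    return max(0.0, min(1.0, 0.5 + 0.1 * s))

tweets = ["Fed likely to cut rates, markets expect stimulus",
          "Inflation data may force the Fed to hike"]
print(f"equity weight ahead of FOMC: {allocation(tweets):.2f}")
```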
Leibowitz, Martin, Andrew W. Lo, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel (2016), Q Group Panel Discussion: Looking to the Future, Financial Analysts Journal 72 (4), 17–25.
Moderator Martin Leibowitz asked a panel of industry experts—Andrew W. Lo, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel—what they saw as the most important issues in finance, especially as those issues relate to practitioners. Drawing on their vast knowledge, these panelists addressed topics such as regulation, technology, and financing society’s challenges; opacity and trust; the social value of finance; and future expected returns.
Montazerhodjat, Vahid, David M. Weinstock, and Andrew W. Lo (2016), Buying Cures versus Renting Health: Financing Health Care with Consumer Loans, Science Translational Medicine 8 (327), 327ps6.
A crisis is building over the prices of new transformative therapies for cancer, hepatitis C virus infection, and rare diseases. The clinical imperative is to offer these therapies as broadly and rapidly as possible. We propose a practical way to increase drug affordability through health care loans (HCLs)—the equivalent of mortgages for large health care expenses. HCLs allow patients in both multipayer and single-payer markets to access a broader set of therapeutics, including expensive short-duration treatments that are curative. HCLs also link payment to clinical benefit and should help lower per-patient cost while incentivizing the development of transformative therapies rather than those that offer small incremental advances. Moreover, we propose the use of securitization—a well-known financial engineering method—to finance a large diversified pool of HCLs through both debt and equity. Numerical simulations suggest that securitization is viable for a wide range of economic environments and cost parameters, allowing a much broader patient population to access transformative therapies while also aligning the interests of patients, payers, and the pharmaceutical industry.
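The mechanics of an HCL follow the familiar fixed-payment amortization of a mortgage, as the sketch below shows for a hypothetical high-cost curative therapy; the price, interest rate, and term are illustrative numbers, and the paper's securitization simulations are not reproduced here.

```python
# A minimal sketch of the mortgage-style amortization behind a health care
# loan (HCL); the principal, rate, and term below are hypothetical.
def hcl_payment(principal, annual_rate, years):
    """Standard fixed-payment annuity formula, paid monthly."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# e.g., an $84,000 curative therapy financed over 9 years at 9% APR
pmt = hcl_payment(84_000, 0.09, 9)
print(f"monthly payment: ${pmt:,.2f}")
```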
Montazerhodjat, Vahid, John J. Frishkopf, and Andrew W. Lo (2016), Financing Drug Discovery via Dynamic Leverage, Drug Discovery Today 21 (3), 410–414.
We extend the megafund concept for funding drug discovery to enable dynamic leverage in which the portfolio of candidate therapeutic assets is predominantly financed initially by equity, and debt is introduced gradually as assets mature and begin generating cash flows. Leverage is adjusted so as to maintain an approximately constant level of default risk throughout the life of the fund. Numerical simulations show that applying dynamic leverage to a small portfolio of orphan drug candidates can boost the return on equity almost twofold compared with securitization with a static capital structure. Dynamic leverage can also add significant value to comparable all-equity-financed portfolios, enhancing the return on equity without jeopardizing debt performance or increasing risk to equity investors.
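A stylized way to see the idea: recompute the debt capacity each period so that the one-period default probability stays at a fixed target, letting the capacity rise as asset risk falls with maturity. The sketch below does this under an assumed lognormal asset model with illustrative parameters; it is not the paper's simulation.

```python
# A minimal sketch of "dynamic leverage": debt capacity is reset each period
# to hold the one-period default probability at a target level, under an
# assumed lognormal asset model; all parameters are illustrative.
from math import exp
from statistics import NormalDist

def debt_capacity(asset_value, mu, sigma, p_default):
    """Largest debt face value F with P(asset_value * exp(r) < F) <= p_default,
    where r ~ Normal(mu, sigma) is next period's log return."""
    z = NormalDist().inv_cdf(p_default)
    return asset_value * exp(mu + z * sigma)

value, mu, sigma = 100.0, 0.08, 0.30
for year in range(1, 6):
    # as assets mature and generate cash flows, assume volatility declines
    sigma_t = sigma * (0.85 ** (year - 1))
    print(year, round(debt_capacity(value, mu, sigma_t, 0.01), 1))
    value *= exp(mu)
```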
Lo, Andrew W. (2016), What Is an Index?, Journal of Portfolio Management 42 (2), 21–36.
Technological advances in telecommunications, securities exchanges, and algorithmic trading have facilitated a host of new investment products that resemble theme-based passive indexes but which depart from traditional market-cap-weighted portfolios. I propose broadening the definition of an index using a functional perspective—any portfolio strategy that satisfies three properties should be considered an index: (1) it is completely transparent; (2) it is investable; and (3) it is systematic, i.e., it is entirely rules-based and contains no judgment or unique investment skill. Portfolios satisfying these properties that are not market-cap-weighted are given a new name: “dynamic indexes.” This functional definition widens the universe of possibilities and, most importantly, decouples risk management from alpha generation. Passive strategies can and should be actively risk managed, and I provide a simple example of how this can be achieved. Dynamic indexes also create new challenges of which the most significant is backtest bias, and I conclude with a proposal for managing this risk.
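One simple rules-based strategy in this spirit is a volatility-target overlay on a passive market portfolio: it is transparent, investable, and entirely systematic, yet its exposure is not fixed by market-cap weights. The sketch below is an illustrative construction with assumed parameters and simulated returns, not necessarily the example given in the article.

```python
# A minimal sketch of a rules-based "dynamic index": a volatility-target
# overlay on a passive market portfolio; target, window, and returns are
# illustrative assumptions.
import numpy as np

def vol_target_weights(returns, target_vol=0.10, window=21, cap=1.5):
    """Each day, scale market exposure by target_vol / trailing realized vol
    (capped); the rule uses only past data and no discretionary judgment."""
    ann = np.sqrt(252)
    weights = np.ones_like(returns)
    for t in range(window, len(returns)):
        realized = returns[t - window:t].std(ddof=1) * ann
        weights[t] = min(cap, target_vol / max(realized, 1e-8))
    return weights

rng = np.random.default_rng(3)
r = 0.0003 + 0.01 * rng.standard_normal(1000)   # simulated daily index returns
w = vol_target_weights(r)
strategy_returns = w * r                        # exposure set from prior data only
print(round(strategy_returns.std(ddof=1) * np.sqrt(252), 3))
```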
Li, William, Pablo D. Azar, David Larochelle, Phil Hill, and Andrew W. Lo (2015), Law Is Code: A Software Engineering Approach to Analyzing the United States Code, Journal of Business & Technology Law 10 (2), 297–374.
The agglomeration of rules and regulations over time has produced a body of legal code that no single individual can fully comprehend. This complexity produces inefficiencies, makes the processes of understanding and changing the law difficult, and frustrates the fundamental principle that the law should provide fair notice to the governed. In this Article, we take a quantitative, unbiased, and software-engineering approach to analyze the evolution of the United States Code from 1926 to today. Software engineers frequently face the challenge of understanding and managing large, structured collections of instructions, directives, and conditional statements, and we adapt and apply their techniques to the U.S. Code over time. Our work produces insights into the structure of the U.S. Code as a whole, its strengths and vulnerabilities, and new ways of thinking about individual laws. For example, we identify the first appearance and spread of important terms in the U.S. Code like “whistleblower” and “privacy.” We also analyze and visualize the network structure of certain substantial reforms, including the Patient Protection and Affordable Care Act and the Dodd-Frank Wall Street Reform and Consumer Protection Act, and show how the interconnections of references can increase complexity and create the potential for unintended consequences. Our work is a timely illustration of computational approaches to law as the legal profession embraces technology for scholarship in order to increase efficiency and to improve access to justice.
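The network portion of this kind of analysis can be illustrated in a few lines: treat sections as nodes, cross-references as directed edges, and look at in-degree or PageRank to see which provisions other parts of the Code depend on most heavily. The toy example below uses invented section references and omits the parsing of actual U.S. Code text.

```python
# A minimal sketch of cross-reference network analysis on a toy set of
# statute sections; the sections and references below are invented.
import networkx as nx

# Directed edges: (citing section, cited section)
cross_refs = [
    ("26 USC 1", "26 USC 63"),
    ("26 USC 63", "26 USC 151"),
    ("42 USC 18001", "26 USC 5000A"),
    ("26 USC 5000A", "26 USC 1"),
    ("12 USC 5301", "15 USC 78c"),
]

G = nx.DiGraph()
G.add_edges_from(cross_refs)

# Sections that many other sections depend on (high in-degree / PageRank)
# are where amendments are most likely to ripple outward.
in_deg = dict(G.in_degree())
rank = nx.pagerank(G)
for section in sorted(G.nodes, key=rank.get, reverse=True):
    print(section, in_deg[section], round(rank[section], 3))
```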