Q Group Panel Discussion: Looking to the Future
with Martin Leibowitz, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel, Financial Analysts Journal
Moderator Martin Leibowitz asked a panel of industry experts—Andrew W. Lo, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel—what they saw as the most important issues in finance, especially as those issues relate to practitioners. Drawing on their vast knowledge, these panelists addressed topics such as regulation, technology, and financing society’s challenges; opacity and trust; the social value of finance; and future expected returns.
Buying Cures Versus Renting Health: Financing Health Care with Consumer Loans
with Vahid Montazerhodjat and David M. Weinstock, Science Translational Medicine
A crisis is building over the prices of new transformative therapies for cancer, hepatitis C virus infection, and rare diseases. The clinical imperative is to offer these therapies as broadly and rapidly as possible. We propose a practical way to increase drug affordability through health care loans (HCLs)—the equivalent of mortgages for large health care expenses. HCLs allow patients in both multipayer and single-payer markets to access a broader set of therapeutics, including expensive short-duration treatments that are curative. HCLs also link payment to clinical benefit and should help lower per-patient cost while incentivizing the development of transformative therapies rather than those that offer small incremental advances. Moreover, we propose the use of securitization—a well-known financial engineering method—to finance a large diversified pool of HCLs through both debt and equity. Numerical simulations suggest that securitization is viable for a wide range of economic environments and cost parameters, allowing a much broader patient population to access transformative therapies while also aligning the interests of patients, payers, and the pharmaceutical industry.
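The diversification argument behind securitizing a loan pool can be sketched in a few lines. This is a minimal toy simulation, not the paper's model: all parameters (repayment probability, recovery rate, tranche size) are illustrative, and loan outcomes are assumed independent.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative parameters (not taken from the paper): each loan repays
# its face value of 1 with probability 0.85 and recovers 0.40 in default.
# The senior (debt) tranche is promised 70% of the pool's face value;
# equity absorbs losses first.
p_repay, recovery, senior_frac, trials = 0.85, 0.40, 0.70, 10_000

def senior_default_rate(n_loans):
    """Fraction of simulated pools whose value falls below the senior claim."""
    repaid = rng.random((trials, n_loans)) < p_repay
    pool_value = np.where(repaid, 1.0, recovery).mean(axis=1)
    return (pool_value < senior_frac).mean()

rates = {n: senior_default_rate(n) for n in (5, 50, 1000)}
for n, r in rates.items():
    print(f"{n:>5} loans: senior tranche default rate {r:.4f}")
```

As the pool grows, the senior tranche's default rate falls toward zero even though each individual loan is risky, which is why debt financing of a large diversified pool can be viable.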
Financing Drug Discovery via Dynamic Leverage
with Vahid Montazerhodjat and John J. Frishkopf, Drug Discovery Today
We extend the megafund concept for funding drug discovery to enable dynamic leverage in which the portfolio of candidate therapeutic assets is predominantly financed initially by equity, and debt is introduced gradually as assets mature and begin generating cash flows. Leverage is adjusted so as to maintain an approximately constant level of default risk throughout the life of the fund. Numerical simulations show that applying dynamic leverage to a small portfolio of orphan drug candidates can boost the return on equity almost twofold compared with securitization with a static capital structure. Dynamic leverage can also add significant value to comparable all-equity-financed portfolios, enhancing the return on equity without jeopardizing debt performance or increasing risk to equity investors.
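The mechanism of dynamic leverage can be illustrated with a stylized calculation: hold the per-period default probability fixed, and debt capacity rises as asset risk falls. This sketch assumes the portfolio value follows geometric Brownian motion; all numbers are illustrative, not the paper's.

```python
import math
from statistics import NormalDist

def debt_capacity(value, mu, sigma, dt, target_pd):
    """Largest debt face F with P(V_{t+dt} < F) = target_pd when the
    portfolio value V follows geometric Brownian motion."""
    z = NormalDist().inv_cdf(target_pd)
    return value * math.exp((mu - 0.5 * sigma**2) * dt
                            + sigma * math.sqrt(dt) * z)

# Illustrative numbers: as trial results arrive and assets mature, the
# portfolio's volatility falls, so debt capacity at a fixed 1% default
# target rises -- leverage is introduced gradually, as in the megafund.
value, mu, dt, target_pd = 100.0, 0.10, 1.0, 0.01
sigmas = (0.80, 0.50, 0.30, 0.15)
capacities = [debt_capacity(value, mu, s, dt, target_pd) for s in sigmas]
for s, f in zip(sigmas, capacities):
    print(f"sigma={s:.2f}  max debt/value={f / value:.2f}")
```

The declining volatility path stands in for maturing drug candidates; a full model would tie it to clinical-trial milestones and cash flows.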
What Is an Index?
Journal of Portfolio Management
Technological advances in telecommunications, securities exchanges, and algorithmic trading have facilitated a host of new investment products that resemble theme-based passive indexes but depart from traditional market-cap-weighted portfolios. I propose broadening the definition of an index using a functional perspective—any portfolio strategy that satisfies three properties should be considered an index: (1) it is completely transparent; (2) it is investable; and (3) it is systematic, i.e., it is entirely rules-based and contains no judgment or unique investment skill. Portfolios satisfying these properties that are not market-cap-weighted are given a new name: “dynamic indexes.” This functional definition widens the universe of possibilities and, most importantly, decouples risk management from alpha generation. Passive strategies can and should be actively risk managed, and I provide a simple example of how this can be achieved. Dynamic indexes also create new challenges, of which the most significant is backtest bias, and I conclude with a proposal for managing this risk.
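A volatility-targeted index is one simple example of a transparent, investable, rules-based strategy that is actively risk managed: scale exposure to the underlying index so that trailing volatility stays near a target, holding the remainder in cash. The sketch below uses simulated returns and illustrative parameters; it is not the example from the article.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical daily returns standing in for a cap-weighted index.
returns = rng.normal(0.0003, 0.012, size=750)

target_vol = 0.10 / np.sqrt(252)   # 10% annualized volatility target
window = 60                        # trailing estimation window (days)

weights = np.ones_like(returns)    # fraction held in the index; rest in cash
for t in range(window, len(returns)):
    realized = returns[t - window:t].std()
    weights[t] = min(1.0, target_vol / realized)  # fully rules-based

index_returns = weights * returns  # cash assumed to earn zero

raw_vol = returns[window:].std() * np.sqrt(252)
managed_vol = index_returns[window:].std() * np.sqrt(252)
print(f"raw vol {raw_vol:.1%}  risk-managed vol {managed_vol:.1%}")
```

The rule is completely transparent and systematic, so under the functional definition the risk-managed portfolio still qualifies as an index.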
Law Is Code: A Software Engineering Approach to Analyzing the United States Code
with William Li, Pablo Azar, David Larochelle, Phil Hill, Journal of Business & Technology Law
The agglomeration of rules and regulations over time has produced a body of legal code that no single individual can fully comprehend. This complexity produces inefficiencies, makes the processes of understanding and changing the law difficult, and frustrates the fundamental principle that the law should provide fair notice to the governed. In this Article, we take a quantitative, unbiased, and software-engineering approach to analyze the evolution of the United States Code from 1926 to today. Software engineers frequently face the challenge of understanding and managing large, structured collections of instructions, directives, and conditional statements, and we adapt and apply their techniques to the U.S. Code over time. Our work produces insights into the structure of the U.S. Code as a whole, its strengths and vulnerabilities, and new ways of thinking about individual laws. For example, we identify the first appearance and spread of important terms in the U.S. Code like “whistleblower” and “privacy.” We also analyze and visualize the network structure of certain substantial reforms, including the Patient Protection and Affordable Care Act and the Dodd-Frank Wall Street Reform and Consumer Protection Act, and show how the interconnections of references can increase complexity and create the potential for unintended consequences. Our work is a timely illustration of computational approaches to law as the legal profession embraces technology for scholarship in order to increase efficiency and to improve access to justice.
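The network-of-references idea can be made concrete in a few lines: extract cross-references from section text and measure how often each section is cited. The fragment below uses invented toy sections, not actual statutory language, and a deliberately simple reference pattern.

```python
import re
from collections import defaultdict

# Toy fragments standing in for U.S. Code sections (hypothetical text,
# not actual statutory language); cross-references read "section N".
sections = {
    101: "definitions apply as provided in section 102 and section 301",
    102: "coverage rules subject to section 301",
    301: "enforcement provisions",
}

# Build the directed reference graph: section -> sections it cites.
refs = defaultdict(list)
for sec, text in sections.items():
    for target in re.findall(r"section (\d+)", text):
        refs[sec].append(int(target))

# In-degree (how often a section is referenced) is a rough proxy for the
# interconnection the article measures at the scale of the full Code.
in_degree = defaultdict(int)
for sec, targets in refs.items():
    for t in targets:
        in_degree[t] += 1
print(dict(in_degree))
```

At scale, the same graph supports the visualizations and complexity measures described above; real statutory citations, of course, require a far more careful parser.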
Reply to “(Im)Possible Frontiers: A Comment”
with Thomas J. Brennan, Critical Finance Review
In Brennan and Lo (2010), a mean-variance efficient frontier is defined as “impossible” if every portfolio on that frontier has negative weights, which is incompatible with the Capital Asset Pricing Model (CAPM) requirement that the market portfolio is mean-variance efficient. We prove that as the number of assets n grows, the probability that a randomly chosen frontier is impossible tends to one at a geometric rate, implying that the set of parameters for which the CAPM holds is extremely rare. Levy and Roll (2014) argue that while such “possible” frontiers are rare, they are ubiquitous. In this reply, we show that this is not the case; possible frontiers are not only rare, but they occupy an isolated region of mean-variance parameter space that becomes increasingly remote as n increases. Ingersoll (2014) observes that parameter restrictions can rule out impossible frontiers, but in many cases these restrictions contradict empirical fact and must be questioned rather than blindly imposed.
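The geometric-rate result can be checked informally by simulation. By two-fund separation, every frontier portfolio is a combination of the two funds proportional to Σ⁻¹1 and Σ⁻¹μ, so scanning combinations on a grid gives an approximate (grid-based, hence conservative) test for impossibility. The parameter distributions below are arbitrary choices for illustration, not those analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(-5.0, 5.0, 2001)

def frontier_is_impossible(mu, cov):
    """Approximate check: scan two-fund combinations on a grid and call
    the frontier impossible if none is fully nonnegative."""
    ones = np.ones(len(mu))
    w1 = np.linalg.solve(cov, ones); w1 /= w1.sum()
    w2 = np.linalg.solve(cov, mu);   w2 /= w2.sum()
    W = np.outer(grid, w1) + np.outer(1.0 - grid, w2)
    return not (W >= 0).all(axis=1).any()

fractions = {}
for n in (2, 5, 10, 20):
    trials = 200
    hits = 0
    for _ in range(trials):
        mu = rng.normal(0.08, 0.05, n)          # random expected returns
        A = rng.normal(size=(n, n))             # random covariance matrix
        cov = A @ A.T / n + 0.01 * np.eye(n)
        hits += frontier_is_impossible(mu, cov)
    fractions[n] = hits / trials
    print(f"n={n:>2}: fraction impossible ~ {fractions[n]:.2f}")
```

As n grows, the fraction of randomly drawn frontiers flagged impossible climbs rapidly toward one, consistent with the paper's geometric-rate result.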
Funding Translational Medicine via Public Markets: The Business Development Company
with Sandra M. Forman, Monica Shilling, Grace K. Sweeney, Journal of Investment Management
A business development company (BDC) is a type of closed-end investment fund with certain relaxed requirements that allow it to raise money in the public equity and debt markets. It can be used to fund multiple early-stage biomedical ventures, using financial diversification to de-risk translational medicine. By electing to be a “Regulated Investment Company” for tax purposes, a BDC can avoid double taxation on income and net capital gains distributed to its shareholders. BDCs are ideally suited for long-term investors in biomedical innovation, including: (i) investors with biomedical expertise who understand the risks of the FDA approval process, (ii) “banking entities,” now prohibited from investing in hedge funds and private equity funds by the Volcker Rule, but who are permitted to invest in BDCs, subject to certain restrictions, and (iii) retail investors, who traditionally have had to invest in large pharmaceutical companies to gain exposure to similar assets. We describe the history of BDCs, summarize the requirements for creating and managing them, and conclude with a discussion of the advantages and disadvantages of the BDC structure for funding biomedical innovation.
Hedge Funds: A Dynamic Industry In Transition
with Mila Getmansky and Peter A. Lee
The hedge-fund industry has grown rapidly over the past two decades, offering investors unique investment opportunities that often reflect more complex risk exposures than those of traditional investments. In this article, we present a selective review of the recent academic literature on hedge funds as well as updated empirical results for this industry. Our review is written from several distinct perspectives: the investor’s, the portfolio manager’s, the regulator’s, and the academic’s. Each of these perspectives offers a different set of insights into the financial system, and the combination provides surprisingly rich implications for the Efficient Markets Hypothesis, investment management, systemic risk, financial regulation, and other aspects of financial theory and practice.
A Survey of Systemic Risk Analytics
with Dimitrios Bisias, Mark Flood, and Stavros Valavanis, Annual Review of Financial Economics 4, 255-296
We provide a survey of 31 quantitative measures of systemic risk in the economics and finance literature, chosen to span key themes and issues in systemic risk measurement and management. We motivate these measures from the supervisory, research, and data perspectives in the main text, and present concise definitions of each risk measure--including required inputs, expected outputs, and data requirements--in an extensive appendix. To encourage experimentation and innovation among as broad an audience as possible, we have developed open-source MATLAB code for most of the analytics surveyed, which can be accessed through the Office of Financial Research (OFR).
A Computational View of Market Efficiency
with Jasmina Hasanhodzic and Emanuele Viola, Quantitative Finance 7, 1043-1050
We propose to study market efficiency from a computational viewpoint. Borrowing from theoretical computer science, we define a market to be efficient with respect to resources S (e.g., time, memory) if no strategy using resources S can make a profit. As a first step, we consider memory-m strategies whose action at time t depends only on the m previous observations at times t - m,...,t - 1. We introduce and study a simple model of market evolution, where strategies impact the market by their decision to buy or sell. We show that the effect of optimal strategies using memory m can lead to "market conditions" that were not present initially, such as (1) market bubbles and (2) the possibility for a strategy using memory m' > m to make a bigger profit than was initially possible. We propose this as a framework for rationalizing the technological arms race of quantitative trading firms.
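A memory-m strategy and the advantage of longer memory can be illustrated on a toy deterministic price series. This sketch ignores the market-impact mechanism of the paper's model; it only shows that the in-hindsight optimal lookup-table strategy with memory 2 extracts far more profit than one with memory 1 on a period-3 pattern. The example is invented for illustration.

```python
# Toy market: a repeating up/up/down pattern of unit price moves.
moves = [1, 1, -1] * 40

def best_memory_m_profit(moves, m):
    """Profit of the optimal memory-m lookup-table strategy in hindsight:
    for each length-m history, go long (+1) or short (-1) whichever sign
    matches the net subsequent move for that history."""
    net = {}
    for t in range(m, len(moves)):
        history = tuple(moves[t - m:t])
        net[history] = net.get(history, 0) + moves[t]
    # Choosing the best fixed action per history earns |net move|.
    return sum(abs(v) for v in net.values())

for m in (1, 2, 3):
    print(f"memory {m}: profit {best_memory_m_profit(moves, m)}")
```

With memory 1 the up/up and up/down cases share one history and mostly cancel, while memory 2 distinguishes them and captures nearly every move, mirroring the paper's point that a longer-memory strategy can profit where a shorter-memory one cannot.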