Business Models to Cure Rare Disease: A Case Study of Solid Biosciences
with Esther Kim, Journal of Investment Management 14(2016), 87–101.
Duchenne muscular dystrophy (DMD) is a rare genetic disorder affecting thousands of individuals, mainly young males, worldwide. The disease currently has no cure and is fatal in all cases. Advances in our understanding of the disease and innovations in basic science have recently allowed biotechnology companies to pursue promising treatment candidates for the disease, but so far, only one drug with limited application has achieved FDA approval. In this case study, we profile the work of an early-stage life sciences company, Solid Biosciences, founded by a father of a young boy with DMD. In particular,
we discuss Solid’s one-disease focus and its strategy to treat the disease with a diversified portfolio of approaches. The company is currently building a product pipeline consisting of genetic interventions, small molecules and biologics, and assistive devices, each aimed at addressing a different aspect of DMD. We highlight the potential for Solid’s business model and portfolio to achieve breakthrough treatments for the DMD patient community.
Hedge fund holdings and stock market efficiency
with Charles Cao, Bing Liang, and Lubomir Petrasek, Review of Asset Pricing Studies, 2017, forthcoming.
We examine the relation between changes in hedge fund equity holdings and measures of informational efficiency of stock prices derived from intraday transactions as well as daily data. On average, hedge fund ownership of stocks leads to greater improvements in price efficiency than mutual fund or bank ownership, especially for stocks held by hedge funds with high portfolio turnover and superior security selection skills. However, stocks held by hedge funds experienced large declines in price efficiency in the last quarter of 2008, particularly if the funds were connected to Lehman Brothers as a prime broker and used leverage in combination with lenient redemption terms.
Use of Bayesian Decision Analysis to Minimize Harm in Patient-Centered Randomized Clinical Trials in Oncology
with Vahid Montazerhodjat, Shomesh E. Chaudhuri, and Daniel J. Sargent, JAMA Oncology (2017).
There is general agreement in the biomedical community that the development of therapies for certain diseases should take priority. This ethic has motivated
legislative initiatives, such as the Orphan Drug Act of 1983, and underpins several important innovations in regulatory approval processes, such as the US Food and Drug Administration’s (FDA) fast-track, breakthrough-therapy,
accelerated-approval, and priority-review designations. However, none of these innovations directly address the critical issue of how to incorporate the patient’s perspective in deciding whether a drug candidate should be approved or not.
The current approach in clinical trial design is to minimize the chance of ineffective treatment caused by a type 1 error, that is, a false-positive result. However, the arbitrary nature of the threshold for the probability of type 1 error, alpha, raises an ethical question about its justification. A 2.5% threshold may
not be appropriate for terminal illnesses that have no effective therapies; such patients may prefer to take a bigger chance on a false-positive result, even if the likelihood of an effective therapy is small. To quote the noted biostatistician
Donald Berry, “We should also focus on patient values, not just P values.”
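The ethical trade-off described here can be made concrete with a small Bayesian decision-analytic sketch: choose the significance threshold alpha that minimizes expected patient harm, weighing the harm of approving an ineffective drug against the harm of rejecting an effective one. This is an illustration of the general idea only, not the authors' model; the effect size, prior, and harm weights below are hypothetical.

```python
from statistics import NormalDist

N = NormalDist()

def expected_harm(alpha, effect=0.3, n=100, p_effective=0.2,
                  cost_fp=1.0, cost_fn=5.0):
    """Expected harm of a one-sided, fixed-sample z-test at level alpha.

    alpha       : type 1 error threshold
    effect      : standardized treatment effect under the alternative
    n           : patients per arm (hypothetical)
    p_effective : prior probability the therapy works
    cost_fp     : harm of approving an ineffective drug (false positive)
    cost_fn     : harm of rejecting an effective drug (false negative)
    All parameter values are illustrative, not from the paper.
    """
    z_crit = N.inv_cdf(1 - alpha)                        # rejection threshold
    power = 1 - N.cdf(z_crit - effect * (n / 2) ** 0.5)  # P(reject H0 | effective)
    return ((1 - p_effective) * alpha * cost_fp          # false-positive harm
            + p_effective * (1 - power) * cost_fn)       # false-negative harm

# Grid-search the harm-minimizing alpha. When the disease is severe and
# untreatable (high cost_fn), the optimum exceeds the conventional 2.5%.
alphas = [i / 1000 for i in range(1, 500)]
best_alpha = min(alphas, key=expected_harm)
```

Raising `cost_fn` relative to `cost_fp` shifts the optimal alpha upward, which is precisely the patient-centered argument: for terminal illnesses with no alternatives, a higher tolerance for false positives can minimize overall expected harm.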
Health, Wealth, and the 21st Century Cures Act
with Tomas Philipson and Andrew von Eschenbach, JAMA Oncology 2(2016), 17–18.
Americans are increasingly apprehensive about our future, so it is inspiring when Congress produces legislation intended to both enhance our health and expand our economy. The 21st Century Cures Act, recently passed by the House with an impressive bipartisan majority vote of 344 to 77, intends to accelerate the many-step process of drug discovery and development, from basic scientific research to clinical development to delivery, distribution, and ongoing monitoring. Among other things, the legislation boosts National Institutes of Health funding, dramatically speeds up the US Food and Drug Administration (FDA) approval process, and aims to make use of new information technology to better monitor the performance of medical products after they reach the market. This landmark bill now awaits a comparable piece of legislation being developed by the Senate Health, Education, Labor, and Pensions Committee. Together, they will transform the biomedical ecosystem and provide the foundation for the next several decades of innovative life-saving and health-enhancing solutions for our nation and the world.
TRC Networks and Systemic Risk
with Roger Stein, Journal of Alternative Investments 18(2016), 52–67.
The authors introduce a new approach to identifying and monitoring systemic risk that combines network analysis and tail risk contribution (TRC). Network analysis provides great flexibility in representing and exploring linkages between institutions, but it can be overly general in describing the risk exposures of one entity to another. TRC provides a more focused view of key systemic risks and richer financial intuition, but it may miss important linkages between financial institutions. Integrating these two methods can provide information on key relationships between institutions that may become relevant during periods of systemic stress. The authors demonstrate this approach using the exposures of money market funds to major financial institutions during July 2011. The results for their example suggest that TRC networks can highlight both institutions and funds that may become distressed during a financial crisis.
The Wisdom of Twitter Crowds: Predicting Stock Market Reactions to FOMC Meetings via Twitter Feeds
with Pablo D. Azar, Journal of Portfolio Management 42(2016), 123–134.
With the rise of social media, investors have a new tool for measuring sentiment in real time. However, the nature of these data sources raises serious questions about their quality. Because anyone on social media can participate in a conversation about markets—whether the individual is informed or not—these data may have very little information about future asset prices. In this article, the authors show that this is not the case. They analyze a recurring event that has a high impact on asset prices—Federal Open Market Committee (FOMC) meetings—and exploit a new dataset of tweets referencing the Federal Reserve. The authors show that the content of tweets can be used to predict future returns, even after controlling for common asset pricing factors. To gauge the economic magnitude of these predictions, the authors construct a simple hypothetical trading strategy based on these data. They find that a tweet-based asset allocation strategy outperforms several benchmarks—including a strategy that buys and holds a market index, as well as a comparable dynamic asset allocation strategy that does not use Twitter information.
Q Group Panel Discussion: Looking to the Future
with Martin Leibowitz, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel, Financial Analysts Journal
Moderator Martin Leibowitz asked a panel of industry experts—Andrew W. Lo, Robert C. Merton, Stephen A. Ross, and Jeremy Siegel—what they saw as the most important issues in finance, especially as those issues relate to practitioners. Drawing on their vast knowledge, these panelists addressed topics such as regulation, technology, and financing society’s challenges; opacity and trust; the social value of finance; and future expected returns.
Buying Cures Versus Renting Health: Financing Health Care with Consumer Loans
with Vahid Montazerhodjat and David M. Weinstock, Science Translational Medicine 8(2016), 327ps6.
A crisis is building over the prices of new transformative therapies for cancer, hepatitis C virus infection, and rare diseases. The clinical imperative is to offer these therapies as broadly and rapidly as possible. We propose a practical way to increase drug affordability through health care loans (HCLs)—the equivalent of mortgages for large health care expenses. HCLs allow patients in both multipayer and single-payer markets to access a broader set of therapeutics, including expensive short-duration treatments that are curative. HCLs also link payment to clinical benefit and should help lower per-patient cost while incentivizing the development of transformative therapies rather than those that offer small incremental advances. Moreover, we propose the use of securitization—a well-known financial engineering method—to finance a large diversified pool of HCLs through both debt and equity. Numerical simulations suggest that securitization is viable for a wide range of economic environments and cost parameters, allowing a much broader patient population to access transformative therapies while also aligning the interests of patients, payers, and the pharmaceutical industry.
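The mortgage analogy can be made concrete with the standard annuity formula: a level monthly payment that amortizes a large up-front therapy price over a fixed term. The price, rate, and term below are illustrative assumptions, not figures from the paper.

```python
def hcl_payment(principal, annual_rate, years):
    """Level monthly payment on a health care loan (standard annuity
    formula), turning a large one-time cure price into installments.

    principal   : up-front cost of the therapy
    annual_rate : nominal annual interest rate (e.g., 0.091 for 9.1%)
    years       : loan term in years
    """
    r = annual_rate / 12          # monthly rate
    n = years * 12                # number of payments
    if r == 0:
        return principal / n      # zero-interest edge case
    return principal * r / (1 - (1 + r) ** -n)

# Illustrative only: a $500,000 curative therapy financed over 9 years
# at a 9.1% annual rate.
monthly = hcl_payment(500_000, 0.091, 9)
```

Tying such payments to continued clinical benefit (e.g., forgiving the balance if the cure fails) is what distinguishes the HCL proposal from an ordinary consumer loan.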
What Is An Index?
Journal of Portfolio Management 42(2016), 21–36.
Technological advances in telecommunications, securities exchanges, and algorithmic trading have facilitated a host of new investment products that resemble theme-based passive indexes but depart from traditional market-cap-weighted portfolios. I propose broadening the definition of an index using a functional perspective—any portfolio strategy that satisfies three properties should be considered an index: (1) it is completely transparent; (2) it is investable; and (3) it is systematic, i.e., it is entirely rules-based and contains no judgment or unique investment skill. Portfolios satisfying these properties that are not market-cap-weighted are given a new name: “dynamic indexes.” This functional definition widens the universe of possibilities and, most importantly, decouples risk management from alpha generation. Passive strategies can and should be actively risk managed, and I provide a simple example of how this can be achieved. Dynamic indexes also create new challenges, of which the most significant is backtest bias, and I conclude with a proposal for managing this risk.
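As a hypothetical illustration of this functional definition, consider a volatility-target rule that scales exposure to a market index by the ratio of a target volatility to trailing realized volatility. The rule is transparent, investable, and fully mechanical, so it qualifies as a "dynamic index" even though it is not market-cap-weighted. The rule and its parameters are illustrative assumptions, not the example given in the article.

```python
def dynamic_index_weights(returns, target_vol=0.10, window=21,
                          max_leverage=1.5):
    """Rules-based daily exposure to a market index: transparent,
    investable, and systematic (no discretionary judgment), hence an
    'index' under the functional definition. Parameters are illustrative.

    returns : list of daily index returns
    Returns a list of same length giving each day's exposure
    (fraction of capital allocated to the index).
    """
    daily_target = target_vol / (252 ** 0.5)   # annual -> daily vol
    weights = []
    for t in range(len(returns)):
        if t < window:
            weights.append(1.0)                # full exposure until enough history
            continue
        recent = returns[t - window:t]
        mean = sum(recent) / window
        realized = (sum((r - mean) ** 2 for r in recent) / window) ** 0.5
        w = daily_target / realized if realized > 0 else max_leverage
        weights.append(min(w, max_leverage))   # same rule every day: systematic
    return weights
```

Because every step is disclosed and mechanical, the strategy satisfies all three properties even though its market exposure varies over time, which is exactly the sense in which a passive strategy can be actively risk managed.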
Law Is Code: A Software Engineering Approach to Analyzing the United States Code
with William Li, Pablo Azar, David Larochelle, and Phil Hill, Journal of Business & Technology Law 10(2015), 297–374.
The agglomeration of rules and regulations over time has produced a body of legal code that no single individual can fully comprehend. This complexity produces inefficiencies, makes the processes of understanding and changing the law difficult, and frustrates the fundamental principle that the law should provide fair notice to the governed. In this Article, we take a quantitative, unbiased, and software-engineering approach to analyze the evolution of the United States Code from 1926 to today. Software engineers frequently face the challenge of understanding and managing large, structured collections of instructions, directives, and conditional statements, and we adapt and apply their techniques to the U.S. Code over time. Our work produces insights into the structure of the U.S. Code as a whole, its strengths and vulnerabilities, and new ways of thinking about individual laws. For example, we identify the first appearance and spread of important terms in the U.S. Code like “whistleblower” and “privacy.” We also analyze and visualize the network structure of certain substantial reforms, including the Patient Protection and Affordable Care Act and the Dodd-Frank Wall Street Reform and Consumer Protection Act, and show how the interconnections of references can increase complexity and create the potential for unintended consequences. Our work is a timely illustration of computational approaches to law as the legal profession embraces technology for scholarship in order to increase efficiency and to improve access to justice.
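The flavor of this network analysis can be sketched in miniature: treat sections of a legal code as nodes and cross-references as directed edges, then look for heavily cited sections and mutual citations. The section numbers, edge list, and helper functions below are hypothetical, not the authors' actual pipeline or data.

```python
from collections import defaultdict

# Hypothetical cross-references: (citing section, cited section)
references = [
    ("26 USC 1", "26 USC 63"),
    ("26 USC 63", "26 USC 151"),
    ("26 USC 151", "26 USC 63"),
    ("42 USC 18091", "26 USC 5000A"),
]

def in_degree(refs):
    """Count how often each section is cited. Heavily cited sections are
    points where an amendment can ripple through the rest of the code."""
    counts = defaultdict(int)
    for _, cited in refs:
        counts[cited] += 1
    return dict(counts)

def mutual_citations(refs):
    """Pairs of sections that cite each other: a simple proxy for the
    tight coupling that makes unintended consequences more likely."""
    edges = set(refs)
    return sorted({tuple(sorted(e)) for e in edges if (e[1], e[0]) in edges})
```

On real data, the same two measures applied to a reform's inserted sections would show how a single act can knit previously distant parts of the code together.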