The Nonlinear Library: EA Forum
EA - Report on the Desirability of Science Given New Biotech Risks by Matt Clancy
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Report on the Desirability of Science Given New Biotech Risks, published by Matt Clancy on January 17, 2024 on The Effective Altruism Forum.

Should we seek to make our scientific institutions more effective? On the one hand, rising material prosperity has so far been largely attributable to scientific and technological progress. On the other hand, new scientific capabilities also expand our powers to cause harm. Last year I wrote a report on this issue, "The Returns to Science in the Presence of Technological Risks." The report focuses specifically on the net social impact of science when we take into account the potential abuses of new biotechnology capabilities, in addition to benefits to health and income.

The main idea of the report is to develop an economic modeling framework that lets us tally up the benefits of science and weigh them against future costs. To model costs, I start with the assumption that, at some future point, a "time of perils" commences, wherein new scientific capabilities can be abused and lead to an increase in human mortality (possibly even human extinction).

In this modeling framework, we can ask whether we would prefer an extra year of science, with all the benefits it brings, or an extra year's delay to the onset of this time of perils. Delay is good in this model, because there is some chance we won't end up having to go through the time of perils at all.

I rely on historical trends to estimate the plausible benefits of science. To calibrate the risks, I use various forecasts made in the Existential Risk Persuasion Tournament, which asked a large number of superforecasters and domain experts several questions closely related to the concerns of this report.
So you can think of the model as helping assess whether the historical benefits of science outweigh one set of reasonable (in my view) forecasts of risks.

What's the upshot? From the report's executive summary:

A variety of forecasts about the potential harms from advanced biotechnology suggest the crux of the issue revolves around civilization-ending catastrophes. Forecasts of other kinds of problems arising from advanced biotechnology are too small to outweigh the historic benefits of science. For example, if the expected increase in annual mortality due to new scientific perils is less than 0.2-0.5% per year (and there is no risk of civilization-ending catastrophes from science), then in this report's model, the benefits of science will outweigh the costs.

I argue the best available forecasts of this parameter, from a large number of superforecasters and domain experts in dialogue with each other during the recent Existential Risk Persuasion Tournament, are much smaller than these break-even levels. I show this result is robust to various assumptions about the future course of population growth and the health effects of science, the timing of the new scientific dangers, and the potential for better science to reduce risks (despite accelerating them).

On the other hand, once we consider the more remote but much more serious possibility that faster science could derail advanced civilization, the case for science becomes considerably murkier.
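The break-even logic described above can be sketched as a toy calculation: weigh the welfare gained from one extra year of science against the expected cost of one extra year of exposure to a possible time of perils. This is a minimal illustration of the trade-off, not the report's actual model; all parameter names and numbers below are illustrative assumptions.

```python
def net_value(benefit, peril_prob, extra_mortality, welfare_per_mortality_unit):
    """Expected net value (arbitrary welfare units) of accelerating
    science by one year. Illustrative sketch, not the report's model.

    benefit: welfare gained from one extra year of scientific progress
    peril_prob: chance the time of perils actually materializes
    extra_mortality: added annual mortality during the perils (a fraction)
    welfare_per_mortality_unit: welfare lost per unit of added mortality
    """
    expected_cost = peril_prob * extra_mortality * welfare_per_mortality_unit
    return benefit - expected_cost


def break_even_mortality(benefit, peril_prob, welfare_per_mortality_unit):
    """Mortality increase at which benefits exactly offset expected costs."""
    return benefit / (peril_prob * welfare_per_mortality_unit)


# With illustrative numbers: a 50% chance the perils materialize, and
# welfare losses scaling linearly with added mortality.
m_star = break_even_mortality(benefit=1.0, peril_prob=0.5,
                              welfare_per_mortality_unit=100.0)
# Mortality increases below m_star make acceleration net-positive,
# increases above it make acceleration net-negative.
print(net_value(1.0, 0.5, m_star / 2, 100.0) > 0)
print(net_value(1.0, 0.5, m_star * 2, 100.0) < 0)
```

The report's headline result has this shape: the forecast mortality increases (absent civilization-ending catastrophes) fall well below the break-even level, so the benefits side dominates.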
In this case, the desirability of accelerating science likely depends on the expected value of the long-run future, as well as on whether we put more weight on the forecasts of the superforecasters or of the domain experts in the Existential Risk Persuasion Tournament.

These forecasts differ substantially: I estimate domain expert forecasts for annual mortality risk are 20x superforecaster estimates, and domain expert forecasts for annual extinction risk are 140x superforecaster estimates. The domain expert forecasts are high enough, for example, that if we think the future is "worth" more than 400 years of current social welfare, in one version of my mode...