The Nonlinear Library: EA Forum
EA - Exaggerating the risks (Part 13: Ord on Biorisk) by Vasco Grilo
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Exaggerating the risks (Part 13: Ord on Biorisk), published by Vasco Grilo on December 31, 2023 on The Effective Altruism Forum.

This is a crosspost of Exaggerating the risks (Part 13: Ord on Biorisk), as published by David Thorstad on 29 December 2023.

"This massive democratization of technology in biological sciences … is at some level fantastic. People are very excited about it. But this has this dark side, which is that the pool of people that could include someone who has … omnicidal tendencies grows many, many times larger, thousands or millions of times larger as this technology is democratized, and you have more chance that you get one of these people with this very rare set of motivations where they're so misanthropic as to try to cause … worldwide catastrophe."
Toby Ord, 80,000 Hours Interview

Listen to this post [there is an option for this in the original post]

1. Introduction

This is Part 13 of my series Exaggerating the risks. In this series, I look at some places where leading estimates of existential risk look to have been exaggerated.

Part 1 introduced the series. Parts 2-5 (sub-series: "Climate risk") looked at climate risk. Parts 6-8 (sub-series: "AI risk") looked at the Carlsmith report on power-seeking AI.

Parts 9, 10 and 11 began a new sub-series on biorisk. In Part 9, we saw that many leading effective altruists give estimates between 1.0% and 3.3% for the risk of existential catastrophe from biological causes by 2100. I think these estimates are a bit too high.

Because I have had a hard time getting effective altruists to tell me directly what the threat is supposed to be, my approach was first to survey the reasons why many biosecurity experts, public health experts, and policymakers are skeptical of high levels of near-term existential biorisk. Parts 9, 10 and 11 gave a dozen preliminary reasons for doubt, surveyed at the end of Part 11.

The second half of my approach is to show that initial arguments by effective altruists do not overcome the case for skepticism. Part 12 examined a series of risk estimates by Piers Millett and Andrew Snyder-Beattie. We saw, first, that many of these estimates are orders of magnitude lower than those returned by leading effective altruists and, second, that Millett and Snyder-Beattie provide little in the way of credible support for even these estimates.

Today's post looks at Toby Ord's arguments in The Precipice for high levels of existential risk. Ord estimates the risk of irreversible existential catastrophe by 2100 from naturally occurring pandemics at 1/10,000, and the risk from engineered pandemics at a whopping 1/30. That is a very high number. In this post, I argue that Ord does not provide sufficient support for either of his estimates.

2. Natural pandemics

Ord begins with a discussion of natural pandemics. I don't want to spend too much time on this issue, since Ord takes the risk of natural pandemics to be much lower than that of engineered pandemics. At the same time, it is worth asking how Ord arrives at a risk of 1/10,000.

Effective altruists rightly stress that humans have trouble understanding how large certain future-related quantities can be. For example, there might be 10^20, 10^50 or even 10^100 future humans. However, effective altruists do not equally stress how small future-related probabilities can be.
Risk probabilities can be on the order of 10^-2 or even 10^-5, but they can also be a great deal lower than that: for example, 10^-10, 10^-20, or 10^-50 [for example, a terrorist attack causing human extinction is astronomically unlikely on priors]. Most events pose existential risks of this magnitude or lower, so if Ord wants us to accept that natural pandemics have a 1/10,000 chance of leading to irreversible existential catastrophe by 2100, Ord owes us a solid argument for this conclusion. It ...