Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: What could a policy banning AGI look like?, published by TsviBT on March 13, 2024 on LessWrong.
[Caveat lector: I know roughly nothing about policy!]
Suppose that there were political support to really halt research that might lead to an unstoppable, unsteerable transfer of control over the lightcone from humans to AGIs. What government policy could exert that political value?
[That does sound relaxing.]
Banning AGI research specifically
This question is NOT ASKING ABOUT GENERALLY SLOWING DOWN AI-RELATED ACTIVITY. The question is specifically about what it could look like to ban (or rather, impose an indefinite moratorium on) research that is aimed at creating artifacts that are more capable in general than humanity.
So "restrict chip exports to China" or "require large vector processing clusters to submit to inspections" or "require evals for commercialized systems" don't answer the question.
The question is NOT LIMITED to policies that would be actually practically enforceable by their letter. Making AGI research illegal would slow it down, even if the ban is physically evadable; researchers generally want to think publishable thoughts, and generally want to plausibly be doing something good or neutral by their society's judgement.
If the FBI felt they had a mandate to investigate AGI attempts, even if they would have to figure out some only-sorta-related crime to actually charge, maybe that would also chill AGI research. The question is about making the societal value of "let's not build this for now" be exerted in the most forceful and explicit form that's feasible.
Some sorts of things that would more directly address the question (in the following, replace "AGI" with "computer programs that learn, perform tasks, or answer questions in full generality", or something else that could go in a government policy):
Make it illegal to write AGIs.
Make it illegal to pay someone if the job description explicitly talks about making AGIs.
Make it illegal to conspire to write AGIs.
Why ask this?
I've asked this question of several (5-10) people, some of whom know something about policy and have thought about policies that would decrease AGI X-risk. All of them said they had not thought about this question. I think they mostly viewed it as not a very salient question because there isn't political support for such a ban. Maybe the possibility has been analyzed somewhere that I haven't seen; links?
But I'm still curious because:
I just am. Curious, I mean.
Maybe there will be support later, at which point it would be good to have already mostly figured out a policy that would actually delay AGI for decades.
Maybe having a clearer proposal would crystallize more political support, for example by having something more concrete to rally around, and by having something for AGI researchers "locked in races" to coordinate on as an escape from the race.
Maybe having a clearer proposal would allow people who want to do non-AGI AI research to build social niches for non-AGI AI research, and thereby be less bluntly opposed to regulation on AGI specifically.
[other benefits of clarity]
Has anyone really been far even as decided to use?
There are a lot of problems with an "AGI ban" policy like this. I'm wondering, though, which problems, if any, are really dealbreakers.
For example, one problem is: How do you even define what "AGI" or "trying to write an AGI" is? I'm wondering how much this is actually a problem, though. As a layman, as far as I know there could be existing government policies that are somewhat comparably difficult to evaluate. Many judicial decisions related to crimes, as I vaguely understand it, depend on intentionality and belief; e.g.
for a killing to be a murder, the killer must have intended to kill and must not have believed on reasonable grounds that zer life was in imminent danger...