The Nonlinear Library: EA Forum
EA - What do staff at CEA believe? (Evidence from a rough cause prio survey from April) by Lizka
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: What do staff at CEA believe? (Evidence from a rough cause prio survey from April), published by Lizka on October 2, 2023 on The Effective Altruism Forum.

In April, I ran a small and fully anonymous cause prioritization survey of CEA staff members at a CEA strategy retreat. I got 31 responses (out of around 40 people), and I'm summarizing the results here, as it seems that people sometimes have incorrect beliefs about "what CEA believes." (I don't think the results are very surprising, though.)

Important notes and caveats:
- I put this survey together pretty quickly, and I wasn't aiming to use it for a public writeup like this; rather, I wanted to check how comfortable staff are talking about cause prioritization, start conversations among staff, and test some personal theories. (I also analyzed it quickly.) In many cases, I regret how questions were set up, but I was in a rush and am going with what I have in order to share something - please treat these conclusions as quite rough.
- For many questions, I let people select multiple answers. This sometimes produced slightly unintuitive or hard-to-parse results; numbers often don't add up unless you take this into account. (Generally, I think the answers aren't self-contradictory once this is taken into account.) Sometimes people could also input their own answers.
- People's views might have changed since April, and the team's composition has changed.
- I didn't ask for any demographic information (including things like "Which team are you on?").
- I also asked some free-response questions, but I haven't included them here.

Rough summary of the results:
- Approach to cause prioritization: Most people at CEA care about doing some of their own cause prioritization, although most don't try to build up the bulk of their cause prioritization on their own.
- Approach to morality: About a third of respondents said that they're "very consequentialist," and many said that they "lean consequentialist for decisions like what their projects should work on, but have a more mundane approach to daily life." Many also said that they're "big fans of moral uncertainty."
- Which causes should be "key priorities for EA": People generally selected many causes (the median was 5), and most selected a fairly broad range. Two (of 30) respondents didn't choose any causes not commonly classified as "longtermist/x-risk-focused" (everyone else chose at least one). The top selections were: Mitigating existential risk, broadly (27), AI existential security (26), Biosecurity (global catastrophic risk focus) (25), Farmed animal welfare (22), Global health (21), Other existential or global catastrophic risk (15), Wild animal welfare (11), and Generically preparing for pandemics (8). (Other options on the list were Mental health, Climate change, Raising the sanity waterline / un-targeted improving institutional decision-making, Economic growth, and Electoral reform.)

Some highlights from more granular questions:
- Most people selected "I think reducing extinction risks should be a key priority (of EA/CEA)" (27). Many selected "I think improving how the long-run future goes should be a key priority (of EA/CEA)" (17) and "I think future generations matter morally, but it's hard to affect them" (13).
- Most people selected "I think AI existential risk reduction should be a top priority for EA/CEA" (23), and many selected "I want to learn more in order to form my views and/or stop deferring as much" (17) and "I think AI is the single biggest issue humanity is facing right now" (13). (Some people also selected answers like "I'm worried about misuse of AI (bad people/governments, etc.), but misalignment etc. seems mostly unrealistic" and "I feel like it's important, but transformative developments / x-risk are decades away.")
- Most people (22) selected at least one o...