Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: General Thoughts on Secular Solstice, published by Jeffrey Heninger on March 24, 2024 on LessWrong.
I attended Secular Solstice in Berkeley last December.
My perspective is quite unusual: I live in a rationalist group house and work at an AI safety office, but I also am a Christian and attend church every week.[1]
I was originally not planning on going to Solstice, but a decent number of people (~5) told me in person that they would be particularly interested in my opinions of it. I realized that I was interested in learning what I would think of it too, so I went.
I took notes on my thoughts throughout the service.[2] This blog post is my broader thoughts on the experience. I also have blog posts for a fun little correction to one of the songs and my detailed notes & commentary.
Overarching Narrative
I do not agree with the overarching narrative presented at Solstice.
There is a narrative in my tradition about people becoming humble and turning to God. You can choose to be humble or you can be "compelled to be humble" by the difficult circumstances in life. I'm not super fond of this description because being humble and turning to God is always a choice. But there is some truth in it: many people do find themselves relying on God more and developing a deeper relationship with Him through the more difficult times in their lives.
The overarching narrative of Solstice felt like a transmogrified version of being compelled to be humble. The descent into darkness recognizes the problems of the human condition. Then, instead of turning to humility, it turns to a fulness of pride. We, humanity, through our own efforts, will solve all our problems, and become the grabby aliens we hope to be.
There is some caution before the night, learning to accept things we cannot change, but this caution melts away before the imagined light of the Great Transhumanist Future.
AI X-Risk and AI Transhumanism
Existential Risk
A major cause for concern leading into the night was existential risk from AI: the chance that future artificial intelligence systems might kill everyone. This was talked about more than any other problem.
I expect that the organizers and speakers of Solstice are significantly more doomy than the audience.[3] The audience itself probably has selection effects that make it more doomy than AI researchers, or forecasters, or other groups of people who have thought about this possibility.
It is often the case that people's beliefs are determined more by what is normal for the people around them to believe than by personally considering the relevant arguments and evidence. This is a problem for intellectual communities, and should be countered by encouraging each person to know for themselves whether these beliefs are true. Organizers and speakers at Solstice have an unusually large power to establish what is normal to believe in the rationalist community.
They promoted increased concern about AI x-risk in the community, not by arguing for this belief but by treating it as common knowledge.[4] Maybe they believe that this is justified, but it felt to me like a Dark Art of Persuasion.
Transhumanism
Solstice also promoted the Great Transhumanist Future. What exactly this involves was perhaps intentionally left vague, and mostly described in song. It involved a coder dismantling the sun, making branches of your presumably-uploaded self, streams of data across the galaxy, and computronium. This is not just transhumanism: it's AI-centered transhumanism.
There were also some parts of the transhumanism which were not explicitly computational: things like space colonization or human immortality. But overall, it felt like the route to the hoped-for future ran through powerful AI.
This is ... not the future I hope for. I am probably more futuristic than most of the public, and am...