"Epistemic range of motion" and LessWrong moderation, published by habryka on November 28, 2023 on LessWrong.
(Context for the reader: Gabriel reached out to me a bit more than a year ago to ask me to delete a few comments on this post by Jacob Hilton, who was working at OpenAI at the time. I referenced this in my recent dialogue with Olivia, where I quoted an email I sent to Eliezer about having some concerns about Conjecture partially on the basis of that interaction. We ended up scheduling a dialogue to talk about that and related stuff.)
You were interested in a dialogue, probably somewhat downstream of my conversation with Olivia and also some of the recent advocacy work you've been doing.
Yup.
Two things I'd like to discuss:
I was surprised by you (on a recent call) stating that you found LessWrong to be a good place for the "Lying is Cowardice, not Strategy" post.
I think you misunderstand my culture. Especially around civility, and honesty.
Yeah, I am interested in both of the two things. I don't have a ton of context on the second one, so am curious about hearing a bit more.
Gabriel's principles for moderating spaces
About the second one:
I think people should be free to be honest in their private spaces.
I think people should be free to create their own spaces, enact their vision, and to the extent you participate in the space, you should help them.
If you invite someone to your place, you ought to not do things that would have caused them not to come if they knew ahead of time.
So, about my post and the OAI thing:
By 3, I feel ok writing my post on my blog. I feel ok with people dissing OAI on their blogs, and on their posts if you are ok with it (I take you as proxy for "person with vision for LW")
I feel much less ok about people dissing OAI on their own blog posts on LW. I assume that if they knew ahead of time, they would have been much less likely to participate.
I would have felt completely ok if you told me "I don't think your post has the tone required for LW, I want less adversariality / less bluntness / more charitability / more ingroupness"
How surprising are these to you?
Meta-comment: Would have been great to know that the thing with OAI shocked you enough to send a message to Eliezer about it.
Would have been much better from my point of view to talk about it publicly, and even have a dialogue/debate like this if you were already open to it.
If you were already open to it, I should have offered. (I might have offered, but can't remember.)
Ah, ok. Let me think about this a bit.
I have thoughts on the three principles you outline, but I think I get the rough gist of the kind of culture you are pointing to without needing to dive into that.
I think I don't understand the "don't do things that will make people regret they came" principle. Like, I can see how it's a nice thing to aspire to, but if you have someone submit a paper to a journal, and then the paper gets reviewed and rejected as shoddy, then like, they probably regret submitting to you, and this seems good.
Similarly, if I show up at a Jewish community gathering or something, and I wasn't fully aware of all of the rules and guidelines they follow and this makes me regret coming, then that's sad, but it surely wouldn't have been the right choice for them to break their rules and guidelines just because I was there.
You mention 'the paper gets reviewed and rejected', but I don't think the comments on the OAI post were much conditioned on the quality of the post...