The Forum is getting swamped with discussions about Bostrom's email and apology, so we're making this thread a single place to discuss the topic.
All other posts on this topic will be marked as “Personal Blog” — people who opt in or have opted into seeing “Personal Blog” posts will see them on the Frontpage, but others won’t; they’ll see them only in Recent Discussion or in All Posts. (If you want to change your "Personal Blog" setting, you can do that by following the instructions here.)
(Please also feel free to give us feedback on this thread and approach. This is the first time we’ve tried this in response to events that dominate Forum discussion. You can give feedback by commenting on the thread, or by reaching out to forum@effectivealtruism.org.)
Please also note that over the past week we have seen an influx of people creating accounts to vote and comment, and we are aware that people who feel strongly about human biodiversity sometimes vote brigade on sites where the topic is being discussed. Voting and discussion on some topics may therefore not be representative of the normal EA Forum user base.
If you choose to participate in this discussion, please remember Forum norms. Chiefly,
- Be kind.
- Stay civil, at the minimum. Don’t sneer or be snarky. In general, assume good faith. We may delete unnecessary rudeness and issue warnings or bans for it.
- Substantive disagreements are fine and expected. Disagreements help us find the truth and are part of healthy communication.
Please try to remember that most people on the Forum are here for collaborative discussions about doing good.
I would have to think more on this to have a super confident reply. See also my point in response to Geoffrey Miller elsewhere here; there are lots of considerations at play.
One view I hold, though, is something like: "the optimal amount of self-censorship, by which I mean not always saying things that you think are true/useful, in part because you're considering the [personal/community-level] social implications thereof, is non-zero." We can of course disagree on the precise amount and contexts for this, and sometimes it can go too far. And by definition, in all such cases you will think you are right and others wrong, so there is a cost. But I don't think it is automatically or definitionally bad for people to do that to some extent. Indeed, much of the progress on issues like civil rights and gay rights in the US has resulted in large part from actions getting ahead of beliefs among people who didn't "get it" yet, with cultural/ideological change gradually following via generational replacement, pop culture changes, etc. Obviously people rarely think that they are in the wrong, but it's hard to be sure, and I don't think we [the world, EA] should be aiming for a culture where there are never repercussions for expressing beliefs that, in the speaker's view, are true. Again, that's consistent with people disagreeing about particular cases; I'm just sharing my general view here.
This shouldn't only work in one ideological "direction," of course, which may be a crux in how people react to the above. Some may see the philosophy above as (exclusively) an endorsement of wokism/cancel culture etc. in its entirety/current form [insofar as that is a coherent thing, which I'm not sure it is]. While I am probably less averse to some of those things than some LW/EAF readers, especially on the rationalist side, I also think that people should remember that restraint can be positive in many contexts. For example, in my efforts to engage and in my social media activities lately, I am trying to be careful to be respectful to people who identify strongly with the communities I am critiquing, and have held back some spicy jokes (e.g. playing on the "I like this statement and think it is true" line, which just begs for memes), precisely because I want to avoid alienating people who might be receptive to the object-level points I'm making, and because I don't want to unduly egg on critiques by other folks on social media who I think sometimes go too far in attacking EAs.