Ah. In one sense, a core part of rationality is indeed rejecting beliefs you can't justify. Similarly, a core part of EA is thinking carefully about your impact. However, I think one claim you could make here is that naively, intensely optimising these things will not actually win (e.g. lead to the formation of accurate beliefs; save the world). Specifically:
Sure, but that isn’t what the quoted text is saying. Trusting your gut or following social norms are not even on the same level as woo, or adopting beliefs with no justification.
If the harmful social norms Sasha actually had in mind were things like not trusting your gut and violating social norms for no gain, then I’d agree these behaviours are bad, and possibly a result of norms in the rationality community. Another explanation is that the community is made up of a bunch of socially awkward nerds, a group known for its social ineptness and inability to trust its gut.
But as it stands, this doesn’t seem to be what’s being argued, as the quoted text is tangential to what you said at best.
Here, I should probably stop and define toxic norms. I think a toxic norm is any rule where following it makes you feel like large parts of you are bad.
I talk about this a bit in a post I wrote recently on self-love. I think self-love is a much bigger deal than most people expect, and it can solve a lot of the problems discussed in this piece (on an individual level).
Most of it is just toxic social norms; these groups develop them.
Idk how much I buy that these are "norms". I think most people who have been around for a while would strongly disagree with the notion that one should never stop thinking about their impact, for example. Yet it seems that enough (young) people have some relationship to these "norms" that they're worth talking about.
I don't know that I agree with the mechanisms Sasha proposes, but I buy a lot of the observations they're meant to explain.
In particular, I don't think conferring status was a big part of the problem for me. It was more internal. The more dangerous thing was taking my beliefs really seriously, only allowing myself to care about saving the world and ruthlessly (and very naively) optimising for that, even to the detriment of things like my mental health.
and your words motivate me to do more of this again.
Yay!
but I think I would do well to internalize that time spent like this is time well spent.
Maybe you can test it! It might not be better, but it might be really beneficial. Sometimes people need to escape into distractions, and sometimes it's nice to sit with our pain, especially when we have the tools to comfort ourselves. Good luck!
Hi Richenda. Thanks for posting this; a discussion on the value of direct work is long overdue!
Two main things come to mind. One is a consideration for retaining people, and the other on the choice of comparison class.
Retaining people - I agree with you that losing people is bad. A key consideration is which people you want to retain most. In A Model of an EA Group, I claim that:
Trying to get a few people all the way through the funnel is more important than getting every person to the next stage.
Since groups are time-constrained, they can only put ...
Nice. What you wrote accords with my experience. In my own personal case, my relationship to EA changed quite substantially, and in the way you describe, when I transitioned from being very online to being part of a community.