All of Charlie Rogers-Smith's Comments + Replies

Nice. What you wrote accords with my experience. In my own case, my relationship to EA changed quite substantially--and in the way you describe--when I transitioned from being very online to being part of a community.

Ah. In one sense, a core part of rationality is indeed rejecting beliefs you can't justify. Similarly, a core part of EA is thinking carefully about your impact. However, I think one claim you could make here is that naively and intensely optimising these things will not actually win (e.g. lead you to form accurate beliefs, or to save the world). Specifically:

  • Rationality: often a deep integration with your feelings is required to form accurate beliefs--paying attention to a note of confusion, or something you can't explain in rational terms yet. Indeed, som…

Sure, but that isn't what the quoted text is saying. Trusting your gut or following social norms is not even on the same level as woo, or adopting beliefs with no justification.

If the harmful social norms Sasha actually had in mind were "don't trust your gut" and "violate social norms even when there's no gain", then I'd agree these behaviours are bad, and possibly a result of social norms in the rationality community. An alternative explanation is that the community's made up of a bunch of socially awkward nerds, who are known for their social ineptness and inability to trust their gut.

But as it stands, this doesn't seem to be what's being argued, as the quoted text is at best tangential to what you said.

I also talk about my experiences with this here, in response to Howie's comment on my self-love post. 

Here, I should probably stop and define toxic norms. I think a toxic norm is any rule where following it makes you feel like large parts of you are bad.

I talk about this a bit in a post I wrote recently on self-love. I think self-love is a much bigger deal than most people expect, and it can solve a lot of the problems discussed in this piece (on an individual level). 

Most of it is just toxic social norms. These groups develop toxic social norms.

Idk how much I buy that these are "norms". I think most people who have been around for a while would strongly disagree with the notion that one should never stop thinking about one's impact, for example. Yet it seems that enough (young) people have some relationship to these "norms" that they're worth talking about.

I don't know that I agree with the mechanisms Sasha proposes, but I buy a lot of the observations they're meant to explain.

In particular, I don't think conferring status was a big part of the problem for me. It was more internal. The more dangerous thing was taking my beliefs really seriously, only allowing myself to care about saving the world and ruthlessly (and very naively) optimising for that, even to the detriment of things like my mental health.

and your words motivate me to do this more again.

Yay!

but I think I would do well to internalize that spending time like this is well spent.

You can test it, maybe! It might not be better, but it might be really beneficial. Sometimes we need to escape into distractions, and sometimes it's nice to be with our pain, especially when we have the tools to comfort ourselves. Good luck!

Hi Richenda. Thanks for posting this; a discussion on the value of direct work is long overdue!

Two main things come to mind. One is a consideration about retaining people, and the other is about the choice of comparison class.

Retaining people - I agree with you that losing people is bad. A key consideration is which people you want to retain most. In A Model of an EA Group, I claim that:

Trying to get a few people all the way through the funnel is more important than getting every person to the next stage.

Since groups are time-constrained, they can only put…

Richenda
6y
Hi Charlie. Thanks for your reply. To be clear, I don't suggest universally prioritising direct work over other activities, only that direct work (given its benefits) should be considered in some circumstances. Typically, I would expect this to involve EA groups running a portfolio of activities which includes direct work opportunities alongside other activities.

In many cases, EA groups won't be so strictly bottlenecked by the sheer number of hours available to run activities, but rather by the interest of attendees (and event organisers), ideas for events, and so on. For example, there is likely a limit to the number of times that career workshops or 1-1 meetings can be repeated (especially in the case of small-to-medium groups), which may be reached before organisers run out of time or energy to run any more events. This is particularly so if different kinds of events would attract different organisers to run them and different attendees to attend them, and engage them in different ways. I would also anticipate diminishing returns on core activities, such that even if, for example, career workshops or 1-1s are the highest-impact activities (on average), on the margin additional, different activities may be more impactful (as well as complementary to these other activities).

That said, I'm happy to discuss the hypotheticals presented here. First, responding to your point that 'we should try to get a few people through the funnel': on the one hand, it is precisely my point that there are high-potential, high-talent individuals who won't go all the way through the funnel (or who will leave/regress/value-drift, despite having passed through the funnel) precisely because there aren't sufficiently engaging opportunities for them to get their teeth into. On the other hand, while I agree that it is plausible that in some or even the majority of cases a small number of high-impact individuals will deliver more value than a large group of lower-impact individuals, I am very wary of concluding…
Richenda
6y
Thanks Charlie. Just posting to say I've seen this and will respond more fully soon!