This is a linkpost for https://sashachapin.substack.com/p/your-intelligent-conscientious-in

I really liked the piece. It resonated with my experiences in EA. I don't know that I agree with the mechanisms Sasha proposes, but I buy a lot of the observations they're meant to explain. 

I asked Sasha for his permission to post this (and heavily quote it). He said that he hopes it comes off as more than a criticism of EA/rationality specifically--it's more a "general nerd social patterns" thing. I only quoted parts very related to EA, which doesn't help assuage his worry :( 

There's more behind the link :) 


So, I’ve noticed that a significant number of my friends in the Rationalist and Effective Altruist communities seem to stumble into pits of despair, generally when they structure their lives too rigidly around the in-group’s principles. Like, the Rationalists become miserable by trying to govern their entire lives through nothing but rationality, and the EAs feel bad by holding themselves to an impossible standard of ethics. [...]

This is a real shame, because these are some of the most productive, original, intelligent, charming, strange people in my world. When they’re depressed, they do their important work less effectively. And when they burn out, it is a major loss. And the burnouts can be dramatic—like, quit your life, do nothing for years, have no grounding principles whatsoever, eventually hallucinogen yourself back into self-acceptance. (That’s a reasonable way to spend time, but there’s probably a healthier middle way that doesn’t involve total personal collapse.) [...]

Today I realized that it’s generally much simpler than I thought previously. Most of it is just toxic social norms. These groups develop toxic social norms. In the Rationalist community, one toxic norm is something like, “you must reject beliefs that you can’t justify, sentiments that don’t seem rational, and woo things.” In the EA community, one toxic norm is something like, “don’t ever indulge in Epicurean style, and never, ever stop thinking about your impact on the world.”

Generally, toxic social norms don’t develop intentionally, nobody wants them to happen, they’re not written down, and nobody enforces them explicitly. (The intentional development of toxic social norms is otherwise known as founding a cult.) What happens is that there are positive social norms, like, “talking about epistemics and being curious about beliefs is cool,” or “try to be intentional about the positive impact you can have on the world.” These norms are great! But then, the group acts like a group, which is to say, people confer status depending on level of apparent adherence to values. This leads insecure people who completely depend on the group to over-identify with the set of values, to the extent that even slightly contrary actions become forbidden. Not forbidden in the like “we’ll arrest you” way, but in the like “everyone in the room immediately looks at you like you’re being rude if you talk about spirituality” way. 

And then the second, more sinister stage occurs—the point at which these toxic norms are internalized such that they apply to you when you’re in a room alone. As Wittgenstein noted, it’s hard to tell where aesthetics end and ethics begin; it can start to feel unethical, like, dirty, to perform behaviors your peers would think distasteful. Toxic norms eventually pervade completely, to the point where you don’t even want to think bad thoughts. 

Sometimes—often—these forbidden thoughts/actions aren’t even contrary to the explicit values. They just don’t fit in with the implied group aesthetic, which is often a much stricter, more menacing guideline, all the more so because it’s a collective unwritten fiction. “Rationality is cool” becomes “rationality is the best framework” becomes “Rationalist and Rationalist-flavored stuff is a better use of your time than anything else” becomes “it’s uncool if you want to spend a lot of time doing stuff that has nothing to do with testable beliefs, or our favorite issues.” This is all unintentional and implicit. No Rationalist has ever said, to my knowledge, that you shouldn’t write poetry, but a few Rationalists have told me that they feel like they shouldn’t make weird art because it’s dumb and un-Rationalist to do so—they feel they ought to produce useful thoughts instead, even though their hearts are trying to steer them somewhere else. I point out to them that Scott Alexander wrote a fantasy novel for fun, but somehow this isn’t persuasive enough.

Here, I should probably stop and define toxic norms. I think a toxic norm is any rule where following it makes you feel like large parts of you are bad. The EA version is thinking that you’re evil if your soul/body/emotions are crying out for you to relax, slack off a bit, and spend money on yourself, because you ought to be spending every possible moment working on human flourishing. I’ve heard tales of people struggling with their decision to buy a tasty snack rather than donate $5 to charity, and, more worryingly, people feeling guilty that they want to have children, since that would distract them from the work of Improving Humanity. This obviously leads to burnout and self-loathing. Meanwhile, the Rationalist version is thinking that you’re stupid and not worth talking to if you yearn for the spiritual/aesthetic/woo/non-justifiable, or if you haven’t been able to come to grips with your issues through rational means. This leads to emotional damage being ignored, intuition being dismissed, and systematizing being preferred inappropriately above all other modes of thinking and feeling.

One sign of toxic social norms is if your behavior does deviate from the standard, you feel that the only way of saving face is through explaining your behavior via the group values. Like, if you watch the Bachelor all the time, and one of your smart peers finds out about that, you might find yourself hastily explaining that the series is enjoyable to you as an applied experiment in evolutionary psychology, when, in fact, you just like social drama because watching humans freak out is fun. I will never forget hearing a Rationalist friend ask a non-Rationalist friend whether he loved riding motorcycles because it was an experiment in social status, rather than, y’know, vroom vroom fun thing go fast.

I’m not mentioning these communities because I think they’re extra toxic or anything, by the way. They’re probably less toxic than the average group, and a lot of their principles are great. Any set of principles, if followed too strictly and assigned too much social value, can become a weird evil super-ego that creeps into the deepest crevices of your psyche. (One must imagine the serene Yogi seized with crippling shame over their perfectly normal road rage.) These groups are just the ones I’m most familiar with right now, and thus the places where I see these patterns most often. [...]

Also, these norms aren’t toxic for everyone! There are a few people who are, in fact, happiest when they’re entirely, or almost entirely, devoted to the fancy intellectual principles of a specialized group. But this is not most people. And this can actually compound the problem! If there are people in the group who are perfect examples of the desired behavior, they can be positive exemplars, but also negative exemplars—constant reminders that you are falling short. (Also, certain group leaders can quietly, and probably unintentionally, inflect the norms in a subtle way, thus accentuating the degree to which they are seen as exemplary, and the degree to which others are seen as inferior.)

This is, perhaps, an inevitable danger for nerdy people. For lots of intellectual weird people that don’t fit in, their first social stage is rejection from society in general, and then, later on, their second social stage is finding understanding in a tightly-knit subculture. And they cling to this subculture like a life-raft and are willing—happy, even—to initially reject any parts of themselves that don’t fit within this new community. And their new peers, unintentionally, facilitate this rejection. They don’t feel that this is toxic, because they feel like they’ve already seen what social toxicity is: it’s the prime normie directive that we learn in school: don’t be weird, ever. [...]

And the people being deferred to—the senior members of the group—don’t want this dynamic at all, but they don’t necessarily notice that it’s happening, because the outward manifestation of this is people being really impressed by you. Like, if you’re big in the EA scene, and a young freshly minted EA can’t stop talking about how excited they are to do good, and how inspired they are by your virtuousness, there’s maybe no obvious sign that they’ve started rejecting every part of themself that is not congruent to this new identity. You would have no reason to worry about that. You would probably just feel good, and glad that your principles are so convincing. So it’s hard to even see this issue sometimes, let alone figure out how to solve it. (Although I’ve heard from Rationalist luminary Aella that some Rationalists are, indeed, beginning to take it seriously, which is great.)

I don’t know whether all of this can be avoided entirely. Part of it is just growing up. It’s regular Kegan Stage 4 stuff. You conceive of who you are by seeing the world through some epistemic/moral lens, usually the one relied upon by the group who abuses you least. Eventually, you notice the flaws in that lens, and then you become your own thing, clearly related to the group, but distinct from it, not easily captured by any label or list of properties. 

Comments

you must reject beliefs that you can’t justify, sentiments that don’t seem rational, and woo things.

This isn’t a toxic social norm. This is the point of rationality, is it not?

Ah. In one sense, a core part of rationality is indeed rejecting beliefs you can't justify. Similarly, a core part of EA is thinking carefully about your impact. However, I think one claim you could make here is that naively, intensely optimising these things will not actually win (e.g. lead to the formation of accurate beliefs; save the world). Specifically:

  • Rationality: often a deep integration with your feelings is required to form accurate beliefs--paying attention to a note of confusion, or something you can't explain in rational terms yet. Indeed, sometimes it is harmful to impose "rationality" constraints too early, because you will tend to lose the information that can't immediately comply with those constraints. Another example is defying social norms because one cannot justify them, only to later realise that they served some important function.
  • EA: relentlessly optimising your impact leads to burnout and depression, which undermine the very impact you were optimising for.

Sure, but that isn’t what the quoted text is saying. Trusting your gut or following social norms are not even on the same level as woo, or adopting beliefs with no justification.

If the harmful social norms Sasha actually had in mind were "don't trust your gut" and "violate social norms even when there's no gain", then I'd agree these behaviors are bad, and possibly a result of social norms in the rationality community. Another alternative is that the community's made up of a bunch of socially awkward nerds, who are known for their social ineptness and inability to trust their gut.

But as it stands, this doesn’t seem to be what’s being argued, as the quoted text is tangential to what you said at best.

There are a few different ways of interpreting the quote, but there's a concept of public positions and private guts. Public positions are ones that you can justify in public if pressed on, while private guts are illegible intuitions you hold which may nonetheless be correct - e.g. an expert mathematician may have a strong intuition that a particular proof or claim is correct, which they will then eventually translate to a publicly-verifiable proof. 

As far as I can tell, lizards probably don’t have public positions, but they probably do have private guts. That suggests those guts are good for predicting things about the world and achieving desirable world states, as well as being one of the channels by which the desirability of world states is communicated inside a mind. It seems related to many sorts of ‘embodied knowledge’, like how to walk, which is not understood from first principles or in an abstract way, or habits, like adjective order in English. A neural network that ‘knows’ how to classify images of cats, but doesn’t know how it knows (or is ‘uninterpretable’), seems like an example of this. “Why is this image a cat?” -> “Well, because when you do lots of multiplication and addition and nonlinear transforms on pixel intensities, it ends up having a higher cat-number than dog-number.” This seems similar to gut senses that are difficult to articulate; “why do you think the election will go this way instead of that way?” -> “Well, because when you do lots of multiplication and addition and nonlinear transforms on environmental facts, it ends up having a higher A-number than B-number.” Private guts also seem to capture a category of amorphous visions; a startup can rarely write a formal proof that their project will succeed (generally, if they could, the company would already exist). The postrigorous mathematician’s hunch falls into this category, which I’ll elaborate on later.

As another example, in the recent dialogue on AGI alignment, Yudkowsky frequently referenced having strong intuitions about how minds work that come from studying specific things in detail (and from having "done the homework"), but which he does not know how to straightforwardly translate into a publicly justifiable argument.

Private guts are very important and arguably the thing that mostly guides people's behavior, but they are often also ones that the person can't justify. If a person felt like they should reject any beliefs they couldn't justify, they would quickly become incapable of doing anything at all.

Separately, there are also lots of different claims that seem (or even are) irrational but are pointing to true facts about the world.

If this is what the line was saying, I'd agree. But it's not: having intuitions plus a track record (or some other reason to believe those intuitions correlate with reality), or holding models of the world that are known to be false but still useful, is a far cry from having unjustified beliefs & believing in woo; and rejecting the latter is what the post actually claims is the toxic social norm in rationality.

What makes you think it isn't? To me it seems both like a reasonable interpretation of the quote (private guts are precisely the kinds of positions you can't necessarily justify, and it's talking about having beliefs you can't justify) and like a dynamic that I recognize as having occasionally been present in the community. Fortunately posts like the one about private guts have helped push back against it.

Even if this interpretation wasn't actually the author's intent, choosing to steelman the claim in that way turns the essay into a pretty solid one, so we might as well engage with the strongest interpretation of it.

What makes you think it isn't? To me it seems both like a reasonable interpretation of the quote (private guts are precisely the kinds of positions you can't necessarily justify, and it's talking about having beliefs you can't justify) and like a dynamic that I recognize as having occasionally been present in the community.

Because it also mentions woo, so I think it's talking about a broader class of unjustified beliefs than you think.

Even if this interpretation wasn't actually the author's intent, choosing to steelman the claim in that way turns the essay into a pretty solid one, so we might as well engage with the strongest interpretation of it.

I agree, but in that case you should make it clear how your interpretation differs from the author's. If you don't, then it looks like a motte-and-bailey is happening (where the bailey is "rationalists should be more accepting of woo & other unjustified beliefs", and the motte is "oh no! I/they really just mean you shouldn't completely ignore gut judgements, and occasionally models can be wrong in known ways but still useful"), or you may miss out on reasons the post-as-is doesn't require your reformulation to be correct.

Because it also mentions woo, so I think it's talking about a broader class of unjustified beliefs than you think.

My earlier comment mentioned that "there are also lots of different claims that seem (or even are) irrational but are pointing to true facts about the world." That was intended to touch upon "woo"; e.g. meditation used to be, and to some extent still is, considered "woo", but there nonetheless seem to be reasonable grounds to think that there's something of value to be found in meditation (despite there also being various crazy claims around it).

My above link mentions a few other examples (out-of-body experiences, folk traditions, "Ki" in martial arts) that have claims around them that are false if taken as the literal truth, but are still pointing to some true aspect of the world. Notably, a policy of "reject all woo things" could easily be taken to imply rejecting all such things as superstition that's not worth looking at, thus missing out on the parts of the woo that were actually valuable.

IME, the more I look into them, the more I come to find that "woo" things that I'd previously rejected as not worth looking at because of them being obviously woo and false, are actually pointing to significantly valuable things. (Even if there is also quite a lot of nonsense floating around those same topics.)

I agree, but in that case you should make it clear how your interpretation differs from the author's. 

That's fair.

I work for CEA, but everything here is my personal opinion.

"General nerd social patterns" sounds right to me.

I've seen a lower proportion of people in "pits of despair" in EA than in other demographically similar communities I'm familiar with (social activists, serious gamers). The ways in which people choose to evaluate themselves differ, but there are always many people who feel inadequate relative to their own standards (whether that's about upholding social justice norms or maintaining a certain winrate on the ladder). I think Sasha is describing the human condition (or at least the human condition among people who care a lot about certain social groups — "nerds", I guess?) more than anything inherent to EA/rationality. (And it sounds like Sasha would agree.)

This kind of observation about any group also suffers from... observer effects? (Not sure that's the right term to use.) There is a phenomenon where the most visible people in a group are often the most involved, and are more likely to experience personal difficulties that drive extreme levels of involvement. Another term people use for this is "very online".

Having worked with lots and lots of people across lots and lots of EA orgs, I rarely see people who seem to be in "work until you drop" mode, compared to the more standard pattern of "work pretty hard for a pretty standard number of hours, mostly relax when not working". People at CEA get married, have kids, and take vacations at rates that seem normal to me.

(Obvious disclaimer: It's not always obvious when people are doing the work-until-you-drop thing. But in many cases, I'm also real-world or Facebook friends with these people and see them vacationing, partying, reading novels, sharing memes, and otherwise "indulging" in normal stuff.)

However, when I think of the people I know mostly from the EA internet — people whose main engagement with the community comes from engagement on social media or the Forum — I see them express these kinds of feelings far more often. 

This makes sense — once you have a day job (and maybe a family), your focus is mostly on a few specific things, rather than "having as much impact as possible, generally". You're also able to accomplish a lot by just doing your job reasonably well (or helping to shepherd new life into the world!). By comparison, when EA feels like the biggest thing in your life and there's no clear "part of it" for which you are responsible, it's easier to feel like you should be doing everything, and harder to heed the messages about work/life balance that get shared in EA groups, at EA Global, etc. 

Another way to put it is that people feel more comfortable once they have a source of status, either within EA or outside of it. (Being a parent seems very high-status in the sense that most people will compliment you for doing it, gladly talk about parenting with you, etc. — plus you get to actually help another person every single day in a highly visible way, which is good for the soul.)

The result: to people who see the EA community mostly through its online presence, EA looks like it has a high proportion of people who seem burnt-out or unhappy, even if the on-the-ground reality is that most people are living emotionally ordinary lives.

The practical takeaways:

  • It's good to acknowledge that unhealthy norms do exist in some corners of EA, but I worry that confusing "what EA looks like online" with "how EA actually is on a population level" might lead us to throw too many resources or too much concern at problems that aren't as widespread as they appear. (This isn't to say that we've reached that point yet, or anything close to it, but I sometimes see takes like "EA should focus on taking care of burnt-out community members as an urgent priority" that seem to overstate how common this is.)
  • I'm also worried  that people who want to get more involved with EA will assume that their risk of being pulled into a pit of despair is much higher than it actually is, or think of the community as being a collection of sad, burnt-out husks (rather than what it actually looks like to me — a bunch of people who span the full spectrum of emotional states, but who seem somewhat happier and calmer than average).

more standard pattern of "work pretty hard for a pretty standard number of hours, mostly relax when not working". People at CEA get married, have kids, and take vacations at rates that seem normal to me.

(Obvious disclaimer: It's not always obvious when people are doing the work-until-you-drop thing. But in many cases, I'm also real-world or Facebook friends with these people and see them vacationing, partying, reading novels, sharing memes, and otherwise "indulging" in normal stuff.)

For what it's worth, my current impression is that "working a pretty standard number of hours (35-45?)" may well be the norm at CEA (and for that matter Rethink Priorities), but is not necessarily the norm at EA orgs overall.

From very close contact/intuitive tracking (e.g., people I've lived with during the pandemic who WFH), working >45h is the default for people I know at CHAI, Ought, Open Phil, and Redwood Research, and definitely at Alameda/FTX. I also believe this is true (with lower confidence) for Lightcone and for paid student organizers at places like SERI.

Interestingly enough, the non-EA STEM grad students at top universities who I can personally observe  do not seem to work more than the norm, and I don't have a strong sense of whether this is because my sample (n=3?) is skewed or too high-variance or because the polls are skewed.

I vaguely share your impressions that EA org people who are active socially/online (including myself) may on average be less hardworking, but I think there's a pretty intuitive story for the relevant self-selection biases.

I vaguely share your impressions that EA org people who are active socially/online (including myself) may on average be less hardworking

That seems like the opposite of my impression. My impression is that the majority of people in EA positions who are less active online are more likely to have normal work schedules, while the people who spend the most time online are those who also spend the most time doing what they think of as "EA work" (sometimes they're just really into their jobs, sometimes they don't have a formal job but spend a lot of time just interacting in various often-effortful ways).

Thanks for sharing your impression of people you know — if you live with a bunch of people who have these jobs, you're in a better position to estimate work time than I am (not counting CEA). When you say "working >45h", do you mean "the work they do actually requires >45 hours of focused time", or "they spend >45 hours/week in 'work mode', even if some of that time is spent on breaks, conversation, idle Forum browsing, etc."?

That seems like the opposite of my impression. My impression is that the majority of people in EA positions who are less active online are more likely to have normal work schedules, while the people who spend the most time online are those who also spend the most time doing what they think of as "EA work"

Sorry, to be clear, here's my perspective: If you only observe EA culture from online interactions, you get the impression that EAs think about effective altruism much more regularly than they actually do. This extends to activities like spending lots of time "doing EA-adjacent things": the Forum, EA social media, casual reading, EA conversations, etc. That reference class includes many people who are volunteering their time, or who find thinking about EA relaxing compared to their stressful day jobs.

However, if we're referring to actual amounts/hours of work done on core EA topics in their day job, EA org employees who are less active online will put in more hours towards their jobs compared to EA org employees who are more active online.

When you say "working >45h", do you mean "the work they do actually requires >45 hours of focused time", or "they spend >45 hours/week in 'work mode', even if some of that time is spent on breaks, conversation, idle Forum browsing, etc."?

They spend >>45h on their laptops or other computing devices, and (unlike with mine) if I glance over at their screens during the day, it's almost always a work-related thing: a work videocall, GitHub, a command line, Google Docs, etc. 

A lot of the work isn't as focused, e.g., lots of calls, management, screening applications, sys admin stuff, taking classes, etc.

My guess is that very few people I personally know spend >45h doing deep focused work in research or writing. I think this is a bit more common in programming, and a lot more common in trading. I think for the vast majority of people, including most people in EA, it's both psychologically and organizationally very hard to do deep focused work for anywhere near that long. Nor do I think people necessarily should even if they could: often a 15-60 minute chat with the relevant person can clarify thoughts that would otherwise take a day, or much longer, to crystallize. 

But they're still doing work for that long, and if you mandate that they can only be on their work computers for 45h, I'd expect noticeable dips in productivity. 

Re:

the work they do actually requires [emphasis mine] >45 hours

Not sure what you mean by "requires." EA orgs by and large don't clock you, and there's pretty high individual variance in productivity. A lot of the willingness to work is self-imposed. I don't think this is abnormal. I think most people will have less output if they work a lot less, though of course there's large individual variance here and I can imagine negative marginal returns for some people. I can also buy that some/most people, even in that reference class, aren't very good at time management and could theoretically produce more output in many fewer hours. But this is far from easy to do in practice. These are often very smart, dedicated, conscientious people.

If you only observe EA culture from online interactions, you get the impression that EAs think about effective altruism much more regularly than they actually do. This extends to activities like spending lots of time "doing EA-adjacent things": the Forum, EA social media, casual reading, EA conversations, etc. That reference class includes many people who are volunteering their time, or who find thinking about EA relaxing compared to their stressful day jobs.

I agree.

However, if we're referring to actual amounts/hours of work done on core EA topics in their day job, EA org employees who are less active online will put in more hours towards their jobs compared to EA org employees who are more active online.

I don't have an opinion about this either way. My argument was about people who were org employees vs. interested in doing EA work but not actually employees of EA orgs (the latter being the group somewhat more likely to talk online about feeling bad in the ways Sasha described).

By comparison, when EA feels like the biggest thing in your life and there's no clear "part of it" for which you are responsible, it's easier to feel like you should be doing everything, and harder to heed the messages about work/life balance...

"No clear part of it" = "no one job that belongs to you, so you may feel vaguely like you should be contributing to everything you see, or doing everything possible to figure out a career path until you find one".

"Requires" was imprecise language on my part — I just meant "they are actually working 45 hours, rather than just completing <45 hours of work in what looks like 45 hours." Your response satisfies me w/r/t people seeming to have more than a 45-hour workweek of "actual/required work".

Nice. What you wrote accords with my experience. In my own case, my relationship to EA changed quite substantially--and in the way you describe--when I transitioned from being very online to being part of an in-person community.

I definitely feel this as a student. I care a lot about my impact, and I know intellectually that being really good at being a student is the best thing I can do for long-term impact. Emotionally, though, I find it hard knowing that the way I'm having my impact is so nebulous and also doesn't take very much work to do well. 

I agree that a good number of people around EA trend towards sadness (or maybe "pits of despair"). It's plausible to me that the proportion of the community in this group is somewhat higher than average, but I'm not sure about that. If that is the case, though, then my guess is that some selection effects, rampant Imposter Syndrome, and the weight of always thinking about ways the world is messed up are more important causes than social norms. 

I have to say, I actually chuckled when I read "don’t ever indulge in Epicurean style" listed as an iron-clad EA norm. That, uhh, doesn't match my experience.


I really appreciated this post; it came at a good time for me. 

Most of it is just toxic social norms. These groups develop toxic social norms.

Idk how much I buy that these are "norms". I think most people who have been around for a while would strongly disagree with the notion that one should never stop thinking about their impact, for example. Yet it seems that enough (young) people have some relationship to these "norms" that they're worth talking about.

Here, I should probably stop and define toxic norms. I think a toxic norm is any rule where following it makes you feel like large parts of you are bad.

I talk about this a bit in a post I wrote recently on self-love. I think self-love is a much bigger deal than most people expect, and it can solve a lot of the problems discussed in this piece (on an individual level). 

I also talk about my experiences with this here, in response to Howie's comment on my self-love post. 

I don't know that I agree with the mechanisms Sasha proposes, but I buy a lot of the observations they're meant to explain. 


In particular, I don't think conferring status was a big part of the problem for me. It was more internal. The more dangerous thing was taking my beliefs really seriously, only allowing myself to care about saving the world and ruthlessly (and very naively) optimising for that, even to the detriment of things like my mental health.
