
I'm curious[1] whether people observe any differences of opinion between EAs and rationalists that seem to come up a bit too often to be random. I think I've noticed at least three - my impression is that, compared to EAs, rationalists on the whole:

  1. Care less about the welfare of nonhumans
  2. Have more faith in unvarnished honesty as a mechanism for collective truth-seeking
  3. Are more open-minded about weird or controversial ideas

I'd love to hear if anyone disagrees or has other observations.[2]

 

  1. ^

    I expect that if I'd had a better sense of these kinds of differences earlier on, it would have significantly affected my confidence in certain beliefs, the way I defer, and the degree to which I engaged with each community.

  2. ^

    Be nice! I'd like this to be about noticing differences, not judging them.

Answers

Philosophically, I think rationalists relatively:

  • Tend to be more focused on complexity of value; hedonic utilitarianism is rarer among them
    • This is reflected in both differing visions of the future, and being less drawn to animal welfare.
  • Tend to have a more continuous + computational view of personal identity
    • This is reflected in, e.g., interest in anti-aging and cryonics
  • Are less likely to be causal decision theorists and/or think on the margin

Sociologically, I think they:

  • Tend to be more neurodiverse (particularly on the autism spectrum)
    • Also have a greater incidence of mental health issues
  • Have greater class diversity (especially when you look at the current ~20-year-olds in EA, who seem to come almost uniformly from elite colleges, at least in the US)
  • Are more male
  • Are more white
  • Rely more on first-principles thinking and verbal logic, less on empirical data and fast BOTECs
  • Are less deferential
  • Are less prestige-conscious
  • Are more intelligence-conscious 

These are excellent, cheers!

Although I would have said "less prestige-seeking; more prestige-conscious" (as in, they talk about it a lot, but tend to be quite scrupulous about not seeking it for themselves and are sceptical of its usefulness)

The Rationality Community has a far lower focus on morality, and has members who are amoral or completely selfish. I'll go out on a limb and claim that they also have a broader set of interests, since there is less of a restriction on where attention can be focused (EA wants to do good; the rationality community is interested in truth, and truth can be found about basically anything).

Purely from online observation (and I'll admit I'm biased towards the EA side of things here):

  1. EAs are less reliant on "the Sequences" and other pop-science blog posts when supporting their claims
  2. EA tends to be more deferential to expert opinion and academic research when evaluating claims.
  3. Rationalists tend to have higher estimates of the probability of AI doom.
  4. EA tends to be more sympathetic to social justice issues. 

Any rationalists around who could answer this question? Crosspost to LessWrong?

I consider myself more culturally rationalist than EA; my (short) answer is above. The real answer is 10k words and probably not worth the effort relative to the insight/importance.

NickLaing: Thanks, good to know. Love the rationalist answer :)
Comments

I'm interested to hear why you're asking this question. How would this affect your confidence in certain beliefs and the way you defer?

I've become much more familiar with EA; historically, I'd considered the two communities to be similarly rational, and I thought they were generally a lot more similar in their beliefs than I do now.

So when I learn of a difference of opinion, I update my outside view and the extent to which I consider people the relevant experts. E.g., when I learn that Eliezer thinks pigs aren't morally relevant because they're not self-aware, I lose a bit of confidence in my belief that pigs are morally relevant, and I become a bit less confident that any alignment 'solutions' coming from the rationalist community would capture the bulk of what I care about.
