

Thanks for writing this. Do you have any advice on getting a financial advisor? I've been wanting to hire one as a one-off to check I'm doing everything right. But not sure how to find a good person

Why is big tent EA an end in itself? The EA movement exists for the purpose of doing good, not for having a movement. If multiple smaller movements would be more effective at doing good, then we should prefer that.


Multiple groups make it easier to specialise and avoid having single points of failure. Though you lose some economies of scale and coordination benefits. 

I don't want to throw cold water on your enthusiasm. But I think you are underestimating the difficulty of getting anything potentially politically controversial published in China in the current climate and the potential downside risks of coming to the attention of the Chinese government in such an area. 

Given the recent crackdowns on NGOs and civil society in China, this would entail a very genuine risk of the related organisations being banned from operating in China, and could make the government more likely to suppress EA ideas in general. That's a very high risk to take for the low odds that a single book, which is itself unlikely to be published, meaningfully changes public opinion.

I've had similar feelings to Alice. Part of it is that group membership serves a role of signalling information about yourself to others. It's very different to describe yourself to others as an EA when the primary association with it is "slightly weird but well meaning group of charitable people" vs when it's "those weird crypto/eugenics people". And in the latter case you are better off labelling yourself as something else.

EA seems to have a bit of a "not invented here" problem, of not taking on board tried and tested mechanisms from other areas. E.g. the boring, standard conflict of interest and transparency mechanisms that are used by charitable organisations in developed countries.

Part of this seems to come from only accepting ideas framed in certain ways, and fitting cultural norms of existing members. (To frame it flippantly, if you proposed a decentralised blockchain based system for judging the competence of EA leaders you'd get lots of interest, but not if you suggested appointing non-EA external people to audit.)

There might be some value to posts taking existing good practices in other domains and presenting them in ways that are more palatable to the EA audience, though ideally you wouldn't need to.

My own observation has been that people are open to intellectual discussion (your discounting formula is off for x reasons) but not to more concrete practical criticism, or criticism that talks about specific individuals. 

I share this feeling. I feel like EA has trended in the direction of some other groups I've dealt with where the personalities and interpersonal issues of a small number of people at the top come to be overly dominant. 

I've also had my faith in the movement fractured a bit by seeing how much of how things were run seems to be based on friends of friends networks. I had naively assumed they were doing the kind of due diligence and institutional division of power that other charitable organisations do.

A lot of this isn't a particular specific set of issues; it's a general sense of one's estimates of people being shifted downward.

There's a general lack of competence in (and at times active disdain for) skills in PR and communications in EA. Which, for a movement that wants to convince people of things and attract members, seems problematic.

Yeah, a lot of the issues in EA are things I recognise from other fields that disproportionately hire academic high achievers straight out of college, who don't have much real world experience, and who overestimate the value of native intelligence over experience. But conveying the importance of that difference is difficult because, ironically, it's something you mostly learn from experience.

I agree that people shouldn't think that way, but observably they do. And acknowledging human irrationality and working around it was the founding insight of rationalism and EA. I honestly can't really respond to most of your first two paragraphs, since they seem to be based on the idea that we shouldn't even be considering the question.

I'm not saying truth doesn't matter (if it came across that way I apologise) but that reputational effects are real and also matter. Which is very different from the strawman position of "we shouldn't do anything at all odd or unpopular".

Truth matters, and the hearts and minds we want to win should heavily skew toward those who care about truth, and not just on what things look like to hypothetical third parties.

I disagree with this fundamentally. It's short-sighted to narrow down the people we want to persuade to only a certain set of people. The donations and other contributions of everyone are equally valuable. And the general perception of EA affects people's likelihood of learning more to begin with.

These are not hypothetical people either. This and FTX are the main stories people are discussing online in relation to EA, and therefore what comes up when people first search for it. And if someone's first impression is negative, they are less likely to find out more and more likely to dismiss the movement.

To narrow down our disagreement a bit: is your position a) this won't have reputational effects on EA, b) there will be reputational effects but they won't decrease recruitment and donations, or c) even if it does decrease recruitment and donations, we shouldn't care about that?
