burner

423 karma · Joined Oct 2021

Comments (16)

I think it is very difficult to litigate point three further without putting certain people on trial and getting into their personal details, which I am not interested in doing and don't think is a good use of the Forum. For what it's worth, I haven't seen your Twitter or anything from you.

I should have emphasized more that there are consistent critics of EA who I don't think are acting in bad faith at all. Stuart Buck, for example, seems to have been right early on about a number of things.

Your Bayesian argument may apply in some cases but it fails in others (for instance, when X = EAs are eugenicists).

Just apply Bayes' rule: if P(events of the last week | X) > P(events of the last week | not-X), then you should increase your credence in X upon observing the events of the last week.
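As a minimal sketch of the update that argument relies on, write E for "the events of the last week" (notation introduced here only for illustration):

$$P(X \mid E) = \frac{P(E \mid X)\,P(X)}{P(E \mid X)\,P(X) + P(E \mid \neg X)\,\bigl(1 - P(X)\bigr)}$$

If P(E | X) > P(E | ¬X) and 0 < P(X) < 1, the denominator is smaller than P(E | X), so P(X | E) > P(X): observing E should raise your credence in X.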

I also emphasize that there are a few people who I have strong reason to believe are making a "deliberate effort to sow division within the EA movement," and this was the focus of my comment, publicly evidenced (NB: this is a very small part of my overall evidence) by them "taking glee in this disaster or mocking the appearances and personal writing of FTX/Alameda employees." I do not think a productive conversation is possible in these cases.

These areas all seem well-identified, but the essential problem is that EA doesn't have anywhere near sufficient talent for its top-priority causes as it is.

I don't think that the EA community profits itself by not including artists and those with skills that aren't squarely in the conventional Earning to Give purview.

I certainly wouldn't claim this. Obviously art, in general, is ex ante a very unpromising earning-to-give path. My suggestion is that we should encourage artists to use their skills in high-impact ways.

I don't buy the premise that this is not high EV through a combination of direct impact and promoting a model that is potentially high EV.

This implies a very weird model. Why would you think this is high EV? Presumably things are neutral to low EV unless proven otherwise via research? Nothing about "a combination of direct impact (??) and promoting a model" innately suggests high EV, which, recall, is a very high bar for career paths.

I'm really glad OP is excited to help out, but we should encourage them to consider whether they could do more good given their skill set. That is, after all, the point of EA. Many EA orgs need help with branding and aesthetics, for example. Maybe their skills would be a good fit.

I'm not suggesting he shouldn't advertise that he will donate profits. I'm suggesting he could do something more lucrative. 

I strongly suspect the reason EA is so obsessed with (read: dependent on) high-agency operations types is a lack of managerial talent.

I don't get it. Why don't you try to earn more money if you are going to give it away?

I have spent some time in and around Sandusky. I think you might be vastly overselling it in terms of general niceness and amenities. 

Firstly, I want to address why effective altruism, as I’ve stated elsewhere, “cannot singlehandedly meet the civil purpose of philanthropy.”

I think Nadia is misreading EA as a fundamentally philanthropic movement. EA is about maximizing the amount of good we do. Longtermism is about maximizing the EV of the future. Philanthropy is part of that, but far from the whole picture. Neither has made any claims about fulfilling the civil purpose of philanthropy, which I take to be something like libraries and children's hospitals. In their more extreme forms, EA and longtermism may claim that, on the margin, they are more important than those things, but not that they serve the same purpose.

I enjoyed the piece, but I do think it misses the mark in its comments on EA.
