John Halstead - Independent researcher. Formerly Research Fellow at the Forethought Foundation; Head of Applied Research at Founders Pledge; and researcher at Centre for Effective Altruism. DPhil in political philosophy from Oxford
Do the votes mean that it would be undemocratic to impose democratic rule?
Thanks for the detailed response.
I agree that we don't want EA to be distinctive just for the sake of it. My view is that many of the elements of EA that make it distinctive have good reasons behind them. I agree that some changes in the governance of EA orgs, moving more in the direction of standard organisational governance, would be good, though I think they would probably be quite different from what you propose and certainly wouldn't be 'democratic' in any meaningful sense.
"Finally, we are not sure why you are so keen to repeatedly apply the term “left wing environmentalism”. Few of us identify with this label, and the vast majority of our claims are unrelated to it." Judging by the comments, I'm not the only one who picked up on this vibe. How many of the authors identify as right wing? In the post, you endorse a range of ideas associated with the left, including: an emphasis on identity diversity; climate change and biodiversity loss as the primary risk to humanity; postcolonial theory; Marxist philosophy and its offshoots; postmodernist philosophy and related ideas; the democratisation of funding decisions; and finally the need for EA to have more left wing people, which I take it was the implication of your response to my comment.
If you had spent the post talking about free markets, economic growth and admonishing the woke, I think people would have taken away a different message, but you didn't do that because I doubt you believe it. I think it is important to be clear and transparent about what your main aims are. As I have explained, I don't think you actually endorse some of the meta-level epistemic positions that you defend in the article. Even though the median EA is left wing, you don't want more right wing people. At bottom, I think what you are arguing for is for EA to take on a substantive left wing environmentalist position. One of the things that I like about EA is that it is focused on doing the most good without political bias. I worry that your proposals would destroy much of what makes EA good.
I see. I wasn't being provocative with my question; I just didn't get it.
You should probably take out the claim that FLI offered $100k to a neo-Nazi group, as it doesn't seem to be true.
I'm somewhat confused as to why this is controversial. Why is it news that FLI didn't make a grant to a far right org?
I appreciate you taking the effort to write this. However, like other commenters, I feel that if these proposals were implemented, EA would just become the same as many other left wing social movements, and, as far as I can tell, would be basically indistinguishable from standard forms of left wing environmentalism, which are already a live option for people with this type of outlook and get far more resources than EA ever has. I also think many of the proposals here have been rejected for good reason, and that some of the key arguments are weak.
Overall, we need to learn hard lessons from the FTX debacle. But thus far, the collapse has mainly been used to argue for things that are completely unrelated to FTX, and to advance an agenda that has so far been disfavoured in EA, with good reason. For Cowen, this was neoliberal progress; here it is left wing environmentalism.
What do you make of the 'impatient philanthropy' argument? Do you think EAs should be borrowing to spend on AI safety?
The claim in the post (which I think is very good) is that we should have a pretty strong prior against anything which requires positing massive market inefficiency on any randomly selected proposition where there is lots of money on the table. This suggests that you should update away from very short timelines. There's no assumption that markets are a "mystical source of information", just that if you bet against them you almost always lose.
There's also a nice "put your money where your mouth is" takeaway from the post, which AFAIK few short-timelines people are acting on.
I strongly agree with a lot of your points here. To pick up on one strand you highlight, I think the fact that EA is very nerdy and lacking in 'street smarts' has been at the root of some (but not all) of the problems we've been seeing. I think it might be this, rather than an intellectual commitment to assuming good faith and tolerating weirdness, that is the main issue, though maybe the first causes the second. Specifically, EAs seem to have been pretty naive in dealing with bad actors over the last few years, and that persists to this day.
If the problem is a lack of street smarts, then we don't need to get into debates about being less weird: it's kind of unclear what 'weird' means, and hard to judge which margin of weirdness you'd want to move on, which makes general debates about weirdness difficult. But it's pretty clear that we need to be more street smart.