MaxDalton

I lead the Centre for Effective Altruism. I used to be a moderator here, and helped to launch the new version of the Forum in 2018.

Feel free to reach out if you think I might be able to help you. Follow the links to give (anonymous) feedback to me or CEA.

How I can help others

  • I might be able to help you talk through issues you're facing relating to management, and I can point you towards some good resources if you're new to management.
  • I might be able to fix issues you see in CEA's work.

Sequences

CEA Updates (Q3 2021)

Comments

"Big tent" effective altruism is very important (particularly right now)

Agree that echo chambers/dogmatism are also a major barrier to good epistemics!

"30% seems high by normal standards" - yep, I guess so. But I'm excited about things like GWWC trying to grow much faster than 30%, and I think that's possible.

Agree it's not fully within our control, and that we might not yet be hitting 30%. I think that if we're hitting >35% annual growth, I would begin to favour cutting back on certain sorts of outreach efforts or doing things like increasing the bar for EAG. I wouldn't want GW/GWWC to slow down, but I would want you to begin to point fewer people to EA (at least temporarily, so that we can manage the growth). [Off the cuff take, maybe I'd change my mind on further reflection.]

"Big tent" effective altruism is very important (particularly right now)

FWIW, I do think that I reacted to this a bit differently because it's Luke (who I've worked with, and who I view as a peer). I think I would have been more positive/had lower standards for a random community member.

"Big tent" effective altruism is very important (particularly right now)

Ah, I think I was actually a bit confused what the core proposition was, because of the different dimensions.

Here's what I think of your claims:

a) 100% agree, this is a very important consideration.

b) Agree that this is important. I think it's also very important to make sure that our shop fronts are accurate, and that we don't significantly distort the real work that we're doing (I expect you agree with this?).

c) I agree with this! Or at least, that's what I'm focused on and want more of. (And I'm also excited about people doing more cause-specific or community building to complement that/reach different audiences.)

So maybe I agree with your core thesis!

How easy is it to get big with evidence and reasoning?

I want to distinguish a few different worlds:

  1. We just do cause-specific community building, or action-specific community building.
  2. We do community building focused on "EA as a question" with several different causes. Our epistemics are decent but not amazing.
  3. We do community building focused on "EA as a question" with several different causes. We are aiming for the epistemics of core members to be world class (like probably better than the average on this Forum, around the level that I see at some core EA organizations).

I'm most excited about option 3. I think that the thing we're trying to do is really hard and it would be easy for us to cause harm if we don't think carefully enough.

And I think that we're currently just about at the level I'd like to see for option 3. As we grow, I naturally expect regression to the mean, because we're adding new people who have had less exposure to this type of thinking and may be less inclined to it. And also because I think that groups tend to reason less well as they get older and bigger. So I think that you want to be really careful about growth, and you can't grow that quickly with this approach.

I wonder if you mean something a bit more like 2? I'm not excited about that, but I agree that we could grow it much more quickly.

I'm personally not doing 1, but I'm excited about others trying it. I think that, at least for some causes, if you're doing 1 you can drop the epistemics/deep understanding requirements, and just have a lot of people coordinate around actions. E.g. I think that you could build a community of people who are earning to give for charities, and deferring to GiveWell, Open Philanthropy, and GWWC about where they give. I think that this thing could grow at >200%/year. (This is the thing that I'm most excited about GWWC being.) Similarly, I think you could make a movement focused on ending global poverty based on evidence and reasoning that grows pretty quickly - e.g. around lobbying governments to spend more on aid, and spend aid money more effectively. (I think that this approach basically doesn't work for pre-paradigmatic fields like AI safety, wild animal welfare, etc. though.)

"Big tent" effective altruism is very important (particularly right now)

Nice to see you on the Forum again! 

Thanks for sharing that perspective - that makes sense. Possibly I was holding this to too high a standard - I think that I held it to a higher standard partly because Luke is also an organization/community leader, and probably I shouldn't have taken that into account. Still, overall my best guess is that this post distracted from the conversation, rather than adding to it (though others clearly disagree). Roughly, I think that the data points/perspectives were important but not particularly novel, and that the conflation of different questions could lead to people coming away more confused, or to making inaccurate inferences. But I agree that this is a pretty high standard, and maybe I should just comment in circumstances like this.

I also think I should have been more careful about seeming to discourage suggestions about EA. I wanted to signal "this particular set of suggestions seems muddled", not "suggestions are bad", but I definitely see how my post above could make people feel more hesitant to share suggestions, and that seems like a mistake on my part. To be clear: I would love feedback and suggestions!

"Big tent" effective altruism is very important (particularly right now)

I'm sorry that it came off as dismissive. I'll edit to make clearer that I appreciate and value the datapoints and perspectives. I am keen to get feedback and suggestions in any form. I take the datapoints and perspectives that Luke shared seriously, and I've discussed lots of these things with him before. Sounds like you might want to share your perspective too? I'll send you a DM.

I viewed the splitting out of different threads as a substantive contribution to the debate, but I'm sorry you didn't see it that way. :) I agree that it would have been better if I'd given my take on all of the dimensions, but I don't really want to get into all of those threads right now.

"Big tent" effective altruism is very important (particularly right now)

Thanks for writing this up Luke! I think you're pointing to some important issues. I also think you and the GWWC team are doing excellent work - I'm really excited to see more people introduced to effective giving!

[Edit to add: Despite my comment below, I still am taking in the datapoints and perspectives that Luke is sharing, and I agree with many of his recommendations. I don't want to go into all of the sub-debates below because I'm focused on other priorities right now (including working on some of the issues Luke raises!).]

However, I worry that you're conflating a few pretty different dimensions, so I downvoted this post.

Here are some things that I think you're pointing to:

  1. "Particular set of conclusions" vs. "commitment to using evidence and reasoning"
  2. Size of the community, which we could in turn split into
    1. Rate of growth of the community
    2. Eventual size of the community
  3. How welcoming we should be/how diverse
    1. [I think you could split this up further.]
  4. In what circumstances, and to what degree, there should be encouragement/pressure to take certain actions, versus just presenting people with options.
  5. How much we should focus on clearly communicating EA to people who aren't yet heavily involved.

This matters because you sometimes conflate these dimensions in ways that seem wrong to me (e.g. you say that it's easier to get big with the "evidence and reasoning" framing, but I think the opposite).

"Big tent" effective altruism is very important (particularly right now)

+1 to this.

In fact, I think that it's harder to get a very big (or very fast-growing) set of people to do the "reason and evidence" thing well. I think that reasoning carefully is very hard, and building a community that reasons well together is very hard.

I am very keen for EA to be about the "reason and evidence" thing, rather than about specific answers. But in order to do this, I think that we need to grow cautiously (maybe around 30%/year) and in a pretty thoughtful way.

EA is more than longtermism

Hey, I've just messaged the people directly involved to double check, but my memory is that we did check in with some non-longtermists, including previous critics (as well as asking more broadly for input, as you note). (I'm not sure exactly what causes the disconnect between this and what Aaron is saying, but Aaron was not the person leading this project.) In any case, we're working on another update, and I'll make sure to run that version by some critics/non-longtermists.

Also, per other bits of my reply, we're aiming to be ~70-80% longtermist, and I think that the intro curriculum is consistent with that. (We are not aiming to give equal weight to all cause areas, or to represent the views of everyone who fills out the EA survey.) 

Since the content aims to represent the range of expert opinion in EA, since we encourage people to reflect on the readings and form their own views, and since we asked the community for input into it, I think the "EA Handbook" title is more appropriate for this edition than it was for the previous one.

EA is more than longtermism

I agree that all sorts of selection biases are going to be at play in this sort of project: the methodology would be a minefield and I don't have all the answers.

I agree that there's going to be a selection bias towards people who think cause prio is hard. Honestly, I guess I also believe that ethics is hard, so I was basically assuming that worldview. But maybe this is a very contentious position? I'd be interested to hear from anyone who thinks that cause prio is just really easy.

More generally, I agree that I/CEA can't just defer our way out of this problem or other problems: you always need to choose the experts or the methodology or whatever. But, partly because ethics seems hard to me, I feel better about something like what I proposed, rather than just going with our staff's best guess (when we mostly haven't engaged deeply with all of the arguments).
