I have a friend who is casually engaged with EA. They went on https://funds.effectivealtruism.org/grants and were browsing some of the grants and asking me about them. I get the impression this was well-intentioned curiosity, but I was at a loss to explain the dollar amount of some of the grants my friend pointed out to me, in light of the short "project description" provided.

The last thing I want to do in this post is call anyone out-- I'm sure the rationale behind these grants was sound, and there is relevant information missing from the provided descriptions-- but I was surprised to find out that there are grants for university group organizing in the five and six figures, and that some of these are not even for an entire year. I do think this is something that will (perhaps justifiably) raise eyebrows for the average person if they are just learning about EA as a movement very focused on cost-effectiveness and haven't yet internalized some of the expected value calculations that probably went into these grants. But also, in a couple of cases, I personally am having a hard time imagining how these numbers make sense.

If you are reading this post and willing to comment-- could you (1) help me make sense of these grants for myself and (2) provide any pointers on how to explain them to someone who isn't yet totally on board with EA? I don't want to point to specific grants, but concretely: what, for example, is the argument for a five- or six-figure grant for one semester of university organizing at a single school? I don't understand how so much money could be needed. As far as I'm aware, most organizers are volunteers (but maybe that is changing?). Happy to take this to a private conversation if that would be more appropriate.

Comments (18)

(Removed this comment. Don't know how to delete it.)

I’ve been considering writing a post about my experience of receiving a grant, and the downsides I didn’t anticipate beforehand. It would probably look similar to this comment but with more info.

I imagine that such a post could be quite helpful for other young people who are considering applying for funding, and it could also be helpful for other people to understand more of this "ecosystem." I, for one, would be interested to read your story.

Thanks for sharing your experience. I'm sure I would have also felt shame and guilt if I were in your situation, though obviously this is not what we want to happen!

My general feeling about situations like this is that there are some grants that are better off not being shared publicly, if the context allows for it (this depends on many complex social factors). Wealthy people spend money on all kinds of outlandish things all over the world, yet receive comparatively little opprobrium, simply because this spending is rarely public. It's unfair for you to be exposed to the vitriol of regular people expressing their frustration with inequality.

I'm reluctant to say too much about your particular circumstance (given I don't have context, and this is quite a personal thing), but I think if it were me, I might look for ways to tactfully omit discussion of the grant when first getting to know non-EAs socially. Not because it *is* shameful, but because it may unconsciously make some people uncomfortable. If it does come up, I think there is a way to "check your privilege" while also expressing confidence that you did nothing wrong. I've found, ironically, that if I express contrition about something, people are more likely to think I actually did something shameful, whereas if I sound confident, they tend to have a positive impression of me. These aren't necessarily bad people; that's just how humanity is.

While socializing with EAs is wonderful, I agree that it is better to have a diverse social circle including non-EAs too!

This could be titled "The curse of non-consequentialist ethics plus social media means that there is no reasonable way to prioritize what matters, and the news contributes to that by essentially equalizing all crises under similar names, especially in the headline."

A bit of a sidestep, but there is also the new Longtermism Fund, for more legible longtermist donations that are probably easier to justify.

I think it's easy to miss the forest for the trees. Unless I've missed something:

  1. Before 2022, all EA outreach/infrastructure funding in total has cost <<$200M,
    1. It's likely <$100M, but it's hard to tell because some funding programs blur the lines between outreach/infra and direct work, e.g. paying for someone's PhD program.
    2. Notably, this is lower than Open Phil's spending on criminal justice reform.
  2. EA outreach funding has likely generated substantially >>$1B in value, and
  3. EA outreach is an area where we expect significant lags between spending and impact.
    1. For example, Sam Bankman-Fried graduated MIT 8 years ago[1].

Raising the salience of our moral obligations, and of our empirical worldviews about how to do good, to future billionaires and future top researchers (or current billionaires and current top researchers) is by its very nature an extremely hits-based proposition.

If you're only looking at a budget very loosely, it seems silly to complain about hundreds of thousands of dollars of spending when billions of dollars of forgone opportunities are on the line.

Now, if you're looking at budgets in detail and investigating programs closely, I think it's reasonable to be skeptical of some types of spending (e.g. if people are overpaid, or eating overly fancy food, or not trying to save money on flights, or whatever). It's probably even more important to be skeptical of weak theories of change, or of poor operational execution.

[1] I think SBF donating to utilitarian/LT stuff was probably overdetermined, so it's not something we can say EA outreach was useful for. However, I do not think this is true for everyone who is extremely high impact. I think one of the strongest cruxes for the value of EA outreach etc. is whether or not "our top people" would be drawn to EA without any active outreach. Current evidence suggests the outreach is pretty relevant.

"EA outreach funding has likely generated substantially >>$1B in value"

Would be curious how you came up with that number. 

It was a very quick lower bound. From the LT survey a few years ago, roughly ~50% of influences on quality-adjusted work in longtermism were from EA sources (as opposed to individual interests, idiosyncratic non-EA influences, etc.), and of that slice, maybe half is due to things that look like EA outreach or infrastructure (as opposed to, e.g., people hammering away at object-level priorities getting noticed).

And then I think about whether I'd a) rather have all EAs except one disappear and have $4B more, or b) have $4B less but double the quality-adjusted number of people doing EA work. And I think the answer isn't very close.
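Spelling out the rough arithmetic behind that lower bound (a back-of-the-envelope sketch using only the figures in this comment; the $4B is the willingness-to-trade number from the thought experiment above, not a measured quantity):

    # Rough reconstruction of the ">$1B from EA outreach" lower bound
    ea_influence_share = 0.5   # ~50% of influences on quality-adjusted longtermist work were EA sources
    outreach_share = 0.5       # of that slice, roughly half looks like outreach/infrastructure
    workforce_value_usd = 4e9  # doubling the current workforce judged clearly worth more than $4B

    value_from_outreach = ea_influence_share * outreach_share * workforce_value_usd
    print(f"~${value_from_outreach / 1e9:.0f}B attributable to outreach")  # ~$1B, as a lower bound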

I'm not involved with EA Funds, but some university group organizers have in the past taken a semester of leave to do group organizing full time. If you assume the term is 14 weeks, that's 14*40 = 560 hours of work. At $20/hr, that's more than $10,000. And I think it is pretty reasonable to request more than $20/hr (various funding sources have previously offered something like $30/hr).
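As a quick sketch of that arithmetic (assuming the 14-week term and 40 hours/week figures above, which are of course just one plausible set of numbers):

    # Back-of-the-envelope semester stipend for a full-time student organizer
    WEEKS_PER_SEMESTER = 14
    HOURS_PER_WEEK = 40

    def semester_stipend(hourly_rate: float) -> float:
        """Total pay for one semester of full-time organizing at the given hourly rate."""
        return WEEKS_PER_SEMESTER * HOURS_PER_WEEK * hourly_rate

    for rate in (20, 30):
        print(f"${rate}/hr -> ${semester_stipend(rate):,.0f}")
    # $20/hr -> $11,200
    # $30/hr -> $16,800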

In general, nowadays, many group organizers are not volunteers and are paid for their part-time work (though if they are not full time, this shouldn't amount to five figures for one semester). I think this is a good thing, since many university students simply cannot afford to take on a volunteer commitment of 10+ hours per week, and I wouldn't want EA groups to be run only by people who are rich enough that that's feasible.

The numbers that I am confused about are in the high 5 figures and low 6 figures, about an order of magnitude bigger than $10,000. I don't think assuming a salary of $30/hour helps me understand or explain these numbers. I brought up volunteering vs. paid work in the OP, and I think this was probably misleading-- sorry about that.

However, on that point:

I agree that we don't want EA groups to be run only by the financially privileged. But this concern needs to be balanced against the fact that EA in general, and EA university group organizing in particular, (probably) already selects for high-SES people, and there may be better ways of making participation in EA accessible to everyone. There is already some level of SES barrier for college students to maneuver themselves into a position to start receiving funding for this work, so you are already getting a filtered sample by the time the money starts flowing. This is a difficult problem to solve, but I hope people are mindful of it.

Yeah, I wasn't sure which grants you were referring to (I haven't looked through them all), but indeed that doesn't seem to be explained by what I said.

I agree that EA already selects for high-SES people and that offering funding for them to organize a group doesn't negate this problem; other steps are also needed. However, I know of quite a few anecdotal cases of group organizers being able to organize more than they otherwise would have because they were being paid, so this policy does concretely make some difference.

If you PM me I'm happy to send you the list. Like I said in the post, I don't believe it would be productive to post it publicly.

I think this is a wise decision, and I disagree with those claiming that publicly criticising grant receipts is a good idea.

I think I want to take the side of "public criticisms are good." I think past examples in this genre (e.g. Larks' AI Alignment Literature Review and Charity Comparison and Nuño's 2018-2019 LTFF Grantees: How did they do?) were substantially net positive.

Good question, and I think this is definitely a healthy discussion. In general, money is a sensitive issue, and I would encourage all parties to show nuance; this includes (but is not limited to) when "judging" someone's salary, when asking for a salary, and when granting one.

Two steelmen for decent, chunky grants:

  1. Bounded loss and unbounded wins - while theoretically salaries could be cut in half, impact could easily be 10-100x. I.e., the focus should be on opportunity cost, not expenditure.
  2. Many smart people in EA, and the people granting, may previously have been earning significant salaries as programmers/executives/consultants. You and I may see 80k USD as a lot of money, but it's pretty normal for developers in Cali to earn hundreds of thousands of USD. Therefore, expecting people to earn 50k a year may effectively be asking them to donate 75% of their income.

And two steelmen for keeping salaries low:

  1. This is a movement about charity, helping others, and donating. We put a lot of effort and time into building healthy communities around these principles, built on a heavy basis of trust. It's important to feel like people are in it for the right reasons, and high salaries can jeopardise that.
  2. It's pretty easy to justify a high salary with some of the above reasoning - perhaps too easy. As a community builder myself, it seems totally plausible we could attract people who are a poor fit for EA by being too relaxed with the money pedal.

As for my own personal opinion: I think it's far too easy to ignore opportunity cost and concentrate on short-term expenditure and salary. However, I can very much imagine myself leaving the community if salaries became too inflated, and I am likely to feel less aligned with others who require large salaries (just being honest here). Looking at the recently posted receipts, I don't see anything that catches my eye in a bad way, although it could be said to be unfair that some community builders will be working 3x harder on a volunteer basis than other community builders on a competitive salary. I think this partially reflects the incentives that produced the world we currently live in (i.e. a largely unaltruistic one).

Whilst I find the arguments for working hard and concentrating on impact, rather than on earning little, pretty compelling - it's worth pointing out that there's some fantastic work coming out of Charity Entrepreneurship charities (whose employees generally earn little), so it's not clear the tradeoff is always present.

Lastly, I would say it's likely that I've made tradeoffs with my own salary which have significantly negatively affected my social impact. I suspect this is easy to do, and I would encourage people to avoid falling into this trap.

If you search the forum for the EAIF tag you can get some more details on past grants. I'm not sure if this gives you quite what you're looking for.

https://forum.effectivealtruism.org/topics/effective-altruism-infrastructure-fund?sortedBy=magic

I haven't browsed the grants in much detail myself, but I would default to trying to explain EA's culture of thoroughness by reference to, e.g., GiveWell's detailed evaluations of various charities, and say "this is more depth than most grants go into, but it sets the tone for the sorts of things people tend to look for".

You could also point out common biases that the person might be falling for. One thing I would be inclined to explain in particular is the bikeshedding bias (https://thedecisionlab.com/biases/bikeshedding) -- it's much easier to critique things we understand. The simplest-looking grants (like university group support) are ones which I can imagine are particularly subject to bikeshedding.

Another thing I would be inclined to explain is the idea of continuing to invest (possibly exponentially) in things that work. E.g., if an intervention has shown it made good use of $10k in the past, maybe try giving it $100k and see if it can do 10x as much good, or close to it. A related bias is the absurdity heuristic (e.g. ruling good ideas out because "they seem kind of crazy").
