I currently lead EA Funds.

Before that, I worked on improving epistemics in the EA community at CEA (as a contractor), as a research assistant at the Global Priorities Institute, on community building, and on global health policy.

Unless explicitly stated otherwise, opinions are my own, not my employer's.

You can give me positive and negative feedback here.





Answer by calebp

Hi Markus,

For context I run EA Funds, which includes the EAIF (though the EAIF is chaired by Max Daniel not me). We are still paying out grants to our grantees — though we have been slower than usual (particularly for large grants). We are also still evaluating applications and giving decisions to applicants (though this is also slower than usual). 

We have communicated this to the majority of our grantees, but if you or anyone else reading this urgently needs a funding decision (in the next two weeks), please email caleb [at] effectivealtruismfunds [dot] org with URGENT in the subject line, and I will see what I can do. Please also include:

  • the name of the application (from previous funds email subject lines),
  • the reason the request is urgent, and
  • the latest decision and payout dates that would work for you, such that if we can't make these dates there is little reason to make the grant.

You can also apply to one of Open Phil's programs; Open Philanthropy's program for grantees affected by the collapse of the FTX Future Fund may be particularly relevant to people applying to EA Funds in the wake of the FTX crash.

That all makes sense. I will very likely have a think about who to refer.

It's probably good to note that Manifold is not an EA organisation, and Manifest (if I understand correctly) was not funded by EA funders or branded as an EA event (though I think Manifest was trying to channel some of the EA vibe, and many attendees were involved in EA in some way).

I think Oliver has already talked with the writer on Twitter!

There are many factual inaccuracies in this post. Oliver Habryka (who runs Lightcone) wrote a tweet thread explaining some of the most egregious errors.

For example, contrary to the article's claim, no FTX funds were used in the purchase of Lighthaven. According to Kelsey Piper (a journalist at Vox), this "is a really, really bad mistake of the kind that should have been caught in even cursory fact checking".

I think that 80k doing strategic marketing like this sounds really interesting. I personally wasn't excited about entering (despite being excited about the prize) because I'm pretty suspicious of "chance of a trip" offers; I think text like this is much more common when the chance is actually very low than when it's high, which makes entering less worthwhile. If you'd given an explicit best-guess probability, that would have been helpful for me.

I’d guess that the vast majority of people who donate to GiveDirectly (including substantial public sums) would not describe themselves as EAs, and by their lights aren’t focussed on “doing the most good with impartiality and scope sensitivity in mind”, so I wouldn’t describe them as EAs on the basis of this donation alone. If they talk about EA explicitly in the video, it would be great to add that context above.

Obviously, not “being an EA” does not diminish this achievement. I’m excited when anyone donates to help the poor in a cost-effective manner, independent of their community affiliation, and even more excited when it’s such a large amount of money!

Thanks for posting this here, it’s very exciting and I’m looking forward to watching the video!

I think the boring answer for why we don't do as much grantmaking in this area as in technical areas is simply that we don't receive very many applications. But this isn't clearly a bad thing: there are many excellent organisations doing great work in AI policy/governance/advocacy, whereas there are only a handful of active organisations on the technical side. I often think that getting a role in an existing org is a better fit for many applicants than doing independent work or starting their own org, and I am grateful that the ecosystem for AI policy/governance/advocacy is developed enough to onboard lots of junior people rather than them having to apply for grants to do independent work.

We are trying to do more grantmaking in this space, but unfortunately, the EA brand makes publicising the grants we do make difficult. Many of our grants would count as "fieldbuilding" for AI policy/governance/advocacy, but we could make this clearer in our descriptions. LTFF fund managers, in general, are very excited about work in this area. Even if we can't fund it directly, we often try to refer it to other funders to fund, so I'd definitely encourage people to apply.

That makes sense. I currently believe that the grantee did honour their commitment regarding hours spent on this project, and if I came to believe otherwise I would be much more inclined to claw back funding.

(You didn't explicitly make this claim, but I'd like to push back somewhat on people with unsuccessful longtermist projects "slacking off". In general, my impression from speaking to grantees (including those with failed projects) is that they are overworked rather than underworked relative to "normal jobs" that pay similarly or are similarly challenging/selective.)

This isn't really against Zach's point, but presumably a lot of Greg's motivation here is to signal that he is serious about this belief, and to have a bunch of people in this community see that, for advocacy reasons. I think that taking out a loan wouldn't do nearly as well on the advocacy front as making a public bet like this.
