Now that the former FTX Future Fund team has said that "it looks likely that there are many committed grants that the Future Fund will be unable to honor," it would be useful for many of us to have alternatives.

Current large funders (such as OpenPhil) are reconsidering their grant strategies, and these donors are going to be extremely busy over the next few weeks. If you're not too funding-constrained, it seems like a good, pro-social strategy to wait out the storm and let others with urgent and important grant needs figure things out first.

However, this may not apply if you are very funding-constrained right now, or if there are grants or fellowships with deadlines coming up soon that would be useful to have on your radar.

So, what are good places to apply for funding now (and in the future)?

To start, the FLI AI Existential Safety PhD Fellowship has a deadline on November 15. 

The Vitalik Buterin PhD Fellowship in AI Existential Safety is for PhD students who plan to work on AI existential safety research, or for existing PhD students who would not otherwise have funding to work on AI existential safety research. It will fund students for 5 years of their PhD, with extension funding possible. At universities in the US, UK, or Canada, annual funding will cover tuition, fees, and the stipend of the student's PhD program up to $40,000, as well as a fund of $10,000 that can be used for research-related expenses such as travel and computing. At universities not in the US, UK or Canada, the stipend amount will be adjusted to match local conditions. Fellows will also be invited to workshops where they will be able to interact with other researchers in the field.  Applicants who are short-listed for the Fellowship will be reimbursed for application fees for up to 5 PhD programs, and will be invited to an information session about research groups that can serve as good homes for AI existential safety research.

More about the fellowship here.

What are other options?

[Link] The National Science Foundation has recently announced a $20 million grant pool for AI safety research, mostly in the areas of monitoring and robustness. Grants of up to $800,000 are available for researchers. First deadline: May 26, 2023; second deadline: January 16, 2024. (h/t CAIS)

There are some more on the EA Opportunities board if you filter for "Opportunity Type > Funding."

(Note that you can also submit opportunities.)

More opportunities:

  • The AI Safety Microgrant Round: "We are offering microgrants up to $2,000 USD with the total size of this round being $6,000 USD"; "We believe there are projects and individuals in the AI Safety space who lack funding but have high agency and potential."; "Fill out the form at Microgrant.ai by December 1, 2022."
  • Nonlinear Emergency Funding: "Some of you counting on Future Fund grants are suddenly finding yourselves facing an existential financial crisis, so, inspired by the Covid Fast Grants program, we’re trying something similar for EA. If you are a Future Fund grantee and <$10,000 of bridge funding would be of substantial help to you, fill out this short form (<10 mins) and we’ll get back to you ASAP." 