This is a quickly written post listing opportunities for people to apply for funding from funders that are part of the EA community.
Update: Effective Thesis is maintaining a more informative and up-to-date Airtable version of this list of EA funding opportunities. You can view that here. If you want to comment on the Airtable please click here (please note other commenters will be able to see your email address). You can also suggest new funding opportunities here.
I strongly encourage people to consider applying for one or more of these things. Given how quick applying often is and how impactful funded projects often are, applying is often worthwhile in expectation even if your odds of getting funding aren’t very high. (I think the same basic logic applies to job applications.)
I'm probably forgetting some opportunities relevant to longtermist and EA movement building work, and many opportunities relevant to other cause areas. Please comment if you know of things I’m missing!
This post doesn't include non-EA funding opportunities that would be well-suited to EA-aligned projects, though it'd probably be useful for someone to make a separate collection of such things.
I follow the name of each funding opportunity with some text from the linked page.
I wrote this post in a personal capacity, not as a representative of any of the orgs mentioned.
Currently open funding opportunities from Open Phil
“We are seeking proposals from applicants interested in growing the community of people motivated to improve the long-term future via the kinds of projects described below.
Applications are open until further notice and will be assessed on a rolling basis. If we plan to stop accepting applications, we will indicate it on this page at least a month ahead of time.
See this post for additional details about our thinking on these projects.”
“Apply here (see below for details regarding application deadlines).
This program aims to provide support for highly promising and altruistically-minded students who are hoping to start an undergraduate degree at one of the top universities in the USA or UK (see below for details) and who do not qualify as domestic students at these institutions for the purposes of admission and financial aid.”
“This program aims to provide grant support to academics for the development of new university courses (including online courses). At present, we are looking to fund the development of courses on a range of topics that are relevant to certain areas of Open Philanthropy’s grantmaking that form part of our work to improve the long-term future (potential risks from advanced AI, biosecurity and pandemic preparedness, other global catastrophic risks), or to issues that are of cross-cutting relevance to our work. We are primarily looking to fund the development of new courses, but we are also accepting proposals from applicants who are looking for funding to turn courses they have already taught in an in-person setting into freely-available online courses.
Applications are open until further notice and will be assessed on a rolling basis.
“This program aims to provide support - primarily in the form of funding for graduate study, but also for other types of one-off career capital-building activities - for early-career individuals who want to pursue careers that help improve the long-term future and who don’t qualify for our existing program focused on careers related to biosecurity and pandemic preparedness.
Apply here by June 1st, 2022, at 11:59 p.m. Pacific Time (extended from January 21st originally).
We will review applications and make decisions on a rolling basis, so we encourage early applications. Generally speaking, we aim to review proposals within at most 6 weeks of receiving them, although this may not prove possible for all applications. Candidates who require more timely decisions can indicate this in their application forms, and we may be able to expedite the decision process in such cases.”
“This program aims to provide flexible support for a small group of people early in their careers to pursue work and study related to global catastrophic biological risks (GCBRs), events in which biological agents could lead to sudden, extraordinary, and widespread disaster. Our goal is to reduce risks to humanity’s long-run future, and this opportunity is aimed at people whose chief interest is GCBRs as they relate to the impact on the very long-run future.
Applications are due here by January 1st, 2022, at 11:59 p.m. Pacific Time. We will review applications and make decisions on a rolling basis.”
(They also previously provided a similar batch of funding: Early-Career Funding for Global Catastrophic Biological Risks — Scholarship Support (2018).)
“The Open Phil AI Fellowship is a fellowship for full-time PhD students focused on artificial intelligence or machine learning.
Applications are due by Friday, October 29, 2021, 11:59 PM Pacific time. Letters of recommendation are due exactly one week later, on Friday, November 5, at 11:59 PM Pacific time. Click the button below to submit your application:
Please ask your recommenders to submit letters of recommendation using this form:
With this program, we seek to fully support a small group of the most promising PhD students in AI and ML who are interested in research that makes it less likely that advanced AI systems pose a global catastrophic risk. Fellows receive a $40,000 stipend, $10,000 in research support, and payment of tuition and fees, each year, starting in the year of their selection until the end of the 5th year of their PhD.
Decisions will be sent out before March 31, 2022.
If you have questions or concerns, please email email@example.com.
Read on for more information about the Open Phil AI Fellowship.”
“As part of our work on reducing potential risks from advanced artificial intelligence, we are seeking proposals for projects working with deep learning systems that could help us understand and make progress on AI alignment: the problem of creating AI systems more capable than their designers that robustly try to do what their designers intended. We are interested in proposals that fit within certain research directions, described below, that we think could contribute to reducing the risks we are most concerned about.
Anyone is eligible to apply, including those working in academia, industry, or independently. Applicants are invited to submit proposals for up to $1M in total funding covering up to 2 years. We may invite grantees who do outstanding work to apply for larger and longer grants in the future.
Proposals are due January 10, 2022.
If you have any questions, please contact firstname.lastname@example.org.”
Currently open funding opportunities that aren’t from Open Phil
Recall that this is not exhaustive, and that I welcome comments mentioning things I missed.
“If you have a project you think will improve the world, and it seems like a good fit for one of our Funds, we encourage you to apply.
Grant sizes are typically between $5,000 and $100,000, but can be as low as $1,000 and higher than $300,000. EA Funds can make grants to individuals, non-profit organizations, academic institutions, and other entities. You do not need to be based in the US or the UK to apply for a grant. If you are unsure whether you are eligible to apply for a grant, please email email@example.com.
We sometimes meet people who did not apply because they thought they would not be funded. Some of them eventually applied and were funded, despite their doubts, because we were excited by their projects. Applying is fast and easy; we really do encourage it!
EA Funds is always open to applications.
EA Funds will consider funding applications from grantseekers who wish to remain anonymous in public reporting.
You can also suggest that we give money to other people, or let us know about ideas for how we could spend our money. Suggest a grant.”
“Survival and Flourishing (SAF; /sæf/) is a newly formed Sponsored Project of Social and Environmental Entrepreneurs, a 501(c)(3) public charity (proof of sponsorship; proof of charity status). SAF’s mission is to secure funding and fiscal sponsorship for projects that will benefit the long-term survival and flourishing of sentient life, including but not limited to humans.
SAF works closely with the Survival and Flourishing Fund (SFF), a donor advised fund with a similar mission and overlapping leadership. While we share no formal relationship with SFF, SAF and SFF have complementary functions:
- SAF as a general rule does not make grants to 501(c)(3) public charities, while SFF does, and
- SFF as a general rule does not make grants to individuals, while SAF does.”
“Sign up to our newsletter to be notified of future funded project rounds!”
“Emerging technologies have the potential to help life flourish like never before – or self-destruct. The Future of Life Institute is delighted to announce a $25M multi-year grant program aimed at tipping the balance toward flourishing, away from extinction. This is made possible by the generosity of cryptocurrency pioneer Vitalik Buterin and the Shiba Inu community.
COVID-19 showed that our civilization is fragile, and can handle risk better when planning ahead. Our grants are for those who have taken these lessons to heart, wish to study the risks from ever more powerful technologies, and develop strategies for reducing them. The goal is to help humanity win the wisdom race: the race between the growing power of our technology and the wisdom with which we manage it.
We are excited to offer a range of grant opportunities within the areas of AI Existential Safety, Policy/Advocacy and Behavioral Science.
Our AI Existential Safety Program is launching first. Applications for PhD and Postdoctoral Fellowships are being accepted in the fall of 2021. We are working to build a global community of AI Safety researchers who are keen to ensure that AI remains safe and beneficial to humanity. You can see who is already part of the community on our website here.
We have a dedicated fund to support promising projects and individuals. The Center on Long-Term Risk Fund (CLR Fund) operates in line with our mission to build a global community of researchers and professionals working to do the most good in terms of reducing suffering.