I work on the Longtermist Effective Altruism Community Growth cause area at Open Philanthropy; we aim to empower and grow the number of people trying to help make the long-term future as good as possible. (See this recent post for more about what we do and what’s going on with us as a team.)
About one year ago my coworkers Claire Zabel and Asya Bergal put out a call for proposals for outreach and community-building projects that look good on a longtermist worldview. This post is meant to 1) serve as an update for readers on what’s been happening with this funding program, and 2) serve as a reminder that this funding program exists, and a renewed call for proposals.
But we still have the capacity to support many more projects, and a wider variety of types of projects, than we currently do. This program is currently limited by the number and quality of applications, not by available funding, and will be for the foreseeable future. See the outlines below of projects and project areas we’re excited to see applications in.
If you have an idea for work aimed at supporting or growing the set of people who do good work on longtermist projects, we’d love to get an application from you. You can apply here. Only a brief pre-proposal is required for the initial application, and we’re open to making “planning grants” where you can apply for funding for up to 3 months just to develop an idea.
Applications are assessed on a rolling basis. See more details on the application process below.
If you have a question about this funding program, here are three ways to get it answered:
- This post is an AMA post; feel free to Ask Me Anything in the comments. I’ll be answering questions until August 19th. I’ll aim to get through all questions, but will triage according to vote count and may not get to all of them. Questions about future plans, things we should be doing differently, logistical questions, etc. are all welcome.
- I’ll also be hosting office hours on the mornings (Pacific Time) of August 18, August 19, and August 22. Sign up here to talk to me for 15 minutes in person on Zoom. (I’m friendly!)
- You can contact us at firstname.lastname@example.org for any questions you think are better suited to email. We reply to every email. If you’re not sure whether your project is a good fit for this funding opportunity, please don’t hesitate to contact us by email.
Some projects we’ve funded through this call for proposals
- Eon Essay Contest (also see the Forum post) — a contest where students, aged high-school and above, can win scholarship money by reading Toby Ord’s The Precipice and writing essays about it. Run by Neha Singh.
- Asterisk — a new quarterly magazine/journal of ideas from in and around Effective Altruism, run by Clara Collier.
- The Apollo Fellowship — a summer program for young competitive debaters, intended to introduce them to “a wide variety of philosophical and technological issues, including artificial intelligence, long-termism and existential risk, utilitarianism, and more.” Run by Jason Xiao and Sam Huang.
- Funding for EA Brazil and EA Japan to translate articles, essays and videos about effective altruism (and related ideas), including the Intro Fellowship materials, into Portuguese and Japanese, to widen the reach of these ideas. We worked with Ramiro Peres from EA Brazil, and Luis Costigan from EA Japan, on these grants.
This is just a small sample of the projects we’ve funded, chosen to showcase variety rather than for representativeness.
Statistics on application outcomes
We’ve received 94 applications overall, of which about 24 didn’t seem to be related to effective altruism or longtermism at all.
Of the remaining 70 applications, we:
- Funded 25.
- Rejected 28.
- Referred 15 to the EA Infrastructure Fund or the Long-Term Future Fund, with the applicants’ permission.
- Are still evaluating one.
- Started evaluating one, but I believe the applicant withdrew (this application was handled by a colleague).
Outlines of projects and project areas we’re excited to see applications in
Some work may fall into multiple categories.
- In-person programs engaging with promising young people high-school-aged and up, e.g. retreats, summer camps, scholarships and fellowships, seminars, conferences, and workshops.
- See our previous post’s explanation of why we think this kind of work is particularly promising and what we think is involved in doing it well (including potential downsides).
- AI safety-focused meta work, i.e. aiming specifically at causing more people who are good fits for AI safety research to work on it (e.g. projects like EA Cambridge’s AGI Safety Fundamentals).
- Cause-specific meta work focused on other longtermist cause areas.
- Projects aimed at producing more excellent content on EA, longtermism, transformative technology, and similar topics, and getting it in front of many people who might become further interested in these topics. This could include:
- Blog posts
- Articles on the web
- New magazines, webzines, blogs, and media verticals
- Nonfiction books
- YouTube videos
- Many kinds of fiction (novels, web fiction, video series or TV shows…)
- Work on advertising and marketing for high-quality content on EA, longtermism, and/or transformative technology.
- Rationality-and-epistemics-focused community-building work, which could include:
- Retreats or events based around epistemics and rationality.
- Creating high-quality content around epistemics and rationality.
- Work that tries to make EA ideas and discussion opportunities more accessible outside current EA hubs, especially outside the Anglophone West. This includes proposals to translate content into non-English languages.
We’re also interested in receiving applications to our University Group Organizer Fellowship to support university groups that are aimed at these goals, including (but not limited to!):
- AI safety-focused university groups (e.g. Harvard AI Safety Team).
- Rationality or epistemics-focused university groups.
- Groups at universities outside of the Anglophone West.
See Asya and Claire’s recent post for more info about this.
More details on the application process
(This section is similar to the corresponding section in the original post.)
The application form is here. Only a brief pre-proposal (mainly a project description of <750 words) is required at this stage. If we are interested in supporting your project, we will reach out to you and invite you to submit more information.
We encourage submissions from people who are uncertain if they want to found a new project and just want funding to seriously explore an idea. In many cases we’re open to giving a “planning grant,” which is funding for up to 3 months to test and iterate on a project idea, without committing to it. We’re happy to look at multiple pre-proposals from applicants who have several different project ideas.
We may also be able to help some applicants (by introducing them to potential collaborators, giving them feedback about plans and strategy, providing legal assistance, etc.) or be able to help find others who can. We have funded, and continue to be open to, very ambitious proposals for projects that have annual budgets in the millions, including proposals to scale existing projects that are still relatively small.
There is no deadline to apply; applications are assessed on a rolling basis. We’ll leave this form open until we decide that this program isn’t worth running, or that we’ve funded enough work in this space. If that happens, we will update this post at least a month before closing the form.
As mentioned above, if you’re wondering if your project is a potential fit, or if you have any other questions, you can contact us at email@example.com.
Note that this can include outreach or community-building work about EA in general that doesn’t focus on longtermism in particular, or outreach aimed at particular longtermist cause areas. ↩︎