
The Long-Term Future Fund is pleased to invite applications to our Q1 2025 funding round. We are seeking to support individuals and organizations working to help humanity navigate existential risks or to improve humanity's long-term flourishing.

Apply now

The deadline for applying is February 15th. We will respond to applications submitted before then by March 31st, and expect to have most funding disbursed by April 30th.

Why Grant Rounds?
After a period of accepting applications on a rolling basis, we're trialling a return to focused grant rounds. I am hoping this will:

  • Allow us to recalibrate our funding thresholds given a rapidly changing giving environment
  • Provide donors with clearer feedback on the impact of their contributions
  • Offer applicants predictable response timelines

For particularly time-sensitive opportunities, we maintain a fast-track application process, though we expect the bar to be substantially higher for funding through this channel (i.e., your odds of receiving funding are lower if you apply through the fast-track application).

What We Fund
We continue to focus on reducing existential risk (x-risk) and supporting work that could radically improve humanity's future. In practice, we expect the clear majority of projects we support to be focused on reducing x-risk from AI. Example focus areas include:

  • Research and analysis on x-risk and transformative technologies, with a focus on existential risks from Artificial Intelligence
  • Ecosystem support to help others work on x-risk and transformative technologies
  • Field-building and improving epistemics both within the extended ecosystem of people working on the long-term future, and among important decision-makers
  • Efforts to improve global governance of transformative technologies, again with a particular focus on AI

Who Can Apply?
Individuals and organizations worldwide are welcome to apply. We frequently fund:

  • Novel research projects
  • Independent researchers and writers
  • Early-stage organizations
  • Event organizers
  • Promising individuals seeking career transitions into x-risk work

Timeline

  • Application deadline: February 15th, 2025
  • Decisions communicated: By March 31st, 2025
  • Funding disbursed: April 2025

Giving Environment
The funding environment has changed considerably over the past year. Our sense is that there are now many more highly impactful potential donation targets, while the availability and flexibility of funding for many of those projects is substantially reduced. We aim to use this round to recalibrate our bar for funding and to focus our limited evaluator attention more effectively.

What happens to existing applications?
We aim to get back to all applications currently in our pipeline before February 1st. We will fund the highest-priority applications on an expedited basis, but expect to defer most other promising applications in the pipeline to be evaluated in the grant round. We will email existing applicants with instructions on how to make sure they are included in the round if they want to be, and when they should expect a response from us.

What if I miss the application deadline?
That’s fine – we will continue to have an always-open application. If you submit your application after the deadline has passed, we’ll simply evaluate your application in the next funding round, which will likely be staggered from this round by a few months.

Questions?
We will announce an Ask Me Anything (AMA) soon, likely answering questions in January. In the meantime, you’re welcome to:

We will also be updating the website and application form with new details in the next few days; please bear with us if the website and this announcement do not exactly match.

We look forward to reviewing your applications! And if you know someone doing valuable work in this space, please encourage them to apply.

Comments (12)



Thanks for the update! I appreciate that decisions will be communicated within 1.5 months of the application deadline.

Btw, this URL (https://funds.effectivealtruism.org/funds/far-future/apply) you link to leads to "Page not found".

Thanks. Should now be fixed!

Hey, this is still (again?) 404 - is there some other fast track application option?

Appreciate it! @BrianTan and others, feel free to use this thread as a way to report other issues and bugs with the website/grant round announcement. 

The grant application page reads "The EA Infrastructure Fund and the Long-Term Future Fund are currently unable to make grants with an end date after August 31st 2025, and any applications to these funds must have a grant period which ends on or before this date."

If a grantee has funds left over at the end of August, is it expected that the balance of the funds would need to be returned at that point? Or some other default behaviour?

I'm thinking of this from the perspective of managing a project that has had several grants made and how much they need to prioritise spending down any LTFF grant.

@calebp @Linch 

  1. Can someone from LTFF confirm if this restriction (that the grant period should end on or before Aug 31, 2025) will be set for all applications in this Q1 grant round? (When I first saw the note referenced above around December, I thought it would lift before the Feb 15 deadline, but it seems like it's not going to be lifted by then?)
  2. I also have the same question as Pip above!

We have actually just managed to lift that restriction for the LTFF; the application form and website should be updated to reflect this in the next week. This means that no balance will need to be returned after August if the project is still active.

What caused the restriction?
I'm noticing I'm confused. I have no hypothesis for what could cause that sort of restriction.

So to be clear, are we able to apply past the end date of August 31 in our initial grant application, or are we capped to August 31 with the ability to extend the grant later? The application form still has the August 31 figure. Thanks!

I'm not from LTFF but I believe you can apply for funding with an end date past Aug 31! The application form now mentions it's only the EAIF that has that restriction.

Awesome, thanks for the update :-)

This funding opportunity has been added to the EA Opportunities Board!
