
Introduction

In this grants report, the Longtermism Fund team is pleased to announce that the grants listed below have been recommended by Longview and are in the process of being disbursed.

This report will provide information on what the grants will fund, and why they were made. It was written by Giving What We Can, which is responsible for the Fund's communications. Longview Philanthropy is responsible for the Fund's research and grantmaking.[1]

We would also like to acknowledge and apologise for this report being released two months later than we would have liked, in part due to delays in disbursing these grants. In future, we will aim to take potential delays into account so that we can better keep to our target of releasing a report every six months.

Scope of the Fund

These grants were decided through the general grantmaking process outlined in our previous grants report and the Fund’s launch announcement.

As a quick summary, the Fund supports work that:

  1. Reduces existential and catastrophic risks, such as those coming from misaligned artificial intelligence, pandemics, and nuclear war.
  2. Promotes, improves, and implements key longtermist ideas.

In addition, the Fund focuses on organisations with a compelling and transparent case in favour of their cost-effectiveness, and/or that will benefit from being funded by a large number of donors. Longview Philanthropy decides the grants and allocations based on its past and ongoing work to evaluate organisations in this space.    

Grantees

AI interpretability work at Harvard University — $110,000

This grant supports Martin Wattenberg and Fernanda Viégas in developing their AI interpretability work at Harvard University. It aims to fund research that improves our understanding of how modern AI systems function; better understanding how these systems work is among the more straightforward ways to help ensure they are safe. Profs. Wattenberg and Viégas have a strong track record (both have excellent references from other experts), and their future plans are likely to advance the interpretability field.

Longview: “We recommended a grant of $110,000 to support Martin Wattenberg and Fernanda Viégas’ interpretability work on the basis of excellent reviews of their prior work. These funds will go primarily towards setting up a compute cluster and hiring graduate students or possibly postdoctoral fellows.” 

Learn more about this grant.

ARC Evals — $220,000

The evaluations project at the Alignment Research Center (“ARC Evals”) works on “assessing whether cutting-edge AI systems could pose catastrophic risks to civilization.” ARC Evals is contributing to the following AI governance approach:

  • Before a new large-scale system is released, assess whether it is capable of potentially catastrophic activities.
  • If so, require strong guarantees that the system will not carry out such activities.

ARC Evals works primarily on the first step of this approach.

The organisation is relatively new and is now scaling up after early success. For example, ARC Evals built partnerships with the frontier labs OpenAI and Anthropic to evaluate GPT-4 and Claude for certain dangerous capabilities prior to their release. As of publication, the organisation has substantial room for more funding: on the order of millions of dollars to support its plans over the coming 18 months.

Longview: “We recommended a grant of $220,000 to ARC Evals on the basis of ARC Evals’ strong plan for contributing to AI governance and promising early progress. These funds will go primarily towards staff costs, and possibly computation, depending on ARC Evals’ overall fundraising.”

Learn more about ARC Evals.

Nuclear Threat Initiative’s Biosecurity Programme (NTI | Bio) project to develop a research agenda for disincentivising state biological weapons programmes — $100,000

This grant will support NTI | Bio for a specific project aiming to strengthen international capabilities to uphold the norm against bioweapons development and use. Concretely, this involves organising a workshop with leading experts on the topic to develop a list of key recommendations. To read about the kind of work involved in this project, we recommend reading the NTI | Bio paper “Guarding Against Catastrophic Biological Risks: Preventing State Biological Weapon Development and Use by Shaping Intentions”. The grant is restricted to this project.[2]

Longview: “We recommended a grant of $100,000 to support this work on the basis that it was likely the most promising work which NTI | Bio would not otherwise have funding available for, and NTI’s track record of running similar projects. These funds will go primarily towards the workshop, with a smaller portion towards staff costs.” 

Learn more about NTI | Bio.

Center for Communicable Disease Dynamics (CCDD) — $80,000

This grant provides funding for CCDD to employ a Director of Research and Administration. The role acts as a force multiplier on all of CCDD’s work, which Longview, having reviewed it several times over the last few years, believes is impactful.

CCDD’s research contributes to planning for and reducing the chance of global catastrophic biological risks. This includes informing policy (such as by estimating disease spread), researching vaccine trials (including publishing the original research on the potential value of human challenge trials to address COVID-19), and training future epidemiologists (CCDD has trained several Epidemic Intelligence Service officers). Its director, Professor Marc Lipsitch, spends around a quarter of his time as Senior Advisor to the CDC’s Center for Forecasting and Outbreak Analytics, of which he was founding co-director; former CCDD Postdoctoral Research Fellow Rebecca Kahn was also on the founding team. Prof. Lipsitch is also a nuanced contributor to important debates, such as those around research with the potential to be used for both good and harm. Donors can learn more about these topics via his appearances on various podcasts and media.

This grant helps fill a particular funding gap that CCDD reported could otherwise be difficult to close: CCDD is mostly funded by the US government and large foundations, but this funding is generally restricted to direct research rather than to the operational and administrative roles that support it.

Longview: “We recommended a grant of $80,000 to support Laurie Coe’s position as CCDD’s Director of Research and Administration on the basis that this will be a force multiplier on work increasing the world’s readiness for and reducing the chance of catastrophic pandemics, and because CCDD has a pressing need for funding to support this role.”

Learn more about the Center for Communicable Disease Dynamics.

Carnegie Endowment for International Peace (CEIP) — $52,000

This grant supports CEIP to run a project aiming to develop a common understanding of escalation pathways to nuclear war and of which policy interventions are most likely to mitigate the risk. More specifically, it will help fund research workshops in which a diverse range of experts from fields relevant to nuclear security and risk analysis convene to analyse potential escalation pathways, estimate their likelihood, identify potential levers to reduce or mitigate the risk, and compare these pathways and levers more holistically.

The project will be run by James Acton and Jamie Kwong, and will result in a report with policy recommendations as well as outreach to decision makers to promote these policy changes.

Longview: “We recommended a grant of $52,000 to support the project on escalation pathways on the basis of its direct relevance to reducing the most extreme risks from nuclear weapons, and the CEIP team’s strong track record of high-quality analysis which is taken seriously by policymakers. These funds will go primarily towards workshops and project staff time.”    

Learn more about this grant.

Conclusion

The Fund is approaching the end of its first year, and the team is extremely grateful to the 598 donors who have together contributed over US$750,000 so far. We can all help solve funding constraints: your donations, and your advocacy, can make an enormous difference in protecting the lives of future generations.

  1. ^

    Prior to disbursing funds, we need to conduct due diligence on the grantee (and occasionally the grantee needs to conduct due diligence on us) and form a grant agreement. For this report, we decided to delay publishing until the grants were further through this process than in the last round of grants. Each of the August 2023 grants has passed the due diligence stage, and the final grant agreements are in the process of being signed as of publication. We share this because, in retrospect, we shouldn't have stated in the last grants report that the funds would be paid out in January, as we didn't have full control over this (the due diligence and grant agreement processes were still ongoing).

  2. ^

     This grant is restricted to supporting work that is unlikely to go ahead without it. Therefore, SoGive’s recent post about the Nuclear Threat Initiative’s funding reserves was not directly relevant to the merit of the grant, and the Fund did not come to a view on the content of SoGive’s post.
