AI has enormous beneficial potential if it is governed well. However, in line with a growing contingent of AI (and other) experts from academia, industry, government, and civil society, we also think that AI systems could soon (e.g. in the next 15 years) cause catastrophic harm. For example, this could happen if malicious human actors deliberately misuse advanced AI systems, or if we lose control of future powerful systems designed to take autonomous actions.[1]

To improve the odds that humanity successfully navigates these risks, we are soliciting short expressions of interest (EOIs) for funding for work across six subject areas, described here.

Strong applications might be funded by Good Ventures (Open Philanthropy’s partner organization), or by any of more than 20 (and growing) other philanthropists who have told us they are concerned about these risks and are interested in hearing about grant opportunities we recommend.[2] (You can indicate in your application whether we have permission to share your materials with other potential funders.)

Click here to read the full RFP (request for proposals).

As this is a new initiative, we are uncertain about the volume of interest we will receive. Our goal is to keep this form open indefinitely; however, we may need to temporarily pause accepting EOIs if we lack the staff capacity to properly evaluate them. We will post any updates or changes to the application process on this page.

Anyone is eligible to apply, including those working in academia, nonprofits, industry, or independently.[3] We will evaluate EOIs on a rolling basis. See below for more details.

If you have any questions, please email us. If you have any feedback about this page or program, please let us know (anonymously, if you want) via this short feedback form.


Might be worth defining RFP = request for proposal.

Executive summary: Open Philanthropy is soliciting funding proposals for work aimed at mitigating catastrophic risks from advanced AI systems, focusing on six key subject areas related to AI governance and policy.

Key points:

  1. Eligible subject areas include technical AI governance, policy development, frontier company policy, international AI governance, law, and strategic analysis.
  2. Proposal types can be research projects, training/mentorship programs, general support for existing organizations, or other projects.
  3. Evaluation criteria include theory of change, track record, strategic judgment, project risks, cost-effectiveness, and scale.
  4. Application process begins with a short Expression of Interest (EOI) form, followed by a full proposal if invited.
  5. Funding is open to individuals and organizations globally, with typical initial grants ranging from $200k-$2M/year over 1-2 years.
  6. Open Philanthropy aims to respond to EOIs within 3 weeks and may share promising proposals with other potential funders.


This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.
