This is a linkpost for https://grants.futureoflife.org/

Epistemic status: Describing the fellowship that we are a part of and sharing some suggestions and experiences.

The Future of Life Institute is launching its 2023 cohort of PhD and postdoctoral fellowships to study AI existential safety: that is, research that analyzes the most probable ways in which AI technology could cause an existential catastrophe and which types of research could minimize that risk, as well as technical research which could, if successful, help humanity reduce the existential risk posed by highly impactful AI technology to extremely low levels. More information about the 2022 cohort can be found here.

The Vitalik Buterin PhD Fellowship in AI Existential Safety is targeted at students applying to start their PhD in 2023, or existing PhD students who would not otherwise have funding to work on AI existential safety research. Quoting from the page:

At universities in the US, UK, or Canada, annual funding will cover tuition, fees, and the stipend of the student's PhD program up to $40,000, as well as a fund of $10,000 that can be used for research-related expenses such as travel and computing. At universities not in the US, UK or Canada, the stipend amount will be adjusted to match local conditions. Fellows will also be invited to workshops where they will be able to interact with other researchers in the field.

In addition, applicants who are short-listed for the Fellowship will be reimbursed for application fees for up to 5 PhD programs, and will be invited to an information session about research groups that can serve as good homes for AI existential safety research.

Applications for the PhD fellowship close on November 15, 2022.

The Vitalik Buterin Postdoctoral Fellowship in AI Existential Safety is for postdoctoral appointments starting in fall 2023. Quoting from the page:

For host institutions in the US, UK, or Canada, the Fellowship includes an annual $80,000 stipend and a fund of up to $10,000 that can be used for research-related expenses such as travel and computing. At universities not in the US, UK or Canada, the fellowship amount will be adjusted to match local conditions.

Applications for the postdoctoral fellowship close on January 2, 2023.

We (Cynthia Chen and Zhijing Jin) are two of the fellows from the 2022 class, and we strongly encourage anyone who might be a good fit to apply! We especially appreciate these aspects of the fellowship:

  1. Having access to the broad and vibrant AI existential safety network at FLI.
  2. Participating in seminars and exchanging insights about AI safety with peers and professors.
  3. Having the freedom to work on the most important AI safety problems during our PhD, without constraints from supervisors.
  4. If you're applying to PhD programs this year, having a fellowship that fully funds your research can give you a significant advantage in your applications.

You can apply at grants.futureoflife.org, and if you know people who may be a good fit, please help spread the word!
