
The Stanford Existential Risks Conference will be taking place on February 26-27! Apply now!

SERI (the Stanford Existential Risk Initiative) will be bringing together the academic and professional communities dedicated to mitigating existential and global catastrophic risks — large-scale threats which could permanently curtail humanity’s future potential. Join the global community interested in mitigating existential risk for 1:1 networking, career/internship/funding opportunities, discussions/panels, talks and Q&As, and more. 

Apply at tinyurl.com/seri22apply by February 22, and refer individuals you think should attend at tinyurl.com/seri22refer!

More detail: 

Join leading academics for 1:1 networking, exclusive panels, talks and Q&As, discussion of research/funding/internship/job opportunities, and more. The virtual conference will offer ample opportunities for potential collaborators, mentors and mentees, funders and grantees, and employers and potential employees to connect with one another. 

This virtual conference will provide an opportunity for the global community interested in safeguarding the future to create a common understanding of the importance and scale of existential risks, what we can do to mitigate them, and the growing field of existential risk mitigation. Topics covered in the conference include risks from advanced artificial intelligence, preventing global/engineered pandemics and risks from synthetic biology, extreme climate change, and nuclear risks. The conference will also showcase the existing existential risk field and opportunities to get involved: careers/internships, funding, research, community, and more.

Speakers include Will MacAskill (Oxford philosophy professor and author of Doing Good Better), Sam Bankman-Fried (founder of Alameda Research and FTX), Stuart Russell (author of Human Compatible: Artificial Intelligence and the Problem of Control and Artificial Intelligence: A Modern Approach), and more!

Event Audience

This conference is aimed primarily at faculty, students, and professionals currently working in, or planning to work in, fields related to existential risk. We will also reserve spots for others who demonstrate significant familiarity with and/or interest in issues related to existential risk.

Apply Here! (~3 minutes) Refer friends/colleagues here!

This year, we will be providing more recommended connections, mentorship, resources, and support to select attendees, so if you are interested in any of these, we encourage you to fill out the application questions in more detail.

Applications are open until 11:59 pm PST on Tuesday, February 22. Accepted attendees will later receive a registration link to the live event, along with a free mailed copy of The Precipice!

The conference is entirely free of charge.

If you have any questions or want to learn more about the SERI Conference, visit sericonference.org or email seri-contact@stanford.edu.
