
The AI Safety Fundamentals courses are among the best ways to learn about AI safety and prepare to work in the field.

BlueDot Impact facilitates the courses several times per year, and the curricula are available online for anyone to read. 

The “Alignment” curriculum was created and is maintained by Richard Ngo (OpenAI), and the “Governance” curriculum was developed in collaboration with a wide range of stakeholders.

You can now listen to most of the core readings from both courses:

AI Safety Fundamentals: Alignment
Gain a high-level understanding of the AI alignment problem and some of the key research directions which aim to solve it.


Listen online or subscribe:
Apple Podcasts | Google Podcasts | Spotify | RSS

AI Safety Fundamentals: Governance
Gain foundational knowledge for doing research or policy work on the governance of transformative AI.

Listen online or subscribe:
Apple Podcasts | Google Podcasts | Spotify | RSS

We've also made narrations for some readings from the advanced “Alignment 201” course, and we may record more later this year:

AI Safety Fundamentals: Alignment 201
Gain enough knowledge about alignment to understand the frontier of current research discussions. 

Listen online or subscribe:
Apple Podcasts | Google Podcasts | Spotify | RSS

Apply to join the “AI Safety Fundamentals Governance Course” July cohort!

Gain foundational knowledge for doing research or policy work on the governance of transformative AI.

Successful applicants will participate in the AI Governance course with weekly virtual classes, and join the AI Safety Fundamentals community.

Apply before 26th June 2023!

https://apply.aisafetyfundamentals.com/governance


Thoughts, feedback, suggestions?

These narrations were created by Perrin Walker (TYPE III AUDIO) on behalf of BlueDot Impact, with support from the rest of the team at TYPE III AUDIO.

We would love to hear your feedback. Do you find the narrations helpful? How could they be improved? What other AI safety material would you like to listen to? Please comment below, complete our feedback form, or write to team@type3.audio.

Comments

Can I promote your courses without restraint on Rational Animations? I think it would be a good idea since people can go through the readings by themselves. My calls to action would be similar to this post I made on the Rational Animations' subreddit: https://www.reddit.com/r/RationalAnimations/comments/146p13h/the_ai_safety_fundamentals_courses_are_great_you/

That sounds great to me, thanks!

Does anyone know the reasoning behind the naming change (from AGI to AI Safety Fundamentals)?

We'll aim to release a short post about this by the end of the week!

Some possible bugs:

* When I click on the "listen online" option it seems broken (using this on a Mac).

* When I click on the "AGI safety fundamentals" courses as podcasts, they take me to the "EA Forum curated and popular" podcast. Not sure if this is intentional, or if they're meant to point to a podcast containing just the course.

Thanks! Now fixed.

This is great, thanks! Listening is so much easier for me; I can easily listen and comprehend for 8+ hours a day, but with reading I get distracted easily after less than an hour, partly because the act of scanning words takes active focus, while comprehending and thinking are easy for me. (I might have something ADHD-adjacent.)

I was looking into AI text-to-speech readers before, since there's lots I'd like to read, but I couldn't find a good one. (https://www.naturalreaders.com/online/ is okay, but not ideal for me, and not near the quality of solenoid entity's readings of the Sequences.)

I also sometimes use naturalreaders. Unfortunately I find it a bit... unnatural at times.

I've been really enjoying Type III Audio's reader on this forum, though!