I created two AI Alignment playlists on YouTube: one that is slide-heavy and one that is not. I separated them for two reasons:

  1. Keeping the two kinds of videos separate is useful for a dataset I am working on.
  2. Media is easier to consume when you don’t have to pay attention to the slides and pictures someone is describing.

Not slide-heavy (currently 216 videos): https://youtube.com/playlist?list=PLTYHZYmxohXp0xvVJmMmpT_eFJovlzn0l 

Slide-heavy (currently 366 videos): https://youtube.com/playlist?list=PLTYHZYmxohXpn5uf8JZ2OouB1PsDJAk-x

If you would like to contribute by adding more videos to the playlists or creating new Alignment-relevant playlists, let me know!

If you would like access to the audio and the YouTube auto-generated subtitles in .txt format, I have stored them here: https://drive.google.com/drive/folders/1qVo4TyHKrsJvbJ3UrIOLW45j_7_wwnbZ?usp=sharing
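If you would rather pull the audio and subtitles yourself, the snippet below is a minimal sketch of how it can be done with yt-dlp's Python API (this is not necessarily how I produced the files; the `audio/` output layout is an assumption, and YouTube serves auto-subs as .vtt, so converting them to .txt is a separate step):

```python
# Minimal sketch: download audio plus YouTube's auto-generated English
# subtitles for one of the playlists, using yt-dlp's Python API.
# Requires: pip install yt-dlp, plus ffmpeg on PATH for audio extraction.
from yt_dlp import YoutubeDL

PLAYLIST = "https://youtube.com/playlist?list=PLTYHZYmxohXp0xvVJmMmpT_eFJovlzn0l"

opts = {
    "format": "bestaudio/best",            # best audio-only stream available
    "writeautomaticsub": True,             # fetch YouTube's auto-generated subs
    "subtitleslangs": ["en"],              # English only (assumption)
    "outtmpl": "audio/%(title)s.%(ext)s",  # hypothetical output layout
    "postprocessors": [
        {"key": "FFmpegExtractAudio", "preferredcodec": "mp3"},
    ],
}

with YoutubeDL(opts) as ydl:
    ydl.download([PLAYLIST])
```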

I've batched the files into buckets of roughly 90 hours each (except the final bucket, which is shorter), since I plan on loading them into otter.ai and that site only accepts 100 hours per user per month. Additionally, if you would like to help by loading some of the audio files into your own otter.ai account, please let me know! I want to create transcripts of the audio files and add them to a dataset very soon.
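In case it helps anyone replicating this, the batching is conceptually just greedy packing: walk the files in order and start a new bucket whenever adding the next file would push the running total past the limit. Here is a minimal sketch (not my exact script; it assumes mp3 files in an `audio/` folder and uses the mutagen library to read durations):

```python
# Minimal sketch: greedily pack audio files into buckets of <= 90 hours,
# staying under otter.ai's 100-hours-per-user-per-month cap.
# Requires: pip install mutagen
import os
from mutagen.mp3 import MP3

BUCKET_LIMIT_HOURS = 90.0

def duration_hours(path: str) -> float:
    """Length of an mp3 file in hours, read from its header."""
    return MP3(path).info.length / 3600.0

def batch_into_buckets(paths, limit=BUCKET_LIMIT_HOURS):
    """Walk files in order, starting a new bucket whenever the next
    file would push the running total past `limit` hours."""
    buckets, current, current_hours = [], [], 0.0
    for path in paths:
        hours = duration_hours(path)
        if current and current_hours + hours > limit:
            buckets.append(current)
            current, current_hours = [], 0.0
        current.append(path)
        current_hours += hours
    if current:
        buckets.append(current)  # the final bucket is usually shorter
    return buckets

# Example usage over a hypothetical audio/ folder of mp3 files.
files = sorted(
    os.path.join("audio", name)
    for name in os.listdir("audio")
    if name.endswith(".mp3")
)
for i, bucket in enumerate(batch_into_buckets(files), start=1):
    print(f"bucket {i}: {len(bucket)} files")
```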

Comments

This isn't specifically AI alignment-related, but I found this playlist on defending utilitarian ethics. It discusses things like utility monsters and the torture vs. dust specks thought experiment, and is still somewhat relevant to effective altruism.

Saving for potential future use. Thanks!
