TLDR: In collaboration with CEA, an external collaborator is trying to determine the value of various AI alignment resources. If you are interested in or currently working on AI alignment, please fill out this survey. We expect it to take 5-15 minutes. Thanks!


There are many resources available for learning about AI alignment, and for field builders and newcomers it can be difficult to know which readings to prioritize. To address this, an external collaborator, in collaboration with the Centre for Effective Altruism (CEA), has created a survey to collect feedback on the resources you've engaged with.

We hope this survey will:

  1. Identify which materials have previously helped people get into alignment. We can then promote those materials broadly or target them to particularly relevant groups (e.g. students).
  2. Identify gaps where more materials are needed.
  3. Provide feedback to the creators of these materials, encouraging them to make more or experiment with new formats!

If you are interested in or currently working on AI alignment, we would be very grateful for your input. We expect the survey to take around 5-15 minutes.

Once you’ve filled it out, please consider sharing this survey with others in your network.

We appreciate your time. Thanks!

