The AI Alignment Forum is a forum for discussing technical research on AI alignment. It superseded the Agent Foundations Forum, which was established around 2015 (LaVictoire 2015).

A beta version of the site, then named simply the Alignment Forum, was announced on 10 July 2018 (Arnold 2018). The site was officially launched under its current name on 29 October 2018 (Habryka et al. 2018). The authors describe its purpose as follows:

Our first priority is obviously to avert catastrophic outcomes from unaligned Artificial Intelligence. We think the best way to achieve this at the margin is to build an online-hub for AI Alignment research, which both allows the existing top researchers in the field to talk about cutting-edge ideas and approaches, as well as the onboarding of new researchers and contributors.
