Parent Topic: AI safety

The AI alignment tag is used for posts that discuss aligning AI systems with human interests, and for meta-discussion about whether that goal is worthwhile, achievable, and so on.

Evaluation

80,000 Hours rates AI alignment a "highest priority area": a problem at the top of their ranking of global issues assessed by importance, tractability, and neglectedness (80,000 Hours 2021).

...
