AI evaluations and standards

Parent Topic: AI safety

AI evaluations and standards (or "evals") are processes that check or audit AI models. Evaluations can focus on how powerful models are ("capability evaluations") or on whether models are exhibiting dangerous behaviors or are misaligned ("alignment evaluations" or "safety evaluations"). Working on AI evaluations might involve developing standards and enforcing compliance with them. Evaluations can help labs determine whether it is safe to deploy new models, and can inform AI governance and regulation.

...


Posts tagged AI evaluations and standards
