Many AI labs have called for the democratization of AI. In a recent GovAI blog post, Elizabeth Seger summarizes four different ways of interpreting the phrase:

  • Democratizing AI use: Making it easier for people to use AI technologies
  • Democratizing AI development: Making it easier for a wider range of people to contribute to the development and design of AI systems
  • Democratizing AI benefits: Ensuring that the benefits of AI systems are widespread
  • Democratizing AI governance: Ensuring that decisions involving AI systems are informed by a variety of stakeholders and reflect democratic processes

Things I like about the post

  • “Democratizing AI” is a vague phrase, and the post usefully distinguishes between various ideas that the term can refer to.
  • The framework can help us distinguish between forms of democratization that are relatively safe and those that carry more risks.
    • Example: Democratizing AI benefits seems robustly good, whereas democratizing AI use carries risks (given that AI can be dual-use).
  • The framework could allow AI developers to maintain their commitment to (some forms of) democratizing AI while acknowledging that some forms carry risks.  
  • The post analyzes decisions by AI labs through the lens of the framework. Example:

In declaring the company’s AI models will be made open source, Stability AI created a situation in which a single tech company made a major AI governance decision: the decision that a dual-use AI system should be made freely accessible to all. (Stable diffusion is considered a “dual-use technology” because it has both beneficial and damaging applications. It can be used to create beautiful art or easily modified, for instance, to create fake and damaging images of real people.) It is not clear, in the end, that Stability AI’s decision to open source was actually a step forward for the democratisation of AI governance.

  • The post is short! (About 5 minutes)

Read the full post here