At the Digital Platforms policy brief press conference on Monday, UN Secretary-General António Guterres started his speech with:

Distinguished members of our press corps. New technology is moving at warp speed, and so are the threats that come with it. Alarm bells over the latest form of Artificial Intelligence - generative AI - are deafening. And they are loudest from the developers who designed it. These scientists and experts have called on the world to act, declaring AI an existential threat to humanity on a par with the risk of nuclear war[1]. We must take those warnings seriously. Our proposals - the Global Digital Compact, the New Agenda for Peace, and the Accord on the Global Governance of AI - will offer multilateral solutions based on human rights[2].

(Video here.)

Guterres went on to discuss current damage from digital technology ("but the advent of generative AI must not distract us from the damage digital technology is already doing to our world").

The opening mention of existential threat from AI is a very welcome development for the prospects of global coordination on the issue.

  1. It seems likely that the CAIS Statement on AI Risk - "Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war." - was instrumental in prompting this, given the mention of nuclear war.

  2. In terms of extinction risk, remember that the right to life is first and foremost!

Comments

Hopefully everyone who thinks that AI is the most pressing issue takes the time to write (or collaborate on writing) their best solution in 2,000 words and submit it to the UN's recent consultation call: https://dig.watch/updates/invitation-for-paper-submissions-on-worldwide-ai-governance This is a chance to put AI in the same global governance basket as biological and nuclear weapons, and potentially high leverage from a relatively small task (deadline 30 September).
