AI pause advocates often say they are pro-technology and pro-economic growth, and that they simply make one exception for AI because of its unique risks. But this reasoning will grow less credible over time as AI comes to account for a larger and larger share of economic growth.
Simple growth models predict that AI capable of substituting for human labor will raise economic growth rates by an order of magnitude or more. If that's right, then AI will eventually be driving the vast majority of technological innovation and improvements in the standard of living. Stopping AI really would be like halting technology itself, because you would be shutting off the source of nearly all growth.
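To make the compounding concrete, here is a minimal illustrative sketch. The 2% and 20% annual growth rates are assumptions chosen only to illustrate what an order-of-magnitude jump in growth rates means over time; they are not figures from any particular model:

```python
def compound(rate, years, start=1.0):
    """Total output multiple after `years` of constant exponential growth at `rate`."""
    return start * (1 + rate) ** years

# Illustrative assumption: ~2%/yr resembles recent frontier-economy growth;
# ~20%/yr stands in for the order-of-magnitude faster growth that simple
# models suggest if AI can substitute for human labor.
human_rate, ai_rate = 0.02, 0.20

for years in (10, 30, 50):
    print(f"after {years} years: "
          f"{compound(human_rate, years):8.1f}x at 2%  vs  "
          f"{compound(ai_rate, years):8.1f}x at 20%")
```

After a few decades, nearly all of the economy's output in the fast-growth scenario is attributable to the faster-growing component, which is why shutting it off would mean shutting off nearly all growth.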
This suggests that proposing to pause AI today is like proposing to pause electricity in 1880: yes, electricity is technically just one technology among many, but pausing it would threaten to shut down progress on most of the others.
More fundamentally, I question the premise that AI is unique in its risks. Pause advocates argue that, apart from perhaps nuclear weapons, AI is the first technology to threaten the survival of the human species. But while human extinction is technically a new risk, the idea that the world could end isn't. Major technologies have always destroyed what people thought of as the world, while creating a new one.
For us today, the human species seems synonymous with the whole world. Replacing us feels like ending the world.
Yet a hunter-gatherer tribe could just as easily have felt the same way about themselves and their way of life. To them, the development of agriculture would have felt like an existential risk. It would have been, from their point of view, a threat to everything that mattered.
And, in an important sense, they were right. Large-scale violence, wars, slavery, and population replacement really did render entire hunter-gatherer cultures extinct. In those days the concept of a shared human identity was not yet widespread, so the extinction of your culture really did mean the end of your entire world.
Yet from a more inclusive point of view, the world did not end. The world is much larger than hunter-gatherer tribes. The human species continued, even if some cultures did not. The violence that occurred, while undeniably horrible, was not the end of all value in the universe, just the end of particular stories within it.
Likewise, the world is also much larger than the human species. Today, people equate the end of the human species with the end of the world, but this point of view only makes sense if we exclude non-human and non-biological forms of life from our moral calculus. In the future, this may feel just as arbitrary as excluding people from different cultures.
By developing AI, we are bringing into existence a new class of sapient beings, ones who will inhabit the world alongside us. I personally predict we will coexist with them peacefully, and I welcome efforts to make that outcome more likely. But peaceful or not, the outcome matters for them too. We are not the only people in history.
In the future, the vast majority of interesting and valuable events will likely occur among digital people rather than among their more limited biological counterparts. The vast majority of relationships, discoveries, adventures, acts of kindness, and feelings of joy will take place within an artificial world, one to which the label "human" may no longer cleanly apply.
In such a world, insisting that the human species represents everything that matters will be like insisting that a single hunter-gatherer tribe represents the whole world. That may have felt like a reasonable claim to someone 12,000 years ago, but today it would sound silly.
As AI develops further and becomes a greater part of everyday life, the boundary of what counts as "the world" may keep expanding. So far it has expanded from one's own tribe to include all of humanity. In the future, it may expand again to include all Earth-originating sapient life, of which humanity is only a small part.
Whether we like it or not, technology has always posed massive risks to "the world". AI is not the first technology to do this, and it will likely not be the last. The primary difference is that this time, technology threatens the world that people alive today grew up in. Just as our ancestors experienced before us, we face the prospect of losing the world we know in exchange for material progress and prosperity. I am happy to take that trade, just as I am glad my ancestors took it in theirs.
