Bostrom’s new book is out today in hardcover and Kindle in the USA, and on Kindle in the UK.


A greyhound catching the mechanical lure—what would he actually do with it? Has he given this any thought?

Bostrom’s previous book, Superintelligence: Paths, Dangers, Strategies changed the global conversation on AI and became a New York Times bestseller. It focused on what might happen if AI development goes wrong. But what if things go right?

Suppose that we develop superintelligence safely, govern it well, and make good use of the cornucopian wealth and near-magical technological powers that this technology can unlock. If this transition to the machine intelligence era goes well, human labor becomes obsolete. We would thus enter a condition of "post-instrumentality", in which our efforts are not needed for any practical purpose. Furthermore, at technological maturity, human nature becomes entirely malleable.

Here we confront a challenge that is not technological but philosophical and spiritual. In such a solved world, what is the point of human existence? What gives meaning to life? What do we do all day?

Deep Utopia shines new light on these old questions, and gives us glimpses of a different kind of existence, which might be ours in the future.

Links to purchase:

Deep Utopia by Nick Bostrom

There’s a table of contents on the book’s web page.




Anyone know if there'll be an audiobook?

Audiobook version is "in the works", coming "probably in a few months": https://youtu.be/KOHO_MKUjhg?feature=shared&t=2997

I'm wondering what Nick Bostrom's p(doom) currently is, given the subject of this book. Nine years ago, in his lecture on Superintelligence, he said there was "less than 50% risk of doom". In this interview four months ago, he said it's good that there has been more focus on risks recently, though still slightly less focus than is optimal, but that he wants to concentrate on the upsides because he fears we might "overshoot" and not build AGI at all, which would be tragic in his opinion. So it seems he thinks the risk is lower than it used to be because of this public awareness of the risks.

For anyone for whom it’s unavailable, Closer to Truth just released a podcast with him talking about it.

Looks like the UK hardcover release isn't until 21 May, but it's available on Kindle? Is that right? 

I think so. I'll put a note about this at the top of the post.
