Dystopia

In Toby Ord's typology, unrecoverable dystopias constitute one of the three main types of existential catastrophe.[1]

Ord further subdivides unrecoverable dystopias into three types. A dystopia may be desired by either none, some, or most or all of the actors living under it. Ord calls these undesired dystopias, enforced dystopias, and desired dystopias, respectively.[1]

Enforced dystopias are the most familiar type of dystopia. In fiction, they are most prominently represented by George Orwell's Nineteen Eighty-Four: "If you want a picture of the future, imagine a boot stamping on a human face—for ever."[2] Outside of fiction, North Korea arguably offers an example of a stable local dystopia.[3][4] Fictional and real enforced dystopias often assume the form of robust totalitarianism, though this need not be so.

Superficially, undesired dystopias may appear unlikely. If no one desires a world, why should we expect it to exist? The answer relates to the mismatch that can sometimes occur between individual and collective preferences: what is rational for each person may be irrational for all people. It may be best for each individual to consume resources without restraint, regardless of what the other individuals do; but if everyone acts in this manner, the result may be resource depletion, which is worse for everyone than an alternative in which everyone moderates their consumption. Scott Alexander offers a toy example of a possible undesired dystopia.[5] Imagine a society governed by two simple rules: first, every person must spend eight hours a day giving themselves strong electric shocks; second, if anyone fails to follow either rule, everyone must unite to kill this person. The result is a world in which everyone gives themselves electric shocks, since they know they will be killed otherwise. As Alexander summarizes, "Every single citizen hates the system, but for lack of a good coordination mechanism it endures."[5]

Just as one may wonder why undesired dystopias would exist, one may wonder why desired dystopias would be dystopian. Here a relevant example has been provided by Nick Bostrom.[6][7] Mass outsourcing to either digital uploads or AI agents could eventually result in a world entirely devoid of phenomenal consciousness. This could happen if it turned out that conscious states could not be instantiated in silico. It could also happen if, in this radically new environment, consciousness was selected against due to strong evolutionary pressures. It may, for instance, be more computationally efficient to represent an agent's utility function explicitly rather than to rely on a hedonic reward architecture. On a wide range of theories, wellbeing requires consciousness (although it may not reduce to consciousness), so such a world would be devoid of moral patients, no matter how thriving it may appear to outside observers or how much the world's inhabitants may insist that they are conscious or that their lives are worth living. Bostrom describes an imagined "technologically highly advanced society, containing many sorts of complex structures, some of which are much smarter and more intricate than anything that exists today, in which there would nevertheless be a complete absence of any type of being whose welfare has moral significance. In a sense, this would be an uninhabited society. All the kinds of being that we care even remotely about would have vanished."[6] Aspects of this possible dystopian future may actually be observed today in the lives of some non-human animals bred for human consumption.[8]

Since the concept of a dystopia is defined in terms of the value absent from the world so characterized, whether something is or is not a dystopia may vary depending on the moral theory under consideration. On classical utilitarianism, for example, there is an enormous difference in value between worlds optimized for positive experience and a seemingly desirable world where everyone enjoys the quality of life of the most privileged citizens of today's most prosperous nations. The permanent entrenchment of the latter type of world may thus, on that theory, count as a dystopia, in the sense that most attainable value would have failed to be realized. Conversely, although Bostrom's "unconscious outsourcers" dystopian scenario would be catastrophic on many plausible moral theories, it may not be so from the perspective of suffering-focused ethics.

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, pp. 153–158.

  1. ^

    Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, fig. 5.2.

  2. ^

    Orwell, George (1949) Nineteen Eighty-Four: A Novel, London: Secker & Warburg, ch. 3.

  3. ^

    Drescher, Denis (2017) Cause area: Human rights in North Korea, Effective Altruism Forum, November 20.

  4. ^

    Drescher, Denis (2020) Self-study directions 2020, Impartial Priorities, June 27.

  5. ^

    Alexander, Scott (2014) Meditations on Moloch, Slate Star Codex, July 30.

  6. ^

    Bostrom, Nick (2004) The future of human evolution, in Charles Tandy (ed.) Death and Anti-Death: Two Hundred Years after Kant, Fifty Years after Turing, vol. 2, Palo Alto, California: Ria University Press, pp. 339–371.

  7. ^

    Bostrom, Nick (2014) Superintelligence: Paths, Dangers, Strategies, Oxford: Oxford University Press, pp. 172–173.

  8. ^

    Liu, Yuxi (2019) Evolution “failure mode”: chickens, LessWrong, April 26.
