Astronomical waste is the loss of potential value resulting from delaying the efficient exploitation of the universe's resources. The term, and the concept it expresses, were introduced by Nick Bostrom in a seminal paper (Bostrom 2003).

The accessible universe is vast, and virtually all of it remains unexploited. The Virgo Supercluster contains 10^13 stars, and the energy of each star could power 10^42 computations per second. The human brain can perform about 10^17 computations per second. Assuming that the morally relevant properties of the brain—such as phenomenal consciousness—supervene on its functional organization, it follows that the universe could support, every second, an amount of value equivalent to that realized in 10^13 × 10^42 ÷ 10^17 = 10^38 human lives. The moral costs of failing to actualize this potential thus appear to be enormous.
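As a back-of-the-envelope illustration of the arithmetic above (a minimal sketch using the rough figures quoted in the preceding paragraph for the star count, per-star computing power, and per-brain computing power; none of these are precise measurements):

```python
# Rough restatement of the order-of-magnitude estimate quoted above.
stars_in_virgo_supercluster = 1e13   # approximate number of stars
ops_per_second_per_star = 1e42       # computations per second powered by one star
ops_per_second_per_brain = 1e17      # computations per second of a human brain

total_ops_per_second = stars_in_virgo_supercluster * ops_per_second_per_star
brain_equivalents_per_second = total_ops_per_second / ops_per_second_per_brain

print(f"{total_ops_per_second:.0e} computations per second")                     # ~1e+55
print(f"{brain_equivalents_per_second:.0e} human-brain-equivalents per second")  # ~1e+38
```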
In relative terms, however, the costs of delay may be quite modest. The cosmos has existed for about 10 billion years, so one should not antecedently expect cosmological processes to cause value to decay by more than roughly 1 part in 10 billion per year. The observational evidence appears to be consistent with this prior assessment: the finitude, expansion, and burndown of the universe all seem to be occurring at a slow enough rate to be in line with the estimate based on the duration of the universe so far (Christiano 2013).
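To make "quite modest" concrete, the following sketch assumes, per the paragraph above, a constant loss of roughly one part in ten billion of attainable value per year of delay (an illustrative figure taken from the text, not a measured rate):

```python
# Fraction of attainable value remaining after `years` of delay,
# assuming a constant loss of ~1 part in 10 billion per year.
def value_remaining(years, annual_loss=1e-10):
    return (1 - annual_loss) ** years

for years in (1, 10, 1_000, 100_000):
    print(f"{years:>7} years of delay -> fraction remaining ≈ {value_remaining(years):.10f}")
# Even 100,000 years of delay forfeits only about one part in 100,000 of the total.
```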
If the opportunity costs of delaying the exploitation of the universe's resources are so low in relative terms, however large they may be in absolute terms, they are dwarfed by the costs arising from exposure to existential risk. Over the next decade, perhaps a billionth of the total attainable value will be lost as a result of failing to arrange the universe optimally. Over that same decade, perhaps a thousandth of this value will be lost in expectation from exposure to a 0.1 percent risk of an existential catastrophe. The costs of existential risk exposure thus appear to exceed the opportunity costs of delayed expansion by several orders of magnitude.
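A rough sketch of that comparison, normalizing total attainable value to 1 and using the article's own illustrative figures (a delay cost of about one part in ten billion per year and a 0.1 percent chance of existential catastrophe over the decade):

```python
# Decade-scale comparison of delay costs versus expected existential-risk costs,
# using the illustrative round numbers from the text (total attainable value = 1).
annual_delay_loss = 1e-10                      # ~1 part in 10 billion per year
years = 10
delay_loss = annual_delay_loss * years         # ~1e-09: "a billionth"

catastrophe_probability = 1e-3                 # 0.1 percent over the decade
expected_risk_loss = catastrophe_probability * 1.0   # a catastrophe forfeits ~all value

print(f"loss from a decade of delay         ≈ {delay_loss:.0e}")
print(f"expected loss from existential risk ≈ {expected_risk_loss:.0e}")
print(f"ratio ≈ {expected_risk_loss / delay_loss:.0e}")   # ~1e+06: six orders of magnitude
```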
Thus, although an altruist who first notices the astronomical costs of delayed technological development may be tempted to conclude that such development should be hastened, that conclusion does not survive careful reflection. Because the bulk of existential risk comes from anthropogenic risks posed by new technologies, accelerating the development of new technology will itself have major effects on existential risk. Those effects will dwarf any gains from reducing astronomical waste, and should therefore be the primary consideration in decision-making.
A note about terminology

Astronomical waste is often cited as a consideration in favor of longtermism. When authors invoke "astronomical waste" in these contexts, however, what they typically mean is not the cost of delayed expansion but the cost of failed (or flawed) expansion. Thus, Carl Shulman mentions "the expected Astronomical Waste if humanity were rendered extinct by a sudden asteroid impact" (Shulman 2012). Similarly, linking to Bostrom's paper, Gwern Branwen writes that "human extinction represents the loss of literally astronomical amounts of utility" (Branwen 2020). And Siebe Rozendal writes that "Extinction would be an ‘astronomical waste’" (Rozendal 2019; see also Dai 2014, Lewis 2018, and Kristoffersson 2020). The expression astronomical stakes (Bostrom 2015; Wiblin 2016) may be used to express this idea, reserving astronomical waste for the opportunity costs of delayed technological development.
Related entries

differential progress | ethics of existential risk reduction | space colonization | speeding up development
Bibliography

Bostrom, Nick (2003) Astronomical waste: the opportunity cost of delayed technological development, Utilitas, vol. 15, pp. 308–314.
Bostrom, Nick (2015) Astronomical stakes, Effective Altruism Global, November 25.
Branwen, Gwern (2020) Optimal existential risk reduction investment, Gwern.net, May 28.
Christiano, Paul (2013) Astronomical waste, Rational Altruist, April 30.
Dai, Wei (2014) Is the potential astronomical waste in our universe too small to care about?, LessWrong, October 21.
Kristoffersson, David (2020) The ‘far future’ is not just the far future, Effective Altruism Forum, January 16.
Lewis, Gregory (2018) The person-affecting value of existential risk reduction, Effective Altruism Forum, April 13.
Rozendal, Siebe (2019) Eight high-level uncertainties about global catastrophic and existential risk, Effective Altruism Forum, November 28.
Shulman, Carl (2012) Are pain and pleasure equally energy-efficient?, Reflective Disequilibrium, March 24.
Wiblin, Robert (2016) Making sense of long-term indirect effects, Effective Altruism, August 7.