
Summary

  • Nick Bostrom's argument is that, in order to maximize the length of the cosmic endowment, we need to minimize existential risks, instead of speeding up expansion.
  • The argument above assumes that "Maximizing the length of the cosmic endowment" is almost equivalent to "Maximizing the cosmic endowment".
  • My argument (Cosmic's Mugger, or CM) is that there are cases where these two sentences come completely apart. I'll describe a theoretical case where we would accept a 49.99% chance of doom just to speed up expansion by one second: a case where the quantity of good achievable grows exponentially with the length of the cosmic endowment (in this precise case, the quantity of good achievable with t seconds of cosmic endowment would be 2^t).
  • I'll present counter-arguments saying that computation has a physical limit.
  • I'll then take an example where the quantity of good achievable doesn't depend on the number of computations, making the previous point less useful.
  • Finally, I'll ask whether CM is a Pascal's mugging.

 

Nick Bostrom's argument

Firstly, let's measure the quantity of computronium we could tile the universe with. I'm assuming that computronium will expand spherically at the speed of light, although the counterfactual could matter a lot. The volume of computronium is a function V which takes as input the time in seconds the cosmic endowment lasts. Measuring the radius in light-seconds (so a sphere expanding at light speed for t seconds has radius t), the formula for the volume of a sphere gives:

$$V(t) = \frac{4}{3}\pi t^{3}$$

Next, we want to calculate the probability of doom p(t) we should be willing to accept in exchange for increasing t by one (accepting the risk gives 0 volume with credence p(t), and V(t+1) volume with credence 1 − p(t); refusing it gives V(t) volume with 100% certainty). Setting the two expected volumes equal gives the break-even probability:

$$(1 - p(t))\,V(t+1) = V(t) \quad\Longrightarrow\quad p(t) = 1 - \frac{V(t)}{V(t+1)} = 1 - \left(\frac{t}{t+1}\right)^{3}$$


We can see that p(t) tends towards 0: the bigger we believe the sphere will be, the smaller the risk we should accept in order to make its radius one light-second bigger. In other words, this model says that if you think the future is wild, you should act safely instead of speeding up cosmic expansion.
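Here is a minimal sketch of this calculation in Python (the function names and sample values are mine, not from any particular source):

```python
from math import pi

def V(t):
    # Volume of a light-speed sphere after t seconds, radius in light-seconds.
    return (4 / 3) * pi * t**3

def break_even_doom_probability(value, t):
    # Largest doom probability worth accepting to extend the cosmic
    # endowment from t to t + 1 seconds, given a value function `value`:
    # (1 - p) * value(t + 1) = value(t)  =>  p = 1 - value(t) / value(t + 1)
    return 1 - value(t) / value(t + 1)

for t in [1, 10, 1_000, 1_000_000]:
    print(t, break_even_doom_probability(V, t))
# Output (approximately): 0.875, 0.249, 0.0030, 0.0000030
# The acceptable risk shrinks towards 0 as t grows.
```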

However, V(t) doesn't represent the quantity of good achievable, just the volume of computronium.

 

Cosmic's Mugger

Imagine G(t), which takes as input the number of seconds the cosmic endowment lasts, and returns the quantity of good done in that time. Do we get the same conclusion? It depends on the model.

Let's imagine that G(t) = 2^t, which means that the quantity of good achievable doubles with each extra second of cosmic endowment (i.e., each second earlier we start the expansion).

In that case, however big t is, we should accept a 49.99% probability of doom to get G(t+1) with probability 50.01% rather than G(t) with certainty, since 0.5001 × 2^(t+1) > 2^t. In other words, if you think the amount of good achievable gets halved by delaying cosmic expansion by one second, you should accept a 49.99% probability of doom in order to accelerate cosmic expansion by one second.
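Plugging this G(t) into the sketch above, the break-even risk no longer shrinks with t; it stays pinned at one half:

```python
def G(t):
    # Hypothetical value function where each extra second doubles the good.
    return 2**t

for t in [1, 10, 100]:
    print(t, break_even_doom_probability(G, t))
# Prints 0.5 for every t, since 1 - 2**t / 2**(t + 1) = 0.5
```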

 

If G(t) equals the number of computations possible

Imagine C(t), returning the number of computations possible in a cosmic endowment lasting t seconds.

If you think G(t) = C(t), and that C(t) = V(t), then G(t) doesn't grow exponentially. But what if C(t) grows faster than V(t)?

I don't know anything about quantum mechanics, but if quantum mechanics worked in such a way that each additional qubit doubled the total amount of computation, then the number of computations would double at least every millisecond (actually, it would be way faster than that).
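To see how explosive that would be, here is a toy model reusing V from the sketch above and assuming, purely for illustration and with no physical justification, one qubit per cubic light-second, so that C(t) = 2^V(t):

```python
# Under C(t) = 2**V(t), the number of computations doubles
# V(t + 1) - V(t) times during the (t + 1)-th second:
for t in [10, 100]:
    doublings = V(t + 1) - V(t)
    print(t, round(doublings))
# 10  -> ~1386 doublings in a single second
# 100 -> ~126924 doublings in a single second
```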

But I don't think this is possible. I managed to find three physical reasons for this (a rough back-of-the-envelope check follows the list). Note that I don't know anything about physics, so take what follows with a grain of salt. (By the way, thanks Wikipedia.)

  • The Margolus–Levitin theorem: It "gives a fundamental limit on quantum computation (strictly speaking on all forms of computation). The processing rate cannot be higher than 6 × 10^33 operations per second per joule of energy".
  • Landauer's principle : "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase [...]". I'm not sure if that counter-argument works, since:
    • "A so-called logically reversible computation, in which no information is erased, may in principle be carried out without releasing any heat".
    • "while information erasure requires an increase in entropy, this increase could theoretically occur at no energy cost."
    • Also, it seems there is disagreement about whether or not this principle is valid.
  • Bremermann's limit: a "limit on the maximum rate of computation that can be achieved in a self-contained system in the material universe", approximately 1.36 × 10^50 bits per second per kilogram.
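
As a rough sanity check on what these limits imply, here is a back-of-the-envelope calculation. The Bremermann figure is the one quoted above; the mass of the observable universe (~10^53 kg) is my own added assumption:

```python
BREMERMANN = 1.36e50       # bits per second per kilogram (quoted above)
MASS_UNIVERSE = 1e53       # kg, rough order of magnitude (assumption)

# Even computing at Bremermann's limit with all the mass of the
# observable universe, the computation *rate* is bounded:
print(BREMERMANN * MASS_UNIVERSE)   # ~1.4e103 bits per second
```

The point is that under these limits, total computation scales with available mass and time, so C(t) should grow roughly like V(t), polynomially rather than exponentially.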

 

If G(t) doesn't depend on the number of computations possible

What if the true morality were "Maximize this counter"? In this case, G(t) would grow exponentially, since a few extra atoms could add one binary digit to the counter, and each extra digit doubles the largest number the counter can hold.
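A trivial illustration of why each extra binary digit doubles the achievable "good" under this hypothetical morality:

```python
# The largest value an n-bit counter can hold is 2**n - 1,
# so each added bit roughly doubles the maximum "good":
for n in [8, 9, 10]:
    print(n, 2**n - 1)   # 255, 511, 1023
```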

Quantifying the probability that G(t) grows exponentially over time seems really hard, but we would all agree that it is not zero. And if G(t) really does grow that fast, and we decide to speed up cosmic expansion, then the future will be way, way, way bigger than if we had tried to prevent existential risks instead. So, unless you put a really, really, really small probability on CM being true, you should act according to CM.

 

Is this a Pascal's mugging?

Humans are bad at giving probabilities really close to 0 or 1. Therefore, unless we are very sure of our probability (like the probability of doom from an asteroid, where we can model its orbit), probabilities really close to 0 shouldn't be taken into account.

This is probably what causes Pascal's mugging: we constantly overestimate the probability of very unlikely events, so we can be mugged really easily if we try to act according to expected value.

But is the probability of CM really small enough for it to count as a Pascal's mugging? I don't know. That's why I'm writing this post. Do you think it is a Pascal's mugging?

 

A Pascal's mugging against the Cosmic's Mugger

To get a clue about the answer, we can try to create a Pascal's mugging that seems as likely as the Cosmic's Mugger.

Let's imagine that G(t) grows exponentially.

If time travel is possible, then the best strategy would be to minimize existential risks, in order to maximize the odds that we survive long enough to travel back and start the cosmic expansion at the beginning of the universe.

Since that cosmic endowment would be way, way, way bigger than if time travel weren't possible, and since time travel is a possibility, we should minimize existential risks. Can you spot the mugging? Do you think the probability of "time travel is possible" is smaller than the probability of "G(t) grows exponentially"?


Comments

Hi Lysandre,

I really enjoyed the post!

Nick Bostrom's argument is that, in order to maximize the length of the cosmic endowment, we need to minimize existential risks, instead of speeding up expansion.

Another way of arguing against this is to claim that we do not know whether the future is positive or negative, so making it larger has unclear effects.

If G(t) doesn't depend on the number of computations possible

Personally, I think the probability of this being true is sufficiently low to be negligible.