Hi there!
I'm looking at one of Bostrom's papers (Existential Risk Prevention as Global Priority, p. 19). He includes an expected-value calculation that I just can't make sense of:
"Even if we give this allegedly lower bound on the cumulative output potential of a technologically mature civilisation [he's referring to his estimate of 10^52 future lives here] a mere 1 per cent chance of being correct, we find that the expected value of reducing existential risk by a mere one billionth of one billionth of one percentage point is worth a hundred billion times as much as a billion human lives."
When I try to reproduce his calculation, I reason as follows: reducing the risk of losing 10^50 expected lives by 10^-20 is the same as increasing the probability of getting those lives by 10^-20, so the expected value of the reduction should be 10^50 × 10^-20 = 10^30 lives. However, his figure of "a hundred billion times as much as a billion human lives" works out to only 10^20. It's a fairly trivial calculation, so I assume I've overlooked something obvious; my full arithmetic is written out below. Can you help me see what I'm missing?
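For reference, here is the whole calculation spelled out, taking "billion" in the modern short-scale sense of 10^9:

$$
\underbrace{10^{52} \times 10^{-2}}_{\text{lives at 1\% credence}} \times \underbrace{10^{-9} \times 10^{-9} \times 10^{-2}}_{\text{billionth of a billionth of a percentage point}} = 10^{50} \times 10^{-20} = 10^{30},
$$

whereas "a hundred billion times as much as a billion human lives" is $10^{2} \times 10^{9} \times 10^{9} = 10^{20}$, a discrepancy of ten orders of magnitude.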
Not an excuse, but maybe Bostrom was using the old British (long-scale) definition of "billion", i.e. 10^12, rather than the short-scale 10^9 of American and modern British usage?
Yeah, I've had the same thought, but as far as I can tell it still doesn't add up (see the check below), so I figured there must be something else going on. Thanks for your reply, though.
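To make that concrete: with the long-scale billion, 10^12, the risk reduction becomes $10^{-12} \times 10^{-12} \times 10^{-2} = 10^{-26}$, and (assuming the long scale is applied consistently on both sides of his claim) "a hundred billion times as much as a billion human lives" becomes $10^{2} \times 10^{12} \times 10^{12} = 10^{26}$ lives. Then

$$
10^{50} \times 10^{-26} = 10^{24} \neq 10^{26},
$$

so the two figures now disagree by a factor of $10^{2}$ in the opposite direction.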