Searching for life on Mars @ Imperial College London
Lead of the Space Generation Advisory Council's Cosmic Futures project.
Interested in: Space Governance, Great Power Conflict, Existential Risk, Cosmic threats, Academia, International policy
Chilling the f*** out is the path to utopia
If you'd like to chat about space governance or existential risk please book a meeting!
Wanna know about space governance? Then book a meeting!! - I'll get an email and you'll make me smile because I love talking about space governance :D
Hi Joseph, glad you found the post interesting :)
Yeah, for "the way forward" section I explicitly assume that alien civilisations have not already developed. This might be wrong, I don't know. One possible argument in line with my reasoning around galactic x-risks is that aliens don't exist because of the anthropic principle: if they had already emerged, then we would have been killed a long time ago, so the fact that we exist implies that no alien civilisations have emerged yet. No alien civilisations exist for the same reason that the fine structure constant allows biochemistry.
I'm not sure if any of these galactic level existential risks are tractable in any meaningful way at our current level of development. Maybe we should take things one step at a time?
I totally agree with this statement. I have huge uncertainty about what awaits us in the long-term future (in the post I compared myself to an Ancient Roman trying to predict AI alignment risks). But it seems that, since the universe is currently conducive to life, the unknowns may be more likely to end us than help us. So the main practical suggestion I have is that we take things one step at a time and hold off on interstellar travel (which could plausibly occur in the next few decades) until we know more about galactic x-risks and galactic governance.
It's not necessarily that these galactic-scale considerations will happen soon or are tractable, but that we might begin a series of events (i.e., the creation of self-propagating spacefaring civilisation) that interferes with the best possible strategy for avoiding them in the long-term future. I don't claim to know what that solution is, but I suggest some requirements a governance system may have to meet.
Yeah that's true.
I think 1000 is where I would intuitively start to get very worried, but there are hundreds of millions of potentially habitable planets in the Milky Way, so in theory a galactic civilisation could contain that many colonies if it didn't kill itself before then.
I guess the probability of one of these civilisations initiating an s-risk or galactic x-risk would just increase with the size of the galactic civilisation. So the more that humanity expands throughout the galaxy, the greater the risk.
Yeah sure, it's like the argument that if you put infinitely many chimpanzees in front of typewriters, one of them will write Shakespeare. A galactic civilisation would be very dispersed, and most likely each 'colony' occupying a solar system would govern itself independently. So they could be treated as independent actors sharing the same space, and there might be hundreds of millions of them. In that case, the probability that at least one of those millions of independent actors creates astronomical suffering becomes extremely high, near 100%. I used digital sentience as an example because it's the risk of astronomical suffering that I see as the most terrifying - IF digital sentience is possible, then the number of suffering beings it would be possible to create could conceivably outweigh the value of a galactic civilisation. That 'IF' contains a lot of uncertainty on my part.
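To make the "near 100%" intuition concrete, here's a rough sketch of the underlying arithmetic. The per-colony probability `p` and colony count `n` below are purely illustrative assumptions on my part, not estimates from the post:

```python
# If each of n independent colonies has a small probability p of ever
# initiating an s-risk, the chance that at least one does is:
#   P(at least one) = 1 - (1 - p)**n

def p_at_least_one(p: float, n: int) -> float:
    """Probability that at least one of n independent actors defects."""
    return 1 - (1 - p) ** n

p = 1e-6           # assumed chance a single colony initiates an s-risk
n = 100_000_000    # hundreds of millions of colonised systems

print(p_at_least_one(p, n))      # effectively 1.0 at galactic scale
print(p_at_least_one(p, 1_000))  # ~0.001 for a 1000-colony civilisation
```

Even with a one-in-a-million chance per colony, a hundred million independent colonies push the aggregate risk to near-certainty, while a 1000-colony civilisation stays at roughly 0.1% under the same assumption.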
But this also applies to tyrannical governments: how many of those independent civilisations across a galaxy will become tyrannical and cause great suffering to their inhabitants? How many will terraform other planets and seed biospheres of suffering beings?
The same logic also applies to x-risks that affect a galactic civilisation:
all it takes is one civilization of alien ass-hat griefers who send out just one Von Neumann Probe programmed to replicate, build N-D lasers, and zap any planet showing signs of technological civilization, and the result is a galaxy sterile of interplanetary civilizations until the end of the stelliferous era (at which point, stars able to power an N-D laser will presumably become rare). (Charlie Stross)
Stopping these things from happening seems really hard. It's as if a galactic civilisation would need to be designed correctly from the very beginning to guarantee that no future colony ever does this.
Do you expect to be more of a mentor or a mentee?
I'm very active in space governance and I'm excited to chat about how that crosses over with many other EA cause areas.
Link to my swapcard
Elon Musk recently presented SpaceX's roadmap for establishing a self-sustaining civilisation on Mars (by 2033 lol). Aside from the timeline, I think there may be some important questions to consider with regard to space colonisation and s-risks:
If the answer to question 3 is "Mars colony", then it's possible that creating a colony on Mars is a huge s-risk if we don't first answer question 2.
Would appreciate some thoughts.
Stuart Armstrong and Anders Sandberg’s article on expanding throughout the galaxy rapidly, and Charlie Stross’ blog post about griefers influenced this quick take.
Hey! I'm requesting some help with "Actions for Impact", a Notion page of activities people can get involved in that take less than 30 minutes and contribute to EA cause areas. This includes signing petitions, emailing MPs, voting for effective charities in competitions, responding to 'calls for evidence', or sharing something online. EA UK has the Notion page linked on their website: https://www.effectivealtruism.uk/get-involved
It should serve as a hub to leverage the size of the EA community when it's needed.
I'm excited about the idea and thought I'd have enough time to keep it updated and share it with organisations and people, but I really don't. If the idea sounds exciting and you have an hour or two spare per week, please DM me - I'd really appreciate a couple of extra hands to get the ball rolling (especially if you're involved in EA community building, as I'm not at all).
I didn't write this post with the intention of criticising the importance of space governance, so I wouldn't go as far as you. I think reframing space governance in the context of how it supports other cause areas reveals how important it really is. But space governance also has its own problems to deal with, so it's not just a tool or a background consideration. Some (pressing) stuff that could be very bad in the 2030s (or earlier) without effective space governance:
Edit: changed the title from "Galactic x-risks: Obstacles to accessing the cosmic Endowment" to "Interstellar travel will probably doom the long-term future". I wanted to highlight the main practical suggestion from the post :)