I have a question for the EA community about the terms 'X-risk' and 'S-risk', and terms for the same kinds of risk that occur with lower stakes.

The front page of the Centre for the Study of Existential Risk states: 'We are dedicated to the study and mitigation of risks that could lead to human extinction or civilisational collapse.' Since 'existential' is defined as 'relating to existence or being alive', I think it logically follows that an existential risk is 'a risk to existence or being alive'. I believe the majority of people would understand the essence of what an existential risk is on hearing the term, without having been provided with a definition.

The Center for Reducing Suffering states, in various places on its website, that a suffering risk or S-risk is a risk that a circumstance will cause an 'astronomical' or 'unimaginable' amount of suffering. I don't think it logically follows from the name 'suffering risk' that this term is referring to an astronomical or unimaginable amount of suffering, as opposed to some undefined amount of suffering. The word 'suffering' is not naturally superlative in the way 'existential' is, but it's elevated to this position in the term 'suffering risk'.

If I were to write about a potential future meteor impact on Earth, I could distinguish between 'the expected outcome is widespread destruction' and 'the expected outcome is the extinction of humanity' by using the terms 'catastrophic risk' and 'existential risk' respectively. But if I were to write about the factory farming of aliens, I can't think of a term that describes a risk of 'some undefined amount of suffering that is worth consideration' as opposed to 'an astronomical or unimaginable amount of suffering'.

Is anyone aware of such a term that is currently in use? I'd really like to hear about it or be directed to resources about it or other instances of its use. Alternatively, if you think the question is stupid or there is something wrong with my definitions or interpretation, that would also be really useful information for me and I hope you will comment either way :)


It's a perfectly good question! I've done research focused on reducing s-risks, and I still don't have a perfectly clear definition for them. 

I generally use the term for suffering that occurs on an astronomical scale and is severe enough to make the value of the future negative. So for the alien factory farming example, I'd probably call it an s-risk once the suffering of the aliens outweighs the positive value from other future beings. If it were significant but didn't rise to that level, I'd call it something like a 'catastrophic suffering risk'. 'Astronomical waste' is also a term that works, though I usually use that for positive things we fail to do, rather than negative things we do.

Overall, I wouldn't worry too much. There isn't standard terminology for 'undefined amount of suffering that deserves consideration', and you should be fine using whatever terms seem best to you as long as you're clear what you mean by them. The demarcation between existential and merely catastrophic risks is important, because there is a sharp discontinuity once a risk becomes so severe that we can never recover from it. There isn't anything like that with s-risks; a risk that falls just under the bar for being an s-risk should be treated the same as a risk that just passes it.

I hope that answered your question! I'd be happy to clarify if any of that was unclear, or if you have further questions.

Thanks so much for this useful reply :) There's something I want to write about which I believe carries a risk of 'some undefined amount of suffering that is worth consideration'. I think I have been wasting time trying to decide if it is an s-risk by the usual definitions rather than just writing about what's happening and then speculating from there. You make a good point that something not quite bad enough to be an s-risk is still pretty bad!

I do think 'catastrophic suffering risk' is an odd one, because it's really not intuitive that a 'catastrophic suffering risk' is less bad than a 'suffering risk'. I guess I just find it weird that something as bad as a genuine s-risk has such a pedestrian name, compared to 'existential risk', which I think is an intuitive and evocative name that gets across the level of badness pretty well.
 

One quick question - when you say an s-risk creates a future with negative value, does that make it worse than an x-risk? As in, the imagined future is SO awful that the extinction of humanity would be preferable?
 

I do think 'catastrophic suffering risk' is an odd one, because it's really not intuitive that a 'catastrophic suffering risk' is less bad than a 'suffering risk'. I guess I just find it weird that something as bad as a genuine s-risk has such a pedestrian name, compared to 'existential risk', which I think is an intuitive and evocative name that gets across the level of badness pretty well.

I think what happens in my head is that 's-risk' denotes a similarity to x-risks while 'catastrophic suffering risk' denotes a similarity to catastrophic risks, making the former feel more severe than the latter, but I agree this is odd.

One quick question - when you say an s-risk creates a future with negative value, does that make it worse than an x-risk? As in, the imagined future is SO awful that the extinction of humanity would be preferable?

Yep, for me that feels like a natural place to put the bar for an s-risk.

Hi, just saw this thread. I'm curious what kinds of mechanisms could, in your opinion, lead to a net-negative world?
