Note: if this video doesn't load, please refresh the page

[Embedded TikTok video by @benthamite: #negativeutilitarianism #effectivealtruism, original sound by Benthamite]


Negative utilitarians believe that we should prioritize the alleviation of suffering over the creation of happiness, claiming, for example, that one unit of suffering is outweighed only by two units of happiness.

Toby Ord argues that this is incoherent because there are no natural units in which to measure happiness and suffering, and therefore it's unclear what it even means to put them on the same scale.

I disagree. Consider taking a certain number of elementary particles and using them to create the happiest brain possible; call that one unit of happiness. Use the same number of particles to create the brain experiencing the most suffering possible; call that one unit of suffering.

Elementary particles therefore let us derive an “exchange rate” between happiness and suffering, and it's coherent to talk about trade-offs between the two.
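The normalization above can be sketched numerically. The function and numbers below are purely illustrative assumptions for showing why a non-1:1 ratio is coherent once "unit" is fixed; they are not a claim about what the correct ratio is:

```python
# Hedged sketch: once a "unit" of happiness and a "unit" of suffering are
# defined by fiat (the most of each realizable with the same number N of
# elementary particles), an exchange rate between them is well-defined.
# The exchange rate and amounts below are illustrative assumptions.

def total_value(happiness_units: float, suffering_units: float,
                exchange_rate: float = 2.0) -> float:
    """Net value when one unit of suffering is outweighed only by
    `exchange_rate` units of happiness (a negative-utilitarian weighting)."""
    return happiness_units - exchange_rate * suffering_units

# With a 2:1 exchange rate, two units of happiness exactly offset one
# unit of suffering:
print(total_value(2, 1))                      # 0.0
# A classical 1:1 weighting calls the same state a net positive:
print(total_value(2, 1, exchange_rate=1.0))   # 1.0
```

The point of the sketch is only that the ratio claim becomes a well-posed arithmetic statement once both units refer to the same particle budget.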





Toby Ord argues that this is incoherent because there are no natural units in which to measure happiness and suffering, and therefore it's unclear what it even means to put them on the same scale.

One problem might be that there are no natural units on which to measure happiness and suffering. Another is that there are too many. If there are a hundred thousand different ways to put happiness and suffering on the same scale and they all differ in the exchange rate they imply, then it seems you've got the same problem. Your example of comparisons in terms of elementary particles feels somewhat arbitrary, which makes me think this may be an issue.

This is a good point; I agree.

If amount of happiness (or suffering) possible is not linear in the number of elementary particles, what number of elementary particles do you suggest using?

I am not sure, and I think this points to a good objection to my suggestion.


What exchange rate could you infer from the happiest N particles and the most-suffering N particles? I feel like a step is missing. Are you assuming they're equal in absolute value?


I consider it an open question how functionally similar suffering and pleasure are, e.g. whether the functions underlying suffering are identical or symmetric to those underlying pleasure, in bijection, even in minds optimized for each.

Thanks for the question!

I'm merely claiming that statements like "one unit of suffering is worth two units of happiness" are coherent because "unit" can be defined in reference to particles. I'm not claiming that any particular ratio is correct, only that it's coherent to talk about ratios other than 1:1.

Note: I couldn't find anyone making this specific response to Toby anywhere, but the basic idea has been around for quite some time, maybe originating in this 2012 blog post from Carl.

I am not familiar with this particular domain, although I know what utilons are, so uh... if this was meant for me, it was not immediately convincing? Or elucidating??

Play by play of my gut reactions: (this is for the sake of imagining what strangers might think, not meant to be taken as serious criticism)

"Negative utilitarians:" okay this is some kinda obscure philosophy thing, isn't it. I'd probably skip it if this were interrupting my fun tiktok videos, but I want to know what an EA tiktok looks like.

"Graph that doesn't illustrate anything I understand immediately" okay, this is about math, hopefully that's all I need to get. 

"1 unit of suffering, 1 unit of happiness" okay, simple, cool, let's figure things out!

"1 unit of suffering per 2 units of happiness" uh kinda weird, but okay. Where does this lead.

"elementary particles" ???  

"elementary particles, happy brain, sad brain, elementary particles, therefore measure" NO. This is dumb and jargon obfuscation of nonsense.

Afterwards: (still stream-of-consciousness reaction) Why does introducing "elementary particles" have anything to do with measuring the amount of happiness and suffering? That doesn't solve anything! Why are we trying to use the same set of particles to create a happy brain and a sad brain? Are you saying that if it takes more particles to make a happy brain then it's worth more sad brains? Isn't it the arrangement that matters? What? What???

I assume there is a lot more to what you are getting at and that you have a very good theory here! But this part didn't capture the vital bit I need to understand the basic concept and why it works. (Or clearly understand what you are driving at.) I would say it needs some rephrasing, maybe the context doesn't matter as much to stating your concept? "Happiness is hard to measure" might be enough?

Thanks! I appreciate the detailed feedback.
