This is a special post for quick takes by Jobst Heitzig (vodle.it).

As I do every month, I let others collectively decide where I will donate. If you want to participate: http://demo.vodle.it

I'd be interested in an overview of everyone's opinion on a thought experiment related to measuring the amount of good one does. I have set up a vodle poll for this: go to poll

Fairness vs Efficiency?

Assume that for $100, charity A can increase one person's subjective wellbeing from 0 to 1, while charity B can increase another person's subjective wellbeing from 8 to 10.

It seems that according to EA, we should donate to B, since B increases total wellbeing by 2 while A increases it by only 1, and B is thus more efficient. But then we help already happy people rather than miserable people, which seems unfair.

Can one value (efficiency) really completely trump another value (fairness), or should there be some compromise or tradeoff, like giving something to A and more to B? Or the other way around?

You are somewhat confusing EA with utilitarianism. Most forms of utilitarianism don't value fairness (although you can have a utilitarian framework that weights less happy people more, if you want), except insofar as it increases utility, and that is what you are identifying here.

EA doesn't explicitly have a utilitarian framework, or any framework (the EA movement is a nebulous object). However, most EAs have moral frameworks similar to utilitarianism, which is why they would give to charity B.

As for whether one value can trump all others: people can do whatever they want within the constraints of what physics and society allow. Whether they should is another question. If it feels more right to you to give some money to charity A, then perhaps you don't have fully utilitarian values. Or maybe upon reflection you will realize you would in fact give to charity B. Either way, the question you are asking is a subjective one (in my opinion; people disagree about the truth of moral anti-realism).

You may also want to look at writings on moral uncertainty, which get at the idea of giving some weight to multiple value systems.

Dear @Charlie_Guthmann, sorry for correcting you, but I'm not confusing EA with utilitarianism at all.

Indeed, I consider myself a utilitarian, and I would give to A rather than B, because in my version of utilitarianism the relationship between subjective wellbeing and utility is nonlinear: I consider an increase in subjective wellbeing from 0 to 1 a larger increase in utility than one from 8 to 10.
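For illustration, here is a minimal numerical sketch of what I mean, assuming a square-root utility curve (an arbitrary concave choice on my part; the exact shape is not the point):

```python
import math

# Map subjective wellbeing to utility with a concave curve.
# sqrt is just one illustrative choice of nonlinearity.
def utility(wellbeing):
    return math.sqrt(wellbeing)

gain_a = utility(1) - utility(0)    # charity A: 0 -> 1, gain = 1.00
gain_b = utility(10) - utility(8)   # charity B: 8 -> 10, gain ~= 0.33

print(f"A: +{gain_a:.2f} utility")  # A: +1.00 utility
print(f"B: +{gain_b:.2f} utility")  # B: +0.33 utility
```

Measured in raw wellbeing, B wins (+2 vs. +1); under a sufficiently concave utility curve like this one, the ranking flips and A wins.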

My trouble is that a recent talk at the EAGxVirtual conference by someone from the Happier Lives Institute seemed to suggest that one should measure the amount of good done by adding up increases in subjective wellbeing (rather than utility), and that this would lead us to give to B, which to me seems a little unfair.

Don't feel sorry about correcting me; I appreciate the discourse. However, I still disagree with your original comment. You originally said "according to EA…". EA doesn't prescribe whether you should have a linear/log/etc. utility function with respect to subjective wellbeing. EA is not a monolith. That's simply what someone at the Happier Lives Institute seems to believe, or implied without realizing what they were implying. There are definitely people in the community who share your views.

I’ll agree my original statement was lacking nuance.

I'm receptive to the idea that most of the people in EA do xyz, or that EA, insofar as it is made up of certain institutions, seems to act or speak as if its values are xyz. But the phrasing "according to EA" gives me the sense that you think there is some normative Bible of EA, which for better or worse there isn't. If you meant one of the above statements, then I'm in agreement. Large swaths of people who identify as EAs say things that basically equate a very simple/specific form of utilitarianism with doing good. This troubles me as well.

Thanks for this clarification. I was sloppy originally when saying "it seems that according to EA, ..." instead of saying "it seems that according to prominent members of the EA community, ...".

So do you know of any statistics on what the values of people who identify with the EA community actually are, and what they would say to such questions as mine?

I looked briefly but wasn't able to find anything. There is also the Demographic Survey, but its level of specificity regarding moral views is not going to help you much.

There is the EA Polls Facebook group. I don't think they have the poll you are looking for, but I would recommend posting your question there if you want more responses. Also, short-form posts generally don't get much engagement, and the engagement they do get is often from the most committed EAs, so if you really care you might consider making a top-level post, though that is understandably more stressful.

Thanks, Charlie. Unfortunately I boycott Faceb**k, but I will consider making a top-level post on thought experiments like this one.