
jonathanstray

83 karma · Joined Aug 2015

Comments (15)

So here we go. EAs do not generally think seriously about political action. Is it time?

Really glad you did this. I see some similarities with my work as a journalist. I've previously argued that journalism has never attempted systematic evaluation of government, e.g. department by department, so it's fantastic to see someone attempt this. Your problems regarding domain knowledge, slow or unhelpful responses from officials, inconsistent transparency, etc. are spot on and well known to reporters. Keep up the good work!

Several of these might be summed up under the heading "high risk." There is a notion that this is exactly what philanthropy (as opposed to governments) ought to be doing.

One area I think hits many of these: global income inequality.

Well, Russell believed it could be developed through education. One exercise which can help is comparing an abstract number of people to something that relates to daily experience, such as the number of people in your school or your city.

Here's a similar scale, which was developed to communicate risk values:

http://imgur.com/nLXDg1h

MacAskill discusses this in a section titled "international labor mobility" but does not mention "open borders" or draw the distinction you have. He writes:

"Increased levels of migration from poor to rich countries would provide substantial benefits for the poorest people in the world, as well as substantial increases in global economic output. However, almost all developed countries pose heavy restrictions on who can enter the country to work. ... Tractability: Not very tractable. Increased levels of immigration are incredibly unpopular in developed countries, with the majority of people in Germany, Italy, the Netherlands, Norway, Sweden, and the United Kingdom favoring reduced immigration."

In "Doing Good Better" MacAskil rates labor mobility as "intractable." I agree it's difficult, but I think this a specific example of the wide blindness of EA to the mechanics of political change. All of the issues you have raised are fundamentally political problems, not technical problems, and would require political strategies, for which we will not have evidence from RCTs.

This is a weakness of the "progressive" philanthropic tradition in general, which tends to think in terms of technical solutions to specific problems. It has a lot less to say about the broader shifts in values and networks that enable high-level political change.

More on that: http://www.insidephilanthropy.com/home/2015/7/22/is-too-much-funding-going-to-social-entrepreneursand-too-lit.html

In other words, I am glad to see this post. I think we need to be looking in these sorts of directions.

"The fact there seems to be missing the way by which effective altruism determines which moral goals are worth pursuing ... That seems to be the role of meta-ethics in effective altruism."

Maybe the answer is not to be found in meta-ethics or in analysis generally, but in politics, that is, the raw realities of what people believe and want at any given moment, and how consensus forms or doesn't.

In other words, I think the answer to "what goals are worth pursuing" is, broadly, to ask the people you propose to help what it is they want. Luckily, this happens regularly in all sorts of ways, including global-scale surveys. This is part of what the value of "democracy" means to me.

A man named Horst Rittel -- who also coined "wicked problem" -- wrote a wonderful essay on the relationship between planning to solve social problems and politics, which seems appropriate here: http://www.cc.gatech.edu/~ellendo/rittel/rittel-reasoning.pdf

tl;dr: some kinds of knowledge are instrumental, but visions for the future are unavoidably subjective and political.

"EA has very different needs than much of the non-profit world." In what way?

I also have to say that there is something very insider-y about this analysis. Much of the advice seems like it boils down to "don't waste your time with non-EA people."

If I understand you correctly I think you make two interesting points here:

  • the potential of EA as a political vehicle for financial charity

  • the claim that current EA advice has to be the marginal advice

When I wrote "isn't that the fundamental claim of EA" I suppose more properly I am referring to the claims that 1) EA is a suitable moral philosophy 2) the consensus answers in the real existing EA community correspond to this philosophy. In other words that EA is, broadly speaking, "right" to do.

Yes. But then, shouldn't all arguments about what is appropriate for EAs to do generalize to what it is appropriate for everyone to do? Isn't that the fundamental claim of the EA philosophy?
