Opinion piece in the Guardian by Olúfẹ́mi O Táíwò (assistant professor of philosophy at Georgetown University) and Joshua Stein (postdoctoral fellow at the Georgetown Institute for the Study of Markets and Ethics)

 

Summary / Gist of the article:

  • SBF's alleged fraud calls effective altruism into question. If the movement doesn't change course, one of the most ambitious charitable drives in recent history will end up like so many others: a lab and playground for wealthy donors.
  • SBF began his trading career on William MacAskill's advice, then founded FTX
  • SBF's public image as a committed effective altruist helped attract investment, and distracted from his crypto approach, which sounded like a Ponzi scheme to "industry insiders" https://www.bloomberg.com/news/articles/2022-04-25/sam-bankman-fried-described-yield-farming-and-left-matt-levine-stunned
  • SBF's fraud illustrates that the political culture EA has built "practically invites the most egregious forms of capture by the rich."
  • "Few meaningful guardrails exist to stop the rich dictating what happens to the money hoarded in philanthropic organisations"
  • Short critique of longtermism as a guise for tech billionaires to develop pet projects
  • "In 2021, OpenPhilanthropy donated $80m (£67m) towards the study of potential risks from advanced artificial intelligence, the second-most of any issue the foundation targeted; by contrast, OpenPhilanthropy donated $30m (£25m) to the Against Malaria Foundation, which distributes insecticidal nets. We are uncertain about the impacts of artificial intelligence, but we know there were about 241 million malaria cases causing 627,000 deaths in 2020. Comparing investments in global public health infrastructure to a possible far future universe of billions of digital people is practically and morally dubious. Effective altruism organizations donate hills of cash to research that excites their donors, rather than focus on proven, efficient solutions to imminent needs."
  • Asserts that by recruiting at MIT and acting as liaison with Elon Musk, MacAskill courted funders and guaranteed that EA's political agenda would reflect their interests and worldview
  • The "effective" part of EA is questioned: crypto turning out to be a fraud was far more likely than a "robot apocalypse" or the "other speculative 'tail risks' effective altruists pretend to be capable of managing"
  • The "altruism" part is also called into question. Former EAs proposed participatory funding and democratic controls by those who are impacted; their criticisms were ignored in favour of a tech- and capital-friendly research agenda.
  • EAs are suckers because they "would hand the reins of social progress to crypto billionaires"

Comments

[anonymous]:
Setting aside ideology, and the fact that this particular critique would be expected from the Guardian, a lot of it is in fact valid, in particular the point about the "risk of capture by the rich". Who is truly setting the agenda?

It reminded me of the relations between temporal and spiritual power in the Middle Ages.

Warmongering kings and lords used Christianity as an excuse for their escapades. At the same time, the Church used the threat of moral ostracism to tame the worst impulses of the ruling class.

Who was manipulating whom? It's not clear to this day - probably a bit of both.

One thing is certain - no matter how ethical in principle, if your organization owes its existence to the largesse of a single billionaire, it is exceedingly difficult to exercise moral autonomy.

Even if the billionaire never makes any explicit demands, you know deep down that they could theoretically use the threat of withheld funds to get what they want from you.

As a utilitarian, you might even rationalize such a deal to yourself as the best available option.

A few possible paths that movements drawing their power from moral legitimacy have found in the past:

  • diverse sources of funding
  • a broad base of support (through small donations)
  • being much harder on, and more demanding of, the powerful. Such movements seduce the general population but do not overtly court those with money and/or guns. Instead they turn the tables on them, going as far as having them kneel and apologize for their moral transgressions.

Should EA emulate those strategies? Perhaps not the kneeling, but there could be something to be learned from them nonetheless.

This is where I think your politics is relevant.

If you’re on the political left, you will probably have a stronger prior expectation that the excessive influence of individual billionaires like Moskovitz and SBF will move some funding away from what is optimal towards what they find exciting or interesting.

If you are on the political right, I think this prior expectation will be much weaker.

FWIW, I think funding from Moskovitz practically hasn’t moved away from what is optimal, and the only funding from SBF which I thought was spent suboptimally was most of the political stuff.

I’m on the political left, and going forward I think a good approach with billionaires would be to ask them to give their money to Open Phil or not be too personally involved with how their money is spent.

"going forward I think a good approach with billionaires would be to ask them to give their money to Open Phil or not be too personally involved with how their money is spent."

This moves control from a single billionaire to a very small group (the Open Phil board). It's one step in the right direction, but isn't nearly enough.

Thanks for the quality summary. I finally opened this post after a couple days of ignoring it in my doomscrolling, because I thought there would be nothing new in it vs. other recent posts on FTX. But I found this critique actually gave me some new things to think about.

I dislike "change course" without a discussion of how. 

Is that really their responsibility? I know this is a long-standing argument, but declining to point out problems just because you don't know how to solve them still seems bad.

More specific to this case, the implied meaning seems to me to be "What they're doing is worse than if they didn't exist; they should either solve this problem or disband. We're not going to do it for them."

So you think they are criticising EA for being a playground of the wealthy.
