MatthewDahlhausen


Comments

I don't think anyone is denying that longtermist and existential risk concerns were part of the movement from the beginning, or claiming that longtermist concerns don't belong in a movement about doing the most good. The concern is about the shift from longtermism sitting on roughly equal footing with other cause areas to becoming much more dominant, both in funding and in the attention it receives in community growth and introductory materials.

This post and headline conflate several issues:

  1. Whether democratic control of resources will result in an efficient allocation towards doing the most good
  2. Whether democratic control of resources will attract or deter contributions from wealthy people
  3. Whether donors have a moral claim to their wealth

It seems many of the comments express agreement with 1) and 2), while ignoring 3).

I would hope that a majority of the EA community would agree that there aren't good reasons for someone to claim ownership of billions of dollars. Perhaps there are those who disagree. If many disagree, in my mind that would mark a significant change in the EA community. See Derek Parfit's comments on this to GWWC: http://www.youtube.com/watch?v=xTUrwO9-B_I&t=6m25s. The headline "The EA community does not own its donors' money" may be true in a strictly legal sense, but the EA community, insofar as it prioritizes helping the worst off, has a much stronger moral claim to donors' money than the donors do.

It's entirely fair and reasonable to point out the practical difficulties and questionable efficiency involved in implementing a democratic voting mechanism to allocate wealth. But I think it would be a mistake to go further and claim that major donors ought to be entitled to significant control over how their money is spent. It's worth keeping those ideas separate.

Thanks for the clarification!

I took the pledge in 2016, which, per Jeff's comment, coincided with when the research department disbanded. I think that explains why I perceived GWWC as not being in the business of doing evaluations. Glad to see "evaluate the evaluators" is working its way back in.

"One of the roles of Giving What We Can (GWWC) is to help its members and other interested people figure out where to give." Is this a recent addition to the GWWC mission statement? I've been a member for a while and wasn't under the impression that GWWC was in the business of doing charity evaluations or meta-charity/fund evaluations. I assumed GWWC always emphasized the pledge, why to give, how much to give, but not saying much about where to give beyond pointing to GiveWell or whatever. If a big component of GWWC has always been about where to give, I must have missed that. Has GWWC emphasized the where to give piece more in recent years?

The key actors involved in FTX were extremely close to the EA community. SBF became involved after a 1:1(?) conversation with Will MacAskill, worked at CEA for a short while, held prime speaking slots at EAG, and set up and funded a key organization (the FTX Future Fund). Caroline held an officer position in her university EA group. It's fair to say the people at the center of the fraud were more deeply embedded in and tightly aligned with the EA movement than most people connected with EA. It's a classic example of high-trust / bad actors: it only takes a few of them to cause serious damage.

Is this just a black swan event? Perhaps. Are there more bad actors in the EA community? Perhaps.

You are certainly welcome to keep treating EA as a high-trust community, but others have good reason not to.

To reframe this:

  • best: high-trust / good actors
  • good: low-trust / good actors
  • manageable: low-trust / bad actors
  • devastating: high-trust / bad actors

High trust assumes both good motivations and competence. High trust is nice because it makes things go smoother. But if there are any badly motivated or incompetent actors, insisting on high trust creates the conditions for repeated devastating impacts. Insisting on high trust even after significant shocks means that people who no longer trust in others' good motivations and competence will leave.

FTX was a high-trust / bad actor shock event. The movement probably needs to operate for a while in a low-trust mode to earn back the conditions that allow high trust. Or the movement can insist on high trust at the expense of losing members who no longer feel comfortable or safe trusting others completely.

That comment isn't really an answer. It just says "CEA got a grant to purchase a venue dedicated to fancy retreats". The post is asking why CEA thought this was necessary and useful, and why they picked such an expensive venue in particular. The comment doesn't answer that.

Judging from similar estates in the area, the Abbey must've cost at least $10,000,000 [EDIT: from other comments, it cost £15,000,000 (~$21 million at the time of purchase)]. A 30-person conference could easily be held in an office, and regardless, there's no way CEA hosts 100 of them near Oxford every year. I'm guessing the upkeep costs of the Abbey alone exceed rent for a generous office.

Besides, if they did want to buy a venue, they could have found one for much cheaper.

This is a luxury purchase. It's made to make visitors feel important and prestigious. It is rightly being criticized and mocked by those outside CEA. WTF were they thinking?
