Trish

102 · Joined Aug 2022 · www.tosummarise.com

Bio

Hi! I'm somewhat new to EA - I'd heard of the ideas years ago, but only started engaging with the community recently, after doing the Intro to EA Virtual Program.

I work in International Tax Policy and am more sympathetic to neartermist causes such as global health and poverty reduction than longtermist ones. 

I read a lot of non-fiction books and summarise them on my website, To Summarise. 

How others can help me

I am particularly keen to meet other EAs in the policy space.

How I can help others

Reach out to me if you want to have a chat about anything, really.

Comments (17)

Upvoted with the benefit of hindsight bias. 

In particular, I'm impressed with how these parts hit the nail on the head for recent events:

On average we are young and inexperienced.  We have not yet experienced a scandal / major problem and have not yet started to think through how to avoid that happening again.

And:

Many EA organisations have very concentrated sources of funding.  Donors therefore often fulfil the accountability and governance functions.  Donors are analogous to customers, which can wield significant influence in for-profits.

I guess it depends on what we mean by "vetting funding". EA should definitely do more to understand and manage the nature and extent of the risks it is exposed to - i.e. general risk management. I don't think we need to wait for much more information to make such an assessment - the way this has unfolded, with so many grant recipients seeming to have been caught completely unprepared, is enough evidence that the EA community's risk management and communication were lacking.

But some people also seem to suggest "vetting funding" means EA should be trying to find fraud or other malfeasance in its donors. (That is suggested by the OP's post and is what I meant by "vetting funding" in my previous comment). I'm less sure about this claim. It's not clear how much due diligence is required in these cases vs how much due diligence EA actually did. So this is something that, as the OP suggests, would benefit from more information before coming to a conclusion. 

Edit: Put another way, I think there are two questions:

  • Should EA have conducted better risk management? I think the answer on this question is quite clearly "yes". 
  • Would that "better risk management" have detected this fraud? 

I've seen some discussions conflating the two. Even if the answer to the second is "no" or "unclear", EA's risk management practices could still be improved. That's all I'm saying.

Answer by Trish · Nov 13, 2022

I've commented on a separate post here. 

In short: I'm not sure EA could have predicted or prevented this particular event (FTX blowing up).

However, EA did know that its funding sources were very undiversified and volatile, and could have thought more about the risks of an expected source of funding drying up and how to mitigate those risks.

Relatedly, some are arguing EA could have vetted FTX and Sam better, and averted this situation. This reeks of hindsight bias! EA probably could not have done better than all the investors who vetted FTX before giving them a buttload of money!

 

I've seen this mentioned quite a few times, most prominently by Eliezer Yudkowsky. I take the point that sophisticated investors such as Sequoia and BlackRock researched the company and could not detect FTX's possible self-dealing with Alameda. I think it's fair to say that EA probably could not have detected that FTX would blow up in exactly the way it did, even with more due diligence.

I also think it's rational to apply more due diligence to where you are investing your money than to where you are receiving it from - and my understanding is that EA (on the whole) was not actually investing in FTX.

However, what should not be lost sight of is that FTX funding made up a very significant share of EA's funding as a whole: in 2021, the FTX team's funding was estimated at $16.5 billion of $46.1 billion (roughly 36%). (Moskovitz's funding was even larger - roughly 49%.)

This is incredibly undiversified, especially given how volatile SBF's and Moskovitz's wealth is. I am sure this is far more concentrated than the portfolio of any large investor who actually put money into FTX. EA therefore stood to lose a lot more if the funding from FTX (or Moskovitz) fell away. I don't want this point to get lost in the debate.
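To make the concentration point concrete, here is a minimal sketch of the arithmetic, using the 2021 estimates quoted above ($46.1 billion total, $16.5 billion from the FTX team). The ~$22.6 billion Moskovitz figure is an assumption back-calculated from the roughly 49% share; the "Other" bucket is simply the remainder.

```python
# Funding shares implied by the 2021 estimates quoted above (billions USD).
funding = {
    "FTX team": 16.5,               # stated estimate
    "Moskovitz": 22.6,              # assumption, implied by the ~49% share
    "Other": 46.1 - 16.5 - 22.6,    # remainder of the $46.1B total
}
total = sum(funding.values())
shares = {source: amount / total for source, amount in funding.items()}
for source, share in shares.items():
    print(f"{source}: {share:.0%}")
```

On these numbers, two donors account for roughly 85% of expected funding between them, which is the sense in which the funding base was "incredibly undiversified".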

Now, I'm not sure that the answer was that EA should have vetted its funding more. When people are offering you "free" money,[1] I don't think there is much obligation to vet it (beyond any legal obligations that might exist). I think the answer is probably that EA should have thought more about its risk exposure, given how undiversified and volatile its funding is. In particular:

  • what might happen if FTX or Moskovitz's funding dries up for any reason; and 
  • how any negative consequences of that situation could be mitigated. 

For example, I did see a statement somewhere suggesting that EAs should personally diversify away from crypto, given how exposed EA is to it generally, but that did not seem to be a very prominent, widely-advertised piece of advice.

I also know that there is some general career advice for people to build up a decent financial runway for themselves. Perhaps there should have been greater emphasis within the community that if people rely on funding (grants, salaries) that is undiversified, they should factor that risk in and weigh it against their personal risk appetite.

  1. ^

    I leave aside the question of whether SBF was using EA to "launder" his reputation and therefore arguably the money was not entirely "free". I don't have an informed view on that. 

Thanks for sharing. I've never heard of this Anki deck idea before and am intrigued. I have used Anki before for language learning. But I really like your idea of more deliberately "consuming" pleasant memories - I've often thought it's a shame how little we savour most of our nice memories.

If you don't mind me asking:

  • How many cards are in your deck right now? 
  • Roughly how often do you add new memories to it? 
  • Do you review daily, weekly, or whenever you feel like it?

Thank you for posting this. I haven't been in any of these EA circles myself, so I do not have any experience with these issues. The more widespread this is, the more important this conversation is. But even if it only happens in one or two EA circles, it is still important.

I'm sorry to hear that this has been your experience with the NYC and SF EA circles.

I agree with you that the original comment, taken literally, is probably false and that EAs consorting with billionaires can still retain some power. 

But I think the original comment by Berta had a good point in that there seemed to be a general naivete by EA about power and other people's intentions. However, that is just from my vantage point as someone who does not work at an EA org and is not in any inner EA circle. 

I think recent events have definitely lowered the general level of trust within the EA community. But that is not necessarily a good thing and I hope EA does not overcorrect, either. Getting the balance right will be tricky, but I think Berta was on the right track in that EA could benefit from thinking and talking about power more. 

I've only just stumbled upon this question and I'm not sure if you'll see this, but I wrote up some of my thoughts on the problems with the Total View of population ethics (see the "Abortion and Contraception" heading specifically).

Personally, I think there is a tension there which does not seem to have been discussed much in the EA forum. 

I've just been reading up on earlier EA forum posts about democracy and billionaire spending in light of the FTX saga that broke this week. 

This comment did not age well. 

Thanks for the explanation. I agree it's possible that smarter people could coordinate better and produce better outcomes for the world. I did recognise in my original post that one factor suggesting the future could be better is that, as people get richer and have their basic needs met, it's easier to become altruistic. I find that argument very plausible; it was the asymmetry argument I found unconvincing.

FWIW, I'm fine with others disagreeing with my view. It would be great to find out I'm wrong and that there is more evidence to suggest the future is rosier in expectation than I had originally thought. I just wanted people to let me know if there was a logical error or something in my original post, so thank you for taking the time to explain your thinking (and for retracting your disagreement on further consideration).
