Recent Discussion

Sam Bankman-Fried (born 6 March 1992) is an American investor, entrepreneur and philanthropist. He is the founder and CEO of FTX, a cryptocurrency derivatives exchange, and of Alameda Research, a quantitative trading firm. As of late July 2021, Bankman-Fried's net worth is estimated to be over $16 billion, which makes him the world's richest person under 30 and the world's richest person in crypto (Bambysheva 2021).


Bankman-Fried was born and raised in Stanford. He and his younger brother Gabe were introduced to moral philosophy at a young age by their parents, both consequentialists and professors at Stanford Law School. The brothers eventually became "take-no-prisoners utilitarians" (Fried 2020: xv).

During his third year as a physics major at the Massachusetts Institute of Technology, Bankman-Fried was exposed to the idea of earning to give at a talk by William MacAskill on the ethics of career choice. Upon graduating, he joined Jane Street Capital, where he worked as a trader and designed the firm's automated off-exchange trading system. After serving briefly as director of development at the Centre for Effective Altruism, Bankman-Fried founded Alameda Research in 2017 and FTX two years later.

Philanthropic work

Bankman-Fried's fortune has been described as "the product of a long-nurtured utilitarian worldview" (Wallace 2021), and Bankman-Fried himself has been nicknamed "the Bentham of crypto" (Doherty 2021). In the past, he supported organizations working to improve farmed animal welfare, and has more recently also contributed to a variety of longtermist causes (80,000 Hours 2021).

In 2020, Bankman-Fried donated over $5 million to Joseph Biden's presidential campaign, becoming one of Biden's top donors (Wallace 2021).

FTX donates 1% of net fees to effective charities, via the FTX Foundation, which as of July 2021 has contributed over $3.5 million.

Bankman-Fried is a vegan, and has been a member of Giving What We Can since August 2016.


80,000 Hours (2021) Sam Bankman-Fried, 80,000 Hours.

Bambysheva, Nina (2021) Bitcoin alert: biggest private crypto deal ever is closed, Forbes, July 20.

Doherty, Brendan (2021) Icon: The untold story of crypto billionaire Sam Bankman-Fried, Forbes, May 16.

Fried, Barbara H. (2020) Facing up to Scarcity: The Logic and Limits of Nonconsequentialist Thought, Oxford: Oxford University Press.

Schleifer, Theodore (2021) How a crypto billionaire decided to become one of Biden’s biggest donors, Vox, March 20.

Wallace, Benjamin (2021) The mysterious cryptocurrency magnate who became one of Biden’s biggest donors, Intelligencer, February 2.

We are excited to announce the new Open Philanthropy Technology Policy Fellowship. You can apply here until September 15th. This post will provide some background on the fellowship program and details on who we’d be excited to receive applications from.

Other resources for prospective applicants:

  • You can register for information sessions about the fellowship here.
  • We will be running an AMA on the EA Forum soon (link forthcoming).

What is the fellowship?

The fellowship aims to help grow the community of people working at the intersection of US policy and Open Philanthropy’s longtermism-motivated[1] work, especially in AI and biosecurity. It provides selected applicants with policy-focused training and mentorship, and supports them in matching with a host organization for a full-time, fully-funded fellowship based in the Washington, DC area. Potential host organizations include...

More detail from the footnote on the posting, for those interested:

Security clearance requirements are most common for executive branch placements, for example those at national security-focused departments and agencies. Permanent residents are able to work in most Congressional offices and think tanks, and may also be eligible for certain executive branch placements.

One question I'm curious about with respect to EA strategy is the extent to which people are or are not willing to change not just the specific organizations they might donate to, but the overall cause areas they consider important focuses. I'm especially interested in "average donors" and people who are not necessarily explicitly EA -- I think this question bears significantly on what kinds of outreach generate the most value in the world.

Is there any research into this topic people could point me to, either from within the EA community or elsewhere?

Rethink Priorities' analysis of the 2019 EA survey concluded that 42% of EAs changed their cause area after joining the movement; 57% of changes were away from global poverty, and 54% were towards the long-term future / catastrophic and existential risk reduction.

Rethink Priorities and Faunalytics also have a lot of content on how to do effective animal advocacy, which would likely be useful for your purposes.

This is probably not the extent of research that Rethink Priorities has on this issue, but it's what I could remember reading about.

Stefan_Schubert: There are some studies suggesting people sometimes donate to less effective charities even when informed that other charities are more effective. E.g. this paper found that people prefer to donate to cancer research even when told that arthritis research is more effective. We made similar findings in this paper. These papers just ask one-off questions, though -- they don't concern whether sustained persuasion would cause people to change cause area. But they do indicate that preferences for particular cause areas often override effectiveness information.
Long-range forecasting


Goth, Aidan & Stephen Clare (2020) Dr. Philip Tetlock’s forecasting research, Founders Pledge, November 27.
This report discusses plans for "work on methodological questions with an eye towards hosting a forecasting tournament focused on global catastrophic risks in summer 2021."

This is a general problem: for many entries, posts can be potentially relevant by virtue of either discussing the topic of the entry or exemplifying the phenomenon the entry describes. So we probably want to think about possible general ways to deal with this problem rather than solutions for this specific instance. Still, it seems fine to discuss that here. I don't think I have any insights to offer off the top of my head, but will try to think about this a bit more later.

Pablo: Cool. I've now "followed" the author on Google Scholar to be alerted whenever he publishes something new.
vaidehi_agarwalla: Quick BOTEC of person-hours spent on EA job applications per annum. I created a Guesstimate model to estimate a total of ~14,000 to 100,000 person-hours, or ~7 to 51 FTE, spent per year (90% CI). This comes to an estimated USD $320,000 to $3,200,000 of unpaid labour time.

  • All assumptions for my calculations are in the Guesstimate.
  • The distribution of effort spent by candidates is heavy-tailed; a small percentage of candidates may spend 3 to 10x more time than the median candidate.
  • I am not very good at interpreting the Guesstimate, so if someone can state this better / more accurately that would be helpful.
  • Keen to get feedback on whether I've over/underestimated any variables.
  • I'd expect this to grow at a rate of at least ~5-10% per year.

Sources: my own experience as a recruiter, applying to EA jobs, and interviewing staff at some EA orgs. Edited the unpaid labour time to reflect Linch's suggestions.
Linch: I think a normal distribution between $20-30 is too low; many EA applicants counterfactually have upper-middle-class professional jobs in the US. I also want to flag that you are assuming the time is unpaid, but many EA orgs do in fact pay for work trials. A "trial week" especially should almost always be paid.

Hi Linch, thanks for the input!

I'll adjust the estimate a bit higher. In the Guesstimate I do discount the hours: 75% of the total hours are unpaid (trial-week hours come to 5% of the total hours).
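The shape of the BOTEC above can be sketched as a small Monte Carlo calculation. All distributions and parameter values below are illustrative assumptions for the sketch, not the actual inputs of the linked Guesstimate model:

```python
import random

random.seed(0)

def sample_unpaid_value():
    """One draw of the annual unpaid-labour value of EA job applications.

    The distributions here are placeholders: the real inputs live in the
    Guesstimate model and are not reproduced in the thread.
    """
    applications = random.lognormvariate(8.3, 0.6)  # annual applications (assumed)
    hours_each = random.lognormvariate(1.9, 0.5)    # hours per application (assumed)
    wage = random.normalvariate(25.0, 3.0)          # $/hour, the range Linch questioned
    unpaid_share = 0.75                             # share of hours unpaid (from the thread)
    return applications * hours_each * unpaid_share * wage

# Empirical 90% interval from 10,000 draws, mirroring Guesstimate's approach.
draws = sorted(sample_unpaid_value() for _ in range(10_000))
lo, hi = draws[500], draws[9500]
print(f"90% CI for unpaid labour value: ${lo:,.0f} to ${hi:,.0f}")
```

Swapping in the model's real distributions for the placeholder ones would reproduce the quoted ~$320,000 to $3,200,000 range.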

Original post by Holly Elmore

Podcast description

Read & edited by: Garrett Baker

Spoilers ahead — listen to the episode beforehand if you don’t want to hear a rough summary first.

I quite liked the "Playing God" episode of RadioLab.

The topic is triage, the practice of assigning priority to different patients in emergency medicine. By extension, to triage means to ration scarce resources. The episode treats triage as a rare phenomenon – in fact, it suggests that medical triage protocols were not taken very seriously in the US until after Hurricane Katrina – but triage is not a rare phenomenon at all. We are engaging in triage with every decision we make.

The stories in “Playing God” are gripping, particularly the story of a New Orleans hospital...

You picked a good one here.

Madhav Malhotra: I'm sorry, I'm not sure what this is. I'm new here :D Could you explain? It seems like this is the description of the podcast, which is someone reading out another post somewhere on the forum?
D0TheMath: Yes, it's a linkpost to my podcast, where I and others have been narrating selected forum posts.

A response to Aaron Gertler's "You should write about your job"

When strangers ask me what I do, I often respond "I do drugs", and get a kick out of the confusion/amusement that appears on their faces.


I've known I wanted to become a chemist since I was 14. And I did.

I finished my master's in organic chemistry (more specifically organic synthesis, the science of assembling simple molecules into complex ones) and tried one year of a PhD. Then I switched to the private sector, and have not regretted the decision for a day since.

In my experience, work is a lot more fast-paced in the private sector: private companies spend their own money rather than the public's, and they care about feasibility more than about appearances. Good enough...

Could you tell me more about how you decided what you wanted to do at age 14? 
I'm 18 and I still have very little clue :D 

Open Philanthropy (OP) is the single largest donor in the EA space (with the possible exception of the Gates Foundation's global health interventions, though these are not generally considered part of EA), so how much money it grants, and to what causes, has major implications for what the EA and longtermist landscape looks like. It has a bearing on many issues, including:

  • Whether an individual should pursue direct work, or earn to give
  • If one is earning to give, whether to give now vs later
  • Whether there will be sufficient funding in a specific area for individuals to aim to have a career in that area
  • Influencing the Overton window of what is considered an EA cause

In recent years, OP has been granting about $280m per year ($298m in...

Charles_Dillon: Agreed, and changed, though I preferred "grants" to "grant amounts".
Charles_Dillon: This is a good idea – I considered making the original questions averages for this reason, but erred on the side of making the questions simpler. As is, I think the variance around the underlying distribution outcomes is large enough to compensate for the year-to-year variance in grants, such that I would not expect a big difference between 2028-2032 average predictions and 2030 predictions, and I'm hesitant to ask too many questions until the current ones have received sufficient attention.

That's a good point I didn't consider. I think you're right that the average question would not be very helpful, and not helpful enough to be worth adding.

I like the examples in the guide! Thank you for sharing that with me :-) 
I do say some of the things he mentioned, like 'males/females' instead of 'men/women', or 'a high probability of' instead of 'probably'. 

I'll start working on that!