blonergan

Comments

Some updates in EA communications

This is wonderful news! 

A couple of comments on the new intro to EA article:

  1. The graph in the “Helping create the field of AI alignment research” section is interesting, but it takes up a lot of space given that it isn’t about the main point of the section. The section’s message is “AI will probably be a big deal, and the EA community has helped create and populate the AI alignment field, which is trying to increase the likelihood that AI is beneficial,” whereas the graph says “the Industrial Revolution was a big deal.” That is somewhat relevant, but it doesn’t seem to warrant a giant graph in my opinion. Also, some readers might wonder whether the graph merely reflects constant exponential growth (my understanding is that it doesn’t, but that isn’t obvious just from looking at it).
  2. Under “Improving decision-making,” I don’t find the Metaculus example very compelling. The text suggests but does not establish that the forecasting community was ahead of consensus public or expert opinions. And it’s not clear to me what people/entities changed, or could have changed, their decisions in a way that would have been beneficial to humanity by using the Metaculus forecast. Maybe that's obvious to other people though!
Crypto markets, EA funding and optics

On future funding flows, I specifically said "[i]n the event of a crypto crash, fewer new projects would be funded, and the bar for continuing to fund existing projects would be higher," so I don't think we disagree about that. But I disagree with the "lots of good projects (would) have to be ended" statement in your original post.

Crypto markets, EA funding and optics

I've listened to SBF on several podcasts, and I haven't gotten the impression that he thinks all cryptocurrencies are useless. I would recommend this one in particular: https://clearerthinkingpodcast.com/episode/038. I'm personally skeptical about the value of cryptocurrencies (relative to their current valuations), and my opinion differs from SBF's on some things, but I find him to be one of the few people working in the crypto space who articulate balanced and insightful views on crypto.

Also, SBF did not use the word "Ponzi." That was Matt Levine's interpretation. I think what SBF was describing would be better characterized as a speculative bubble, since "Ponzi" implies an intent to defraud. A well-intentioned founder might have a crypto-based idea they are excited about; if investors/speculators bid the value of their coin/token up to unreasonable levels, that doesn't mean the founder has devised a Ponzi scheme. Note that SBF said to "ignore what it does or pretend it does literally nothing" about the "box," which implies that he thinks most crypto projects are at least trying to do something.

I would respectfully recommend editing your post where it says that SBF admitted cryptocurrencies are a Ponzi scheme. I believe strongly that it is not accurate as stated.

As for current EA spending vs. wealth, I think we are in a situation where, as a rough guess, 40% of EA wealth is in crypto, and current spending is 2-3% of wealth. If the crypto portion were mostly wiped out, current levels could be sustained by donors who are less invested in crypto. In the event of a crypto crash, fewer new projects would be funded, and the bar for continuing to fund existing projects would be higher, but I think non-crypto donors would step up to continue to fund projects that are going reasonably well. In the meantime, there is benefit from funding some new things and learning about what works well. If current spending were 5% of wealth, and if it seemed unlikely that new EA-aligned donors would emerge, I would be more concerned.
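To make the rough numbers concrete, here is a minimal sketch of the arithmetic (the 40% and ~2.5% figures are my rough guesses from above, not measured data):

```python
# Back-of-the-envelope check; all inputs are rough guesses, not data.
crypto_share = 0.40   # assumed fraction of EA wealth held in crypto
spend_rate = 0.025    # assumed current spending as a fraction of total wealth

# If the crypto portion were mostly wiped out, the same dollar spending
# would be this fraction of the remaining (non-crypto) wealth:
post_crash_rate = spend_rate / (1 - crypto_share)
print(f"Post-crash spending rate: {post_crash_rate:.1%}")  # ~4.2%
```

Spending around 4% of the remaining wealth still looks sustainable, which is why I'd be more concerned if current spending were already 5% of wealth.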

Crypto markets, EA funding and optics

I don't believe it's accurate to say:

"Sam Bankman Fried has admitted that cryptocurrencies like Bitcoin are a ponzi scheme"

I don't think he was talking about Bitcoin specifically or about all cryptocurrencies in that exchange with Matt Levine. 

At this point, I think the risk of good projects being terminated if crypto declines further is fairly low, given that current EA spending is a small percentage of EA wealth.

I share your concern about reputation risk from people associating EA with crypto, but that has to be weighed against the benefits (e.g. possible further wealth creation, and opportunities for SBF and others to spread EA ideas).

Ballot transparency project

BallotReady (https://www.ballotready.org/) has useful information for at least some candidates, though I don't know how much of the US it covers.

Why Effective Altruists Should Put a Higher Priority on Funding Academic Research

I’m also sympathetic to the argument, but I think the BOTEC overstates the potential benefit for another reason. If GiveWell finds an opportunity to give $100 million per year at an effectiveness of 15x cash transfers rather than 5x (and assuming there is a large supply of giving opportunities at 5x), I think the benefit is $200 million per year rather than $1 billion: the $100 million spent on the 15x intervention achieves what could otherwise have been achieved by spending $300 million on a 5x intervention, freeing up $200 million for other 5x opportunities. Of course, as noted, that is for only one year, so the number over a longer time horizon would be much larger.
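To spell out the arithmetic behind the adjustment (a sketch using the figures above; "value" here means cash-transfer-equivalent dollars):

```python
# BOTEC adjustment, measuring value in cash-transfer-equivalent dollars.
spend = 100e6        # $100M/year moved to the better intervention
baseline_mult = 5    # effectiveness of the marginal opportunity (5x cash)
improved_mult = 15   # effectiveness of the new opportunity (15x cash)

# Naive benefit: the extra value generated by the same $100M.
naive_benefit = spend * (improved_mult - baseline_mult)   # $1B-equivalent

# Adjusted benefit: $100M at 15x does the work of $300M at 5x,
# so the true gain is the $200M freed up for other 5x opportunities.
equivalent_spend = spend * improved_mult / baseline_mult  # $300M
adjusted_benefit = equivalent_spend - spend               # $200M

print(f"naive: ${naive_benefit/1e6:,.0f}M vs adjusted: ${adjusted_benefit/1e6:,.0f}M per year")
```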

Even with that adjustment, and considering the issues raised by David Manheim and other commenters, I find this post quite compelling – thank you for sharing it.

What is the overhead of grantmaking?

I think the appropriate cost to use for evaluators, applicants, and admins is the opportunity cost of their time. For many such people this would be considerably higher than their wage and outside the ranges used in the model. I don't know that this would change your conclusion, but it could significantly affect the numbers.
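As a toy illustration of how much this substitution can matter (the hours and rates below are hypothetical, not taken from your model):

```python
# Toy example: cost of evaluating one grant application.
# All numbers are hypothetical, chosen only to illustrate the point.
hours = 10              # evaluator time per application (assumed)
wage = 60               # $/hour the evaluator is paid (assumed)
opportunity_cost = 200  # $/hour value of their best alternative use of time (assumed)

print(f"wage-based cost: ${hours * wage}")                          # $600
print(f"opportunity-cost-based cost: ${hours * opportunity_cost}")  # $2000
```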

Jobs at EA-organizations are overpaid, here is why

Keeping the set of EA org employees fixed, I think paying them more has three effects:

  1. EA org employees will donate more. This portion is a sort of regranting mechanism. I would expect such people to be effective regranters, so this feels like a small win.
  2. EA org employees will practice better self-care and invest in things that save them time and allow them to work more. They will be more productive as a result. Given the scarcity of talent, this feels like a big win.
  3. EA org employees will have higher standards of living. This feels like a net loss, given the potential alternative uses of funds.

My intuition is that EA org salaries are low enough, and talent is scarce enough, that (2) probably dominates.

There’s also the consideration of how the set of people working at EA orgs will change.

  1. More people will be willing to work for EA orgs.
  2. The set of people who become willing to work for EA orgs as salaries go up will be different from the people willing to work at lower salaries.

Keeping the set of EA org jobs fixed, the pool of people willing to take those jobs will expand. I would guess with higher salaries the people hired would tend to be more talented, and less “totalising” in their commitment to EA. The former seems good, whereas the latter seems bad for some roles, but perhaps good for others. I think it’s important to recognize that people’s willingness to work for a low salary depends on many factors. In particular, families (parents, spouses, children) can be significant financial resources or burdens. So low salaries are an imperfect way to filter for level of commitment.

And, given the relative scarcity of EA talent vs. funding, making EA org work more attractive (relative to earning to give) seems valuable to me. With higher salaries the number of EA org jobs will tend to expand due to an increased supply of workers, which seems good on the margin.

Does it make sense for EA’s to be more risk-seeking in earning to give?

If we are taking the assumed donor behavior as given, and if the sole objective is maximizing donations to charity, this makes sense. But there is an available option that would be better for both the EA who is earning to give and the charity: the E2Ger could take the $100k job and donate 32%. With even slightly diminishing marginal utility of consumption, the E2Ger would be better off consuming $68k with certainty than having an 80% chance of consuming $45k, a 10% chance of consuming $50k, and a 10% chance of consuming $275k. And the charity would get slightly more in expectation ($32k rather than $31.5k).
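A quick check with an explicit utility function (a sketch; log utility is my assumption, but any increasing concave utility function gives the same ranking here):

```python
import math

# Risky path from the post: (probability, consumption) outcomes,
# with the charity receiving $31.5k in expectation.
risky = [(0.8, 45_000), (0.1, 50_000), (0.1, 275_000)]

# Safe alternative: take the $100k job and donate 32%.
safe_consumption = 68_000
safe_donation = 32_000

e_risky_consumption = sum(p * c for p, c in risky)        # $68,500
e_risky_utility = sum(p * math.log(c) for p, c in risky)  # ~10.91
safe_utility = math.log(safe_consumption)                 # ~11.13

print(f"E[consumption], risky: ${e_risky_consumption:,.0f} vs safe: ${safe_consumption:,}")
print(f"E[log utility], risky: {e_risky_utility:.3f} vs safe: {safe_utility:.3f}")
print(f"E[donation], risky: $31,500 vs safe: ${safe_donation:,}")
```

The risky path offers only $500 more expected consumption in exchange for substantial variance, so any concave utility function prefers the certain $68k, while the charity gets $500 more in expectation from the safe option.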

In practice, I think there is usually a tradeoff between risk and expected value when choosing among E2G jobs/careers, so choosing riskier options and donating a higher percentage when outcomes are favorable will tend to be the right policy. I'm just not sure that the main argument presented here strengthens the case for doing so.

What are your giving recommendations to non-EA friends interested in supporting Ukraine?

The Center for High Impact Philanthropy at U Penn has posted a list: https://www.impact.upenn.edu/ukraine-crisis-how-can-i-help/
