calebp

I currently lead EA Funds.

Before that, I worked on improving epistemics in the EA community at CEA (as a contractor), as a research assistant at the Global Priorities Institute, on community building, and on global health policy.

Unless I explicitly state otherwise, opinions are my own, not my employer's.

You can give me positive and negative feedback here.

Comments

Critiques of EA that I want to read

I found this helpful and I feel like it resolved some cruxes for me. Thank you for taking the time to respond!

Critiques of EA that I want to read

Thanks for writing this post, I think it raises some interesting points and I'd be interested in reading several of these critiques.

(Adding a few thoughts on some of the funding-related things, but I encourage critiques of these points if someone wants to write them.)

Sometimes funders try to play 5d chess with each other to avoid funging each other’s donations, and this results in the charity not getting enough funding.

I'm not aware of this happening very much, at least between EA Funds, Open Phil, and FTX (though it's plausible to me that it does happen occasionally). In general, I think funders prefer to be transparent with each other and cooperate. Occasionally this will stop an organisation being funded, but it seems pretty reasonable to not want to fund org x for project y when it already has money for that project from someone else, or to take actions in that direction. I am aware of quite a few projects that have been funded by both Open Phil and FTX; I'm not sure whether this is much evidence against your position or is itself part of the 5d chess.

Sometimes funders don’t provide much clarity on the amount of time they intend to fund organizations for, which makes it harder to operate the organization long-term or plan for the future. Lots of EA funding mechanisms seem basically based on building relationships with funders, which makes it much harder to start a new organization in the space if you’re an outsider.

This is something I've heard a few times from grantees, and I think there is some truth to it. That said, most funding applications I see are time-bounded anyway: we tend to fund for the lifetime of a specific project, or orgs apply for x years' worth of costs and we fund that with the expectation that they will ask for more if they need it. If there are better structures that you think would be easy to implement, I'd be interested in hearing them; perhaps you'd prefer funding over a longer period, conditional on meeting certain goals? I think relationships with funders can be helpful, but I think they are relatively rarely the difference between receiving funding and not receiving it within EA (although I hold this with pretty low confidence). I can think of lots of people we have decided against funding who have pretty good professional/personal relationships with funders. To be clear, I'm just saying that pre-existing relationships are NOT required to get funding, and in my estimation they do not substantially increase the chances of being funded.

Relatedly, it’s harder to build these relationships without knowing a large EA vocabulary, which seems bad for bringing in new people. These interactions seem addressable through funders basically thinking less about how other funders are acting, and also working on longer time-horizons with grants to organizations.

I think I disagree that the main issue is vocabulary; maybe there are cultural differences? One way in which I could imagine non-EAs struggling to get funding for good projects is if they overinflate their accomplishments or set unrealistic goals, as might be expected when applying to other funders. I'd probably think they had worse judgement than people who are more transparent about their shortcomings and strengths, or worry that they were trying to con me in other parts of the application. This seems reasonable to me though; I probably do want to encourage people to be transparent.

Re: funder brain drain

I'm not super convinced by this. I do think grantmaking is impactful, and I'm not sure it's particularly high status relative to working at other EA orgs (e.g. I'd be surprised if people were turning down roles at Redwood or ARC to work at OPP because of status - but maybe you have similar concerns about those orgs?). Most grantmakers have pretty small teams, so it's plausibly not that big an issue anyway, although I agree that if these people weren't doing grantmaking they'd probably do useful things elsewhere.

Transcript of Twitter Discussion on EA from June 2022

I know this isn't the point of the thread, but I feel the need to say that if people think a better laptop will increase their productivity, they should apply to the EAIF.

https://funds.effectivealtruism.org/funds/ea-community

(If you work at an EA org, I think your organisation should normally pay unless it isn't able to for legal/bureaucratic reasons.)

Is the time crunch for AI Safety Movement Building now?

I think that Holden assigns more than a 10% chance to AGI in the next 15 years; the post that you linked to says 'more than a 10% chance we'll see transformative AI within 15 years'.

Sam Bankman-Fried should spend $100M on short-term projects now

SBF/FTX already gives quite a lot to neartermist projects afaict. He's also pretty open about being vegan and living a frugal lifestyle. I'm not saying that this mitigates optics issues, just that I expect to see diminishing marginal returns on this kind of donation wrt optics gains.

https://ftx.com/foundation

Some unfun lessons I learned as a junior grantmaker

The policy that you referenced is the most up-to-date policy that we have, but I do intend to publish a polished version of the COI policy on our site at some point. I'm not sure when I will have the capacity for this, but thank you for the nudge.

Some unfun lessons I learned as a junior grantmaker

My impression is that Linch's description of their actions above is consistent with our current COI policy. The fund chairs and I have some visibility over COI matters, and fund managers often flag cases when they are unsure what the policy should be; then I or the fund chairs can weigh in with our suggestion.

Often we suggest proceeding as usual or a partial but not full recusal (e.g. the fund manager should participate in discussion but not vote on the grant themselves).

Deferring

(I think that the pushing-towards-a-score thing wasn't a crux in downvoting; I think there are lots of reasons to downvote things that aren't harmful, as outlined in the 'How to use the Forum' post/moderator guidelines.)

I think that karma is supposed to be a proxy for the relative value that a post provides.

I'm not sure what you mean by zero-sum here, but I would have thought that the control-system-type approach is better, as the steady-state value will be pushed towards the mean of what users see as the true value of the post. I think that this score, plus the total number of votes, is quite easy to interpret.

The everyone-voting-independently approach performs poorly when some posts have many more views than others (it seems to track something more like how many people saw the post and liked it, rather than whether the post is high quality).

I may be misunderstanding your concern, but the control system approach seems, on the surface, much better to me; I am keen to find the crux here, if there is one.
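
To make the comparison concrete, here's a minimal toy simulation (my own sketch, not the Forum's actual voting code); the perceived-value numbers and function names are hypothetical. Each "independent" viewer votes once on their own, while each "control system" viewer nudges the score towards the value they think the post deserves.

```python
import random

def independent_votes(values, n_viewers):
    """Each viewer votes once based only on their own view of the post,
    ignoring the current score, so the total scales with view count."""
    score = 0
    for _ in range(n_viewers):
        perceived = random.choice(values)
        score += 1 if perceived > 0 else -1
    return score

def control_system_votes(values, n_viewers):
    """Each viewer pushes the score towards the value they think the post
    should have: upvote if the score is below it, downvote if above. The
    score settles near the centre of perceived values, regardless of views."""
    score = 0
    for _ in range(n_viewers):
        perceived = random.choice(values)
        if score < perceived:
            score += 1
        elif score > perceived:
            score -= 1
    return score

random.seed(0)
perceived_values = [20, 30, 40]  # hypothetical judgements of the post's "true" karma
for viewers in (100, 1000):
    print(f"{viewers} viewers: "
          f"independent={independent_votes(perceived_values, viewers)}, "
          f"control={control_system_votes(perceived_values, viewers)}")
```

Under the independent model the score grows roughly linearly with view count; under the control-system model it hovers around the mean of what viewers see as the post's true value, which is the steady-state behaviour described above.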

Deferring

I don't think we should only downvote harmful things; instead, we should look at the current karma and use our votes to push the score towards the value we think the post should be at.

I downvoted the comment because:

  • Saying things like "... obviously push an agenda...." and "I'm pretty sure anyone reading this..." has persuasion-y vibes which I don't like.
  • Saying "this post says people should defer to authority" is a bit of a straw man/weak man and isn't very charitable.

Deferring

I think I roughly agree, although I haven't thought much about the epistemic-vs-authority deferring distinction before.

Idk if you were too terse; it seemed fine to me. That said, I would have predicted this would be at around 70 karma by now, so I may be poorly calibrated on what appeals to other people.
