SamDeere

Head of Tech at the Centre for Effective Altruism / Project Lead, EA Funds

Comments

How are the EA Funds default allocations chosen?

First, I'll note that we're actually planning to change this system (likely in the next week or two), so that instead of first seeing a default allocation, donors will choose their own allocation as the first step in the donation process.

To your question, the current EA Funds default allocation was chosen as an approximation of some combination of a) a representative split of the cause areas based on their relative interest across EA, and b) a guess at what we thought the underlying funding gaps in each cause area would likely be. It's definitely intended to be approximate, and is there partly as a guide to give an indication of how the slider allocation system works, rather than an allocation that we think everyone should choose.

Context: I help run EA Funds and am responsible for the user-facing side of things, including the website

Announcing the 2019-20 Donor Lottery

Update – winners have been drawn!

Thanks to everyone who participated this year. The lotteries have been drawn and both had a winner!

Congratulations both!

Effective Altruism Funds Project Updates

Yeah, this is something that's definitely been discussed, and I think this would be a logical first step between the current state of the world and hiring grantmakers to specific teams.

Effective Altruism Funds Project Updates

Yeah, the Fund balances are updated when the entries for the grants are entered into our accounting system (typically at the time that the grants are paid out). Because it can take a while to source all the relevant information from recipients (bank details etc.), this doesn't always happen immediately. Unfortunately this means that there's always going to be some potential for drift here, though (absent accounting corrections like the one applicable to the Global Development Fund) this should resolve itself within ~a month. The November balances included ~half of the payments made from the Animal Welfare and Meta Funds from their respective November grant rounds.

Which Community Building Projects Get Funded?

[meta: apologies for the belated response]

Thanks again for the thoughtful comments. I agree that the numbers should have been higher; that was an oversight (and perhaps speaks to the difficulty of keeping these numbers accurate longer term). I’m not sure how I missed the extra 80K and Founders Pledge grants (I think they came from an earlier payout report that I forgot to include in my calculations). I’m sorry that this wasn’t done correctly the first time around.

I’ve since removed the grant amounts (leaving just the grantees/grant categories), and I might re-title the field to just be called ‘Past Grantmaking’ or something similar. We’ve also created a public spreadsheet of all of the EA Funds grants, so they’re accessible in one place.

I added the ‘Grantmaking and Impact’ section to the Funds pages in response to feedback that it was hard to get a feel for what each Fund did in a tangible way, especially for newer donors who hadn’t been following the Funds over time and hadn’t yet dived into the payout reports. The idea here was to give a flavour of the kinds of things that each Fund had granted to, rather than to provide an exhaustive list (that’s what the payout reports are for). I still think that this is valuable, but I agree that keeping the numbers accurate has some problems, so for now we’ll remove them.

Effective Altruism Funds Project Updates

Most Fund balances are in general reasonably accurate (although the current balances don’t account for the latest grant round, which was only paid out last month). The exception here is the Global Development Fund, which is still waiting on the accounting correction you mentioned to post, but I’ve been informed that this has just been handed over to the bookkeepers to action, so it should be resolved very soon.

Effective Altruism Funds Project Updates

1. I don’t have an exact figure, but a quick look at the data suggests we’ve moved close to $2m to US-based charities that don’t have a UK presence from donors in the UK (~$600k in 2019). My guess is that the amount going in the other direction (US -> UK) is substantially smaller than that, if only because the majority of the orgs we support are US-based. (There’s also some slippage here, e.g. UK donors giving to GiveWell’s current recommendation could donate to AMF/Malaria Consortium/SCI etc.)

2. Due to privacy regulations (most notably GDPR) we can’t, by default, hand over any personally identifying information to our partner charities. We ask donors for permission to pass their details on to the recipient charities, and in these cases stewardship is handled directly by the orgs themselves. CEA doesn’t do much in terms of stewardship specific to each partner org (e.g. we don’t send AMF donors an update on what AMF has been up to recently), but we do send out email newsletters with updates about how money from EA Funds has been spent.

Effective Altruism Funds Project Updates

Yeah, that’s interesting – I think this is an artefact of the way we calculate the numbers. The ‘total donations’ figure is calculated from donations registered through the platform, whereas the Fund balances are calculated from our accounting system. Sometimes donations (especially by larger donors) are arranged outside of the EA Funds platform. They count towards the Fund balance (and accordingly show up in the payouts), but they won’t show up in the total donations figure. We’d love to get to a point where these donations are recorded in EA Funds, but it’s a non-trivial task to synchronise accounting systems in two directions, and so this hasn’t been a top priority so far.

I agree that the YTD display isn’t the most useful for assessing total inflows because it cuts out the busiest period of December (which takes in 4-5 times more than other months, and is responsible for ~35% of annual donations). It was useful for us internally (to see how we were tracking year-on-year), and so ended up being one of the first things we put on the dashboard, but I think that a whole-of-year view will be more useful for the public stats page.

Effective Altruism Funds Project Updates

It’s hard to say exactly, but I’d guess this would be on the timescale of roughly a year (so, a spinout could happen in late 2020 or mid 2021). However, this will depend a lot on e.g. ensuring that we have the right people on the team, the difficulty of setting up new processes to handle grantmaking, etc.

Re the size question – are you asking how large the EA Funds organisation itself should be, or how large the Fund management teams should be?

If the former, I’d guess that we’d probably start out with a team of two people, maybe eventually growing to ~4 people as we started to rely less on CEA for operational support (roughly covering some combination of executive, tech, grantmaking support, general operations, and donor relations), and then growing further if/when demand for the product grew and more people working on the project made sense.

If the latter, my guess is that something like 3-6 people per team is a good size. More people means more viewpoint diversity, more eyes on each grant, and greater surface area for sourcing new grants, but larger groups become more difficult to manage, and obviously the time (and potentially monetary) costs increase.

I’d caveat strongly that these are guesses based on my intuitions about what a future version of EA Funds might look like rather than established strategy/policy, and we’re still very much in the process of figuring out exactly what things could look like.

Effective Altruism Funds Project Updates

I agree with you that on one framing, influencing the long-run future is risky, in the sense that we have no real idea of whether any actions taken now will have a long-run positive impact, and we’re just using our best judgement.

However, it also feels like there are important distinctions in categories of risk related to things like organisational maturity. For example, a grant to MIRI (an established organisation, with legible financial controls, and existing research outputs that are widely cited within the field) feels different to me when compared to, say, an early-career independent researcher working on an area of mathematics that’s plausibly, but as yet speculatively, related to advancing the cause of AI safety, or funding someone to write fiction that draws attention to key problems in the field.

I basically tried to come up with an ontology that would make intuitive sense to the average donor, and then tried to address the shortcomings by using examples on our risk page. I agree with Oli that it doesn’t fully capture things, but I think it’s a reasonable attempt to capture an important sentiment (albeit in a very reductive way), especially for donors who are newer to the product and to EA. That said, everyone will have their own sense of what they consider too risky, which is why we encourage donors to read through past grant reports and see how comfortable they feel before donating.

The conversation with Oli above about ‘risk of abuse’ being an important dimension is interesting, and I’ll think about rewriting parts of the page to account for different framings of risk.
