Edit: Thanks to the generosity of our donors, the matching funding has now been used, and we've reached our target!

With just a couple of days to go in our spring fundraiser, we are still £40,000 off our target. We really need your help to reach our goal, so that we can make concrete plans for our future and get back to our main job: getting more people to take the pledge to give.

You can still double the effect of your donation: there is still £10,000 of matching funding for new donors, but it has to be before July!


Over 2014 our membership doubled. Over 2015, using the money we're currently trying to raise, we aim to double again. We plan to do this by focusing on chapter growth and by capitalising on the media attention around Effective Altruism over the summer, to make sure that it results in concrete impact.

 

It’s pretty amazing how much we can each do individually by donating 10% of our incomes to the most effective causes. Someone on the median US wage could save a person's life every year! But if we could make it the norm in the developed world, we really could end extreme poverty around the world. Hopefully this summer will be the start of making that happen!
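For anyone who wants to see the rough arithmetic behind that claim, here is a minimal sketch. The wage and cost-per-life figures below are illustrative assumptions for the sake of the example, not official GWWC estimates:

```python
# Back-of-the-envelope check of the "save a life every year" claim.
# All figures are illustrative assumptions, not GWWC's official numbers.
median_us_wage = 50_000       # assumed annual income in USD
pledge_fraction = 0.10        # the 10% pledge
cost_per_life_saved = 3_500   # assumed cost to save a life via a top charity, USD

annual_donation = median_us_wage * pledge_fraction
lives_saved_per_year = annual_donation / cost_per_life_saved

print(f"Annual donation: ${annual_donation:,.0f}")          # $5,000
print(f"Lives saved per year: {lives_saved_per_year:.1f}")  # ~1.4
```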

Hey Michelle, did you release the cohort data I suggested donors might require to evaluate GWWC?

As far as I know, not all of the data you were asking about has been released, but some of it has and is in the impact assessment (notably, 1.7% is something like an average annual rate of members leaving, and 4.7% the equivalent for members losing touch with GWWC).

Thanks Owen.

My understanding was GWWC said that 1.7% was the average annual rate for members who admitted they were leaving, with an additional 4.7% of members not responding for two years. Unfortunately, these are not quite the numbers we want, and are likely to be too optimistic for a few reasons:

  • These numbers don't include people who claim to be donating but are not. (Does GWWC attempt to verify donations?)
  • They also don't include any sense of cohort differentiation - how do the early members compare to the later members? Is this number skewed by all of Dr Ord's friends who joined early on?
  • Nor did they break down students donating 1% vs real people donating 10%. The latter are much more valuable, but I would also expect a significant number of people to drop out when they switch from the relatively undemanding student pledge to the full 10% pledge. As such, if the current mass of members is very student-heavy, high historical retention rates may not generalize once students graduate and become full members.
  • How does the dropout curve change for an individual cohort? Do most of the people leaving leave in the early years of the pledge (good), or does the rate of drop-out increase over time (bad)?

Aside from this optimistic bias, there are a few other reasons to want cohort data:

  • It would be nice to be able to reconcile these very low drop-out rates with the very high ones GWWC published a few years ago here, which showed only 65-70% claimed retention after 0-2 years. Right now it seems hard to understand how these are consistent.
  • A 2-year grace period is very long. In finance we call a loan non-performing if it is 90 days behind! Worse, GWWC is young and has seen exponential growth, so a 2-year wait period means no data on anyone who joined since June 2013.
  • A few people have expressed concerns to me, both publicly and privately, that GWWC has discovered some very negative facts about their membership. Standardized disclosure, in a form chosen by a third party, can go a long way towards dispelling these fears. This is why public companies have to report in accordance with US GAAP, rather than getting to choose their own metrics.
  • Finally, it would be nice to have some of the mundane, technical details as well. How was this average calculated? etc. With cohort data we don't need to speculate and argue about the virtues of different ways of computing hazard rates; we can just do our own calculations (a minimal sketch of one such calculation follows this list).
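As an illustration of the kind of calculation cohort data would allow, here is a minimal sketch. The cohort sizes and retention counts below are invented purely for the example; they are not GWWC figures:

```python
# Minimal sketch of per-cohort annual drop-out (hazard) and cumulative retention.
# All numbers are invented for illustration; they are NOT GWWC data.
cohorts = {
    # join_year: members still active at the end of each subsequent year,
    # starting with the initial cohort size
    2010: [30, 29, 28, 27, 26],
    2011: [70, 67, 65, 63],
    2012: [150, 143, 138],
}

for join_year, active in cohorts.items():
    initial = active[0]
    for year in range(1, len(active)):
        hazard = 1 - active[year] / active[year - 1]  # drop-out rate in that year
        retention = active[year] / initial            # cumulative retention since joining
        print(f"{join_year} cohort, year {year}: "
              f"drop-out {hazard:.1%}, retention {retention:.1%}")
```

With per-cohort series like these, one can check directly whether drop-out is concentrated in the early years or rises over time, and whether early cohorts behave differently from later ones.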

Agree that the cohort data looks helpful for the reasons you mention (presuming a careful look at any privacy issues with releasing it checked out). I'll respond to a couple of the points:

(Does GWWC attempt to verify donations?)

I don't think so, and I guess this is probably the right move despite the costs of lacking verification (aside from the obvious costs of data gathering, I'm worried about the subtle costs of making giving for the pledge feel more like an obligation than an opportunity, turning people off and leading to fewer donations and fewer people encouraging others to join).

It would be nice to be able to reconcile these very low drop-out rates with the very high ones GWWC published a few years ago here, which showed only 65-70% claimed retention after 0-2 years. Right now it seems hard to understand how these are consistent.

I think what's going on here is that the 65-70% figures were mostly response rates on a survey of members. More recently GWWC has gone to rather more trouble to follow up with members who don't initially answer the survey, in order to get better data (as it was unclear how many of the 30-35% non-respondents were still donating).

A 2-year grace period is very long. In finance we call a loan non-performing if it is 90 days behind! Worse, GWWC is young and has seen exponential growth, so a 2-year wait period means no data on anyone who joined since June 2013.

I see the sense in which it feels long. On the other hand, as the frequency of asking for responses from members is annual, this amounts to "missed two cycles". Counting it as dropping out after one missed cycle seems like it might catch too many people. This one seems pretty thorny to me, since more frequent data would be quite useful for analytics, but would on the other hand impose rather larger burdens on both GWWC staff and members. And to the extent that what we care about is long-term donation behaviour, we may just need to wait for data.

Disclaimers: I work for the Global Priorities Project, part of the same umbrella organisation (CEA) as GWWC, and I was acting director of research at GWWC during 2014. I don't know all the operational details, though.

Thanks for your interest, Dale, and Owen for responding so thoroughly. I overall agree that there is a lot of information it would be nice to get at, although the numbers are somewhat small if we're trying to work out the difference between, say, the 2009 cohort and the others (since there were only around 30 members in 2009). Now that I can spend less time on fundraising, I'll try to put together a post about this. Just to add a couple of points to what Owen has said:

On the question of students moving to earning - that was definitely a time we were worried people would drop out. The data doesn't seem to suggest that's the case though - so far it seems that people do tend to actually start giving 10% when they start earning.

On verifying donations - we have in the past compared AMF's data on donors with member self-reporting. While the self-reporting was far from perfect, people were at least as often under-reporting as over-reporting. (And most often discrepancies simply turned out to be mistakes about when the person had donated.) For the reasons Owen mentions, we did this once (to get a sense of whether reporting was approximately accurate) but we aren't planning to do it again.

On the 2009 cohort - would it make sense to bucket it with the 2010 cohort? (So treating the first 14 months of GWWC as one cohort, with annual cohorts thereafter.)

It's worth reassuring people that even if the full goal isn't met it isn't a disaster - there isn't a funding gap for keeping GWWC itself going, which could presumably be done quite cheaply. I know there's a perception that this fundraising round has been a struggle, and there's been a lot of scepticism about it (e.g. [here](https://www.facebook.com/groups/effective.altruists/permalink/882996591756699/)). But that isn't that damning: it was bound to happen at some point once GWWC asked for more money to fund more paid employees, rather than keeping going until GWWC got as much money as the people who've signed its pledge are giving.

It's worth reassuring people that even if the full goal isn't met it isn't a disaster - there isn't a funding gap for keeping GWWC itself going, which could presumably be done quite cheaply.

I agree that it's important to understand that not reaching the goal doesn't mean collapse. I do think opportunity cost on growth opportunities could be quite high, though -- it's not clear that there will be any marginal opportunities for movement growth this effective in a few years' time (my thinking on this is here on general timing of giving and here on how to think about the value of marginal movement growth).

... it was bound to happen at some point once GWWC asked for more money to fund more paid employees, rather than keeping going until GWWC got as much money as the people who've signed its pledge are giving.

Interestingly, it sounds like you're thinking of GWWC expanding in terms of the slice of pledged donations it's consuming. It could be nice to work out the numbers on this, but using remembered figures I think it's approximately constant (i.e. GWWC operations are expanding proportionally with members), and of the order of 10% of current donation flows / <1% of the flow of increase of pledges.

Thanks for the response. You're right, the relevant question isn't keeping GWWC going but whether there are promising new growth opportunities that the extra funding will pay for, which you say there are this year. There's also some neatness to GWWC being able to support/grow its staff with 10% of the total yearly donations of the people who pledge 10% via it. I haven't thought about whether that's the appropriate model, but it provides a way to pace growth rather than keeping going until you've matched their total donations.

How would one tell the difference between extra members who came from "capitalising on the media attention around Effective Altruism over the summer", and 10% donors who simply got rustled up by this attention? Has GWWC publicly advertised conditions under which the money spent on this wouldn't have been worthwhile and shouldn't have been diverted to it?

How would one tell the difference between extra members who came from "capitalising on the media attention around Effective Altruism over the summer", and 10% donors who simply got rustled up by this attention?

Joining members are asked what they would likely have given had they not joined. This is quite a noisy process, since people's estimates of what they would have given will often be inaccurate, but I think it works as a first-order correction (and it's hard to see how to do better). This is factored into the impact assessment.
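In case it helps to see what that first-order correction amounts to, here is a minimal sketch: pledged giving minus self-reported counterfactual giving, summed over members. The figures are invented for illustration, and the structure is an assumption rather than GWWC's actual impact model:

```python
# Minimal sketch of a first-order counterfactual correction.
# Invented figures; this is NOT GWWC's actual impact model.
members = [
    # (expected annual donations under the pledge, self-reported counterfactual donations)
    (5_000, 1_000),
    (3_000, 3_000),   # would have given the same anyway: no counterfactual impact
    (8_000, 500),
]

counterfactual_impact = sum(pledged - would_have_given
                            for pledged, would_have_given in members)
print(f"Counterfactual annual impact: ${counterfactual_impact:,}")  # $11,500
```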

Oh, I meant how you distinguish between people who signed up to the pledge after seeing GWWC mentioned in the media attention or book (or elsewhere), and people who were a result of the efforts to capitalise on this that EA donors are funding. On the question you answered, I agree: I can't think of any better (or other) data to get about individual pledgers, and the only thing to compare it to is an overall estimate of the extra donations a pledge could lead to.

Sorry for the misunderstanding! Yeah, it looks kind of hard to distinguish. Maybe someone has a clever method. The only thing I can immediately think of is an RCT on follow-up to different bits of media coverage, but I expect this would be super-messy to run and might not produce great data.

Congrats on using up the matching funding!
