
kierangreig

Chief Strategy Analyst @ Rethink Priorities
1605 karma · Joined Feb 2015

Bio

Kieran Greig is the Chief Strategy Analyst at Rethink Priorities. He works with the Leadership Team to advise on Rethink Priorities’ strategy and execution at all levels. Prior to that, he was the Director of Research for Farmed Animal Funders, a group of large donors who each give over $250,000 annually to end factory farming. He previously worked as a researcher at Animal Charity Evaluators, and before that was a co-founder of Charity Entrepreneurship and Charity Science Health. He has written about topics like improving the welfare of farmed fish and supporting plant-based alternatives to animal products. He has a B.Sc. from Monash University and a Master’s from La Trobe University.

Posts: 24

Comments: 116

Sure. Before doing that, a couple of quick notes. First, I think it takes a while for grants to mature and impact to play out, so it is difficult to judge at this point which were the biggest hits from the past year. Second, there are some grants that I have a COI with and that I think may have been hits from 2022, but I nonetheless won’t list them here. Third, as some further background context, the general categories of grants that I am most excited by are early-stage support to aligned groups, groups working on neglected animals, or groups working in neglected places.

Then being quite brief, some hits that come to mind for me: 

  • Our early 2022 grant of $185,000 to the team at EA Singapore to run a capacity-building fellowship throughout SE Asia. They have run multiple iterations of a 14-week fellowship, have had tens of people go through it, and have partnered with tens of organizations on it. They have since been renamed Welfare Matters and secured a $1M grant from Open Phil earlier this year. In my view, the movement basically went from not really having a very promising movement-building option for much of SE Asia (which is a critical area for the movement[1]) to having one in the space of 1-2 years, in large part because of the EA AWF. 
     
  • Our early 2022 grant of $45,000 to the Shrimp Welfare Project (SWP). I think our support of SWP was quite important to their growth, and they went on to secure larger grants from us and to become ACE-recommended. In my view, the movement went from not really having a promising option for addressing shrimp welfare to having one in the space of 1-2 years, in large part because of the EA AWF. And by SWP’s estimates, they already now (in expectation) help 1 billion shrimps per annum.    
     
  • Various grants we have made to support chicken welfare campaigns internationally seem to have been quite good investments in my view (e.g., Sinergia Animal and Çiftlik Hayvanlarını Koruma Derneği).  
     

I hope that gives some more sense of the potential upside, but feel free to follow up further. Also, as noted in response to a different question, there is variance in views among fund managers. Here I am reporting my own relatively quickly formed views, while others on the fund may have slightly to somewhat different views.    

  1. ^

    Throughout much of SE Asia there's little to no organized effective animal advocacy. Many of these countries are highly populous and some of the largest animal product producers. Somewhat dated but see p.16 re: total number of farmed animals. 

Hi David, thanks for engaging! Responding to your questions below. 

  1. Yes, sometimes we do this if we think that some opportunity is a particularly good fit for a different funder. Yeah, I would say applicants usually remain active long enough to reapply in the future. It seems rare to me that our funding is the deciding factor for their continued existence. 
  2. I can’t easily pull that information, and I think it depends on the year, but my very rough guess is that between 3% and 30% of funds raised in any given year of the fund’s existence so far were crypto-related. Note the variance too: in some years we don’t receive any major crypto-associated donations, while in other years our largest donations are crypto-related. When I think about fluctuations for this year or next, I am thinking more about, as Luke put it in one comment:

One thing that I've found when speaking with pledgers/donors and other people in the charitable sector is that economic conditions are also a very important part of the story. This has held since early 2022 when the dollar value of large donations dropped when financial markets and crypto markets declined (our larger donors tend to be more likely to have made money in tech or finance instead of other industries). 

  3. As far as I know, this has never happened. 

Hope that helps. 

In terms of the present funding allocation, it is much more focused on farmed than wild animals. An important factor contributing to that is that there are very few opportunities we can support on the wild animal side at this point. The promising opportunities for wild animals that exist now receive funding from us and are some of our bigger grantees. But there’s only so far we can go with research there, and we haven’t yet identified a promising wild animal welfare intervention that groups could implement. That contributes to there being significantly fewer grantmaking opportunities on that side of things right now.  

In contrast, on farmed animals there do seem to be various grants that can be made now around a) implementing promising interventions, b) coordinating around promising interventions, or c) building up the field in order to do more promising interventions later. At this point, such opportunities don’t seem to exist to nearly the same extent on the wild animal side.    

Also note that there is in general some variance in views among fund managers, and that adds some nuance to describing overall views on this and other questions related to the “fund’s current views.” Here I am reporting my own relatively quickly formed views, while others on the fund may have slightly to somewhat different views.   

Thanks for the question, Vasco :) 

Is it possible to donate specifically to a single area of RP?

Yes. Donors can restrict their donations to RP. When making the donation, the donor should just mention what restriction applies to it, and we will then restrict those funds to that use in our accounting.

If yes, to what extent would the donation be fungible with donations to other areas?

The only way this would be fungible is if it changes how we allocate unrestricted money. Based on our current plans, this would not happen for donations to our animal welfare or longtermist work, but it could happen for donations to other areas. If this is a concern for you, please flag it and we can increase the budget for the area by the size of your donation, thus fully eliminating fungibility concerns.

We take donor preferences very seriously and do not think fungibility concerns should be a barrier to giving to RP. That being said, we do appreciate those who trust us to allocate money to where we think it is needed most.
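To make the fungibility mechanics above concrete, here is a minimal sketch with purely hypothetical figures (these are not RP’s actual budget numbers): a restricted donation only becomes fungible if it displaces planned unrestricted allocation, and raising the area’s budget by the donation amount removes that effect.

```python
# Hypothetical illustration only: all figures are made up.
planned_unrestricted_to_area = 600_000   # unrestricted money planned for the area
restricted_donation = 100_000            # new donation restricted to that area

# Fungible case: the restricted donation displaces planned unrestricted money,
# so the area's total budget is unchanged and unrestricted funds are freed up.
fungible_total = (planned_unrestricted_to_area - restricted_donation) + restricted_donation

# Non-fungible case: the area's budget is increased by the donation amount,
# so the planned unrestricted allocation stays in place.
non_fungible_total = planned_unrestricted_to_area + restricted_donation

print(f"Area budget if the donation displaces unrestricted funds: ${fungible_total:,}")
print(f"Area budget if the budget is raised by the donation:      ${non_fungible_total:,}")
```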

Is AWF considering hiring a fundraiser to help fill this funding gap?


No, we are not considering hiring strictly a fundraiser at this point. However, we are interested in adding other positions that could contribute to fundraising (as well as, importantly, contributing in other ways). 

Specifically as mentioned in the post:

We also have some plans for significant growth next year through some internal expansion plans in the works (e.g., possibly adding further fund managers, hopefully at least one who is full-time, and doing more active grantmaking). 

To that end, we recently posted a call for guest fund managers. In addition to that, I am hopeful we will also undertake some further recruitment efforts next year. 

Sadly, I don't think that approach is correct. The 5th percentile of a product of random variables is not the product of the 5th percentiles; in fact, in general, it's going to be a product of much higher percentiles (20+).

As something of an aside, I think this general point was demonstrated and visualised well here.
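As a quick illustration of the percentile point (my own sketch, not taken from the linked post), here is a small Monte Carlo simulation assuming four hypothetical independent lognormal factors; the distributional choice and parameters are illustrative, not anyone’s actual model:

```python
# Minimal sketch: the 5th percentile of a product of random variables is not
# the product of the 5th percentiles; it matches much higher per-factor percentiles.
import numpy as np

rng = np.random.default_rng(0)
n_factors, n_samples = 4, 1_000_000

# Hypothetical model: impact = product of 4 uncertain lognormal factors.
factors = rng.lognormal(mean=0.0, sigma=1.0, size=(n_factors, n_samples))
product = factors.prod(axis=0)

p5_of_product = np.percentile(product, 5)
product_of_p5s = np.prod([np.percentile(f, 5) for f in factors])

print(f"5th percentile of the product:  {p5_of_product:.4f}")
print(f"Product of the 5th percentiles: {product_of_p5s:.4f}")

# Which per-factor percentile, multiplied across all 4 factors, reproduces the
# product's 5th percentile? (Roughly the 20th percentile here, not the 5th.)
per_factor_value = p5_of_product ** (1 / n_factors)
implied_percentile = (factors[0] < per_factor_value).mean() * 100
print(f"Implied per-factor percentile:  {implied_percentile:.0f}th")
```

Under these assumptions the product of the 5th percentiles is far smaller than the product’s true 5th percentile, which corresponds to roughly the 20th percentile of each factor.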

Disclaimer: I work at RP so may be biased.  

It still depends somewhat on how fundraising goes, but it's pretty likely that in 2024 Rethink Priorities' budget (excluding a number of groups that we fiscally sponsor) will be around $11M. 

I think that the specific extrapolation of our budget completed here was importantly off because we made a number of hires over the course of 2022, so the reported spend for that year didn't fully capture the total recurring costs of the new headcount (as those new hires started at various points throughout that year). 
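To make the annualization point concrete, here is a minimal sketch with purely hypothetical numbers (not RP’s actual figures) showing how mid-year hires make a year’s reported spend understate the recurring run rate:

```python
# Hypothetical illustration only: figures are made up, not Rethink Priorities' actuals.
# Each hire has an annual cost and a start month (1-12) within the year.
hires = [
    {"annual_cost": 120_000, "start_month": 3},
    {"annual_cost": 100_000, "start_month": 7},
    {"annual_cost": 110_000, "start_month": 10},
]

baseline_spend = 5_000_000  # hypothetical pre-existing annual costs

# Spend actually reported for the year: each hire is only paid from their start month.
reported = baseline_spend + sum(
    h["annual_cost"] * (13 - h["start_month"]) / 12 for h in hires
)

# Recurring run rate going into the next year: every hire costs a full year.
run_rate = baseline_spend + sum(h["annual_cost"] for h in hires)

print(f"Reported spend for the year: ${reported:,.0f}")
print(f"Full-year run rate:          ${run_rate:,.0f}")
# Extrapolating from reported spend alone understates next year's recurring costs.
```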

Thanks for your engagement! 

Yes, for instance, as mentioned in the appendix, some non-fictitious examples for Global Health and Development are: 

We produced numerous research reports for Open Phil assessing the potential of global health and development interventions, looking for interventions that could be as cost-effective as, or more cost-effective than, the ones currently ranked top by GiveWell. This included full reports on the following:

  • The effectiveness of large cash prizes in spurring innovation (the report was also shared with FTX Future Fund, and another large foundation).
  • The badness of a year of life lost vs. a year of severe depression.  
  • Scientific research capacity in sub-Saharan Africa.
  • The landscape of climate change philanthropy.
  • Energy frontier growth (this report explores several of the key considerations for quantifying the potential economic growth benefits of clean energy R&D).
  • Funding gaps and bottlenecks to the deployment of carbon capture, utilization, and storage technologies.
  • A literature review on damage functions of integrated assessment models in climate change.
  • A confidential project that we won’t give further details on.
  • Detailing the World Health Organization’s prequalification process for medicines, vaccines, diagnostics, and vector control, as well as the potential impact of additional funding in this area.
  • Describing the World Health Organization’s Essential Medicines List and the potential impact of additional funding in this area.
  • Whether Open Phil should make a major set of grants to establish better weather forecasting data availability in low- and middle-income countries (LMICs).
  • Further examination of hypertension, including its scale and plausible areas where a philanthropist could make a difference.    

And for AI Governance and Strategy, some examples include the following: 

Ongoing projects include the following: (Note: this list isn’t comprehensive and some of these will soon result in public outputs.)

  • Developing what’s intended to be a comprehensive database of AI policy proposals that could be implemented by the US government in the near- or medium-term. This database is intended to capture information on these proposals’ expected impacts, their levels of consensus within longtermist circles, and how they could be implemented.
  • Planning another Long-term AI Strategy Retreat for 2023, and potentially some smaller AI strategy events.
  • Thinking about what the leadup to transformative AI will look like, and how to generate economic and policy implications from technical people’s expectations of AI capabilities growth.
  • Mentoring AI strategy projects by promising people outside of our team who are interested in testing and building their fit for AI governance and strategy work.
  • Preparing a report on the character of AI diffusion: how fast and by what mechanisms AI technologies spread, what strategic implications that has (e.g. for AI race dynamics), and what interventions could be pursued to influence diffusion.
  • Surveying experts on intermediate goals for AI governance.
  • Investigating the tractability of bringing about international agreements to promote AI safety and the best means of doing so, focusing particularly on agreements that include both the US and China.
  • Investigating possible mechanisms for monitoring and restricting possession or use of AI-relevant chips.
  • Assessing the potential value of an AI safety bounty program, which would reward people who identify safety issues in a specified AI system.
  • Writing a report on “Defense in Depth against Catastrophic AI Incidents,” which makes a case for mainstream corporate and policy actors to care about safety/security-related AI risks, and lays out a “toolkit” of 15-20 interventions that they can use to improve the design, security, and governance of high-stakes AI systems.
  • Experimenting with using expert networks for EA-aligned research.
  • Trying to create/improve pipelines for causing mainstream think tanks to do valuable longtermism-aligned research projects, e.g. via identifying and scoping fitting research projects.