Note: I have since learned that there have been detailed expert reviews of Founders Pledge's work in the past; see Johannes Ackva's comment below. Also, despite the focus on Founders Pledge, I don't think they are especially under-reviewed. More generally, I don't mean to criticise FP - I just wanted to share this argument and see what people think about it.
Last time I checked, I couldn't find any in-depth expert reviews of Founders Pledge's cost-effectiveness estimates. I know that there isn't one for their evaluation of the Clean Air Task Force (CATF). GivingGreen and SoGive have looked at some of them, but not in depth (they told me this in correspondence). So, assuming I didn't miss anything, there are two possibilities:
(i) they didn't have any such reviews, or
(ii) they had reviews, but didn't publish them
Let's first assume (i) is the case.
There seem to be large potential benefits to getting reviewed. If such an expert found that FP's estimates are significantly off, that would be valuable information, because it might lead donors to change how much they give and where. If FP underestimated, say, CATF's cost-effectiveness, donors might shift funding from less effective opportunities towards CATF; if FP overestimated its cost-effectiveness, the reverse might happen. Either way, if an expert review uncovers a sufficiently large error, it is not implausible that this would improve large funding decisions.
If, however, such an expert verifies FP's models, that is valuable information too. In that case, their research seems much more trustworthy from the outside, which plausibly attracts more donors. This is especially true for cost-effectiveness estimates that seem spectacular. Take the claim that CATF averts 1 ton of CO2e for less than a dollar: many people outside of EA that I talked to were initially skeptical of this. (I'm not making any claim as to the reasonableness of that estimate, merely commenting on its public perception.)
So there seem to be large potential benefits to getting an expert review for organisations like Founders Pledge (everything I'm saying here may well apply to many other, similar organisations that I'm not as familiar with).
The remaining question is then: do the expected benefits of an independent analysis justify its costs?
I assume that you can hire an expert researcher for less than $100/hour and that such an analysis would take less than 4 full work weeks. At 40 hours/week, the whole thing would cost less than $16,000. That seems unrealistically high, but let's assume it's not.
Given that tens to hundreds of millions of dollars are allocated on the basis of their recommendations, it still seems worth it, doesn't it?
Edit: I have probably overestimated this amount quite a lot; see Linch's comment.
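To make the cost-benefit comparison concrete, here's a minimal back-of-the-envelope sketch in Python. All the inputs (hourly rate, review length, funds moved, and the probability and size of a material error) are illustrative assumptions I've picked for the example, not figures from FP or anyone else:

```python
# Back-of-the-envelope: is commissioning an expert review worth it?
# All inputs below are illustrative assumptions.

hourly_rate = 100                    # $/hour for an expert reviewer (assumed upper bound)
hours = 4 * 40                       # four full work weeks at 40 hours/week
review_cost = hourly_rate * hours    # = $16,000

funds_moved = 50_000_000     # $ allocated partly on the evaluator's recommendations (assumed)
p_material_error = 0.10      # assumed chance the review finds an error big enough to shift donations
improvement = 0.01           # assumed fraction of funds_moved redirected to better use if it does

expected_benefit = p_material_error * improvement * funds_moved

print(f"Review cost:      ${review_cost:,}")
print(f"Expected benefit: ${expected_benefit:,.0f}")
print("Worth it?", expected_benefit > review_cost)
```

Even with what feel like conservative numbers, the expected benefit ($50,000 here) exceeds the $16,000 cost - though of course the conclusion is only as good as the assumed inputs.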
Now let's assume that (ii) is the case - they had independent expert reviews but didn't publish them. In that case, for exactly the reasons given above, it would be important to make them public.
What do you make of this argument?
Gotcha
I mean, how long is a piece of string? :) The way I did my reviewing was to check the major assumptions and calculations and see if those made sense. But where a report, say, took information from academic studies, I wouldn't necessarily delve into those or see if they had been interpreted correctly.
Re making things public, that's a bit trickier than it sounds. Usually I'd leave a bunch of comments in a Google Doc as I went, which wouldn't be that easy for a reader to follow. You could ask someone to write a prose evaluation - basically like an academic journal review report - but that's quite a lot more effort and not something I've been asked to do.
At HLI, we have asked external academics to do that for a couple of pieces of work, and we recognise it's quite a big ask vs just leaving Google Doc comments. The people we asked were gracious enough to do it, but they were basically doing us a favour and it's not something we could keep doing (at least with those individuals). I guess one could make them public - we've offered to share ours with donors, but none have asked to see them - but there's something a bit weird about it: it's like you're sending the message "you shouldn't take our word for it, but there's this academic who we've chosen and paid to evaluate us - take their word for it".