MichaelPlant


Some unfun lessons I learned as a junior grantmaker

This got me wondering: how much agreement is there between grantmakers (assuming they already share some broad philosophical assumptions)?

Because, if the top grants are much better than the marginal grants, and grantmakers would agree on which those are, then you could replace the 'extremely busy' grantmakers with less busy ones. The less busy ones would award approximately the same grants but have more time to investigate marginal applications and give feedback.

I'm concerned about the scenario where (nearly) all grantmakers are too busy to give feedback and applicants don't improve their projects.

[Linkpost] Towards Ineffective Altruism

On a slightly uncharitable side note, I find it amusing that not long ago we were being criticised for being overly obsessed with what could be measured, and told we should be more open to the value of systemic change and the like.

To be charitable to EA's detractors, it's very possible these are criticisms coming from different people! Some people will be more worried about measurable outcomes, others about systemic change. If EA is getting both kinds of criticisms then it's probably doing better than if it's only getting one type!

"Big tent" effective altruism is very important (particularly right now)

Hello Max,

In turn, I strongly downvoted your post.

Luke raised, you say, some "important issues". However, you didn't engage with the substance of those issues. Instead, you complained that he hadn't adequately separated them even though, for my money, they are substantially related. I wouldn't have minded that if you'd then gone on to offer your thoughts on how EA should operate on each of the dimensions you listed, but you did not.

Given this, your comment struck me as unacceptably dismissive, particularly given you are the CEO of CEA. The message it conveys is something like "I will only listen to your concerns if you present them exactly in the format I want" which, again for my money, is not a good message to send.

Most students who would agree with EA ideas haven't heard of EA yet (results of a large-scale survey)

Yes: lots of people agree with EA in principle. But, of those, very few are motivated to do anything. As a suggestion for future research, could you look for what might predict serious commitment in later life?

FWIW, my hunch is that motivation to be altruistic is not normally distributed, but perhaps even approaching bi-modal: a few people are prepared to dedicate their lives to helping others, but most people will only help if the costs to them are very low.

Deferring

Okay, well, just to report that what you said by way of clarification was reassuring, but not what I picked up originally from your post! I agree with Vaidehi below that one issue was a lack of specificity, which led me to read it as a pretty general comment.

Reading your other comments, it seems what you're getting at is a distinction between trusting that someone is right without understanding why vs just following their instructions. I agree that there's something there: to e.g. run an organisation, it's sometimes impractical or unnecessary to convince someone of your entire worldview rather than just asking them to do something.

FWIW, what I see lots of in EA, what worries me, and what I was hoping your post would be about, is that people defer so strongly to community leaders that they refuse to even engage with object-level arguments against whatever it is those leaders believe. To draw from a personal example, quite often when I talk about measuring wellbeing, people will listen and then say something to the effect of "what you say seems plausible, I can't think of any objections, but I'm going to defer to GiveWell anyway". Deferring may have a time and a place, but presumably we don't want deference to this extent.

Deferring

"Edit: wow why is Michael getting downvoted though, wtf?"

Perhaps people didn't like the cult-ish comparison? But criticising someone for saying that something feels cult-ish is, um, well, pretty cult-ish...

Or perhaps it's people who can't properly distinguish between "criticising because you care and want to improve something" and "criticising to be mean" and mistakenly assume I'm doing the latter (despite my strenuous attempts to make it clear I am doing the former). 

Deferring

[Writing in a personal capacity, etc.]

I found this post tone-deaf, indeed chilling, when I read it, in light of the current dynamics of the EA movement. I think it's the combination of:

(1) lots of money appearing in EA (with the recognition that this might be a big problem for optics and epistemics, and that there are already 'bad omens')

(2) the central bits of EA seeming to obviously push an agenda (EA being 'just longtermism' now, with CEA's CEO, Max Dalton, indicating their content will be "70-80% longtermism", and CEA's Julia Wise suggesting people shouldn't talk to high-net-worth donors themselves, but should funnel them towards LongView)

(3) this post then saying people should defer to authority.

Taken in isolation, these are somewhat concerning. Taken together, they start to look frightening - of the flavour, "join our elite society, see the truth, obey your leaders".  

I am pretty sure anyone reading this will agree that this is not how we want EA either to be or to be perceived to be. However, things do seem to be moving in that direction, and I don't think this post helped - sorry, Owen, I am sure you wrote it with the best of intentions. But the road to hell, pavements, etc. 

EA and the current funding situation

Will, thanks very much for writing this. It's great to be having this discussion and to see that the major players are thinking hard about this. I wanted to raise a couple of issues that merit reflection but haven't (AFAICT) been brought up so far.

You note that EA has gone from a few guys in a basement to commanding serious funding. But, what might the future of EA be? Where could it be in another 10 years? There could be 10x, or even 100x, of relevant funding. In line with the idea of judicious ambition, how should we be planning for it? Who should be planning for it?

Related to this, how much, and what type, of centralisation and governance are optimal across the various bits of the movement? One thing that strikes me is that 'EA resources' are very centralised: there are only a few major donors, advisors to those donors, and leaders of key organisations, and all those people know each other. What's more, lots of decision-making happens privately. All of this clearly has some major advantages, such as speed and coordination; it's appropriate, given it's about private individuals spending their money; it's also pretty unsurprising that this has happened because EA started so recently.

But, as EA 'grows up', should it transition to operating in some different ways? Some of the risks you flag - reduction in quality of thought, resentment, and the loss of evolutionary forces - seem to stem, at least in part, from this dynamic.

What would the ideal structure be? If I do a Bostromian-Ordian 'reversal test', I wouldn't want to see all 'EA resources' and decision-making concentrated in the hands of one person, no matter who it was. I'm not sure how far in the other direction would be best, but it seems worth reflecting on.

EA and the current funding situation

Not exactly the same thing, but there was a whole post and discussion on whether EA is "just longtermism" last week. 

EA is more than longtermism

It's possibly worth flagging that these are (sadly) quite long-running issues. I wrote an EA Forum post, now 5 years ago, on the 'marketing gap': the tension between what EA organisations present EA as being about and what those organisations believe it should be about, arguing they should be more 'morally inclusive'. By 'morally inclusive', I mean welcoming and representing the various different ways of doing the most good that thoughtful, dedicated individuals have proposed.

This gap has since closed a bit, although not always in the way I hoped for, i.e. through greater transparency and inclusiveness. As two examples, GWWC has been spun off from CEA, rebooted, and now does seem to be cause neutral; 80k is much more openly longtermist.

I recognise this is a challenging issue, but I still think the right solution to this is for the more central EA organisations to actually try hard to be morally inclusive. I've been really impressed at how well GWWC seem to be doing this. I think it's worth doing this for the same reasons I gave in that (now ancient) blogpost: it reduces groupthink, increases movement size, and reduces infighting. If people truly felt like EA was morally inclusive, I don't think this post, or any of these comments (including this one) would have been written.
