If you search the forum for the EAIF tag you can get some more details on past grants.  I'm not sure if this gives you quite what you're looking for, or not.


The reading-time estimates on LessWrong crossposts seem to be wrong.  For example, this one says 1 minute but should be 5-10 minutes (I would guess):


Answer by Tyner · Sep 23, 2022

Doesn't really make sense to me and would lead to some very weird conclusions.  

For example: I'm a manager, and one of my staff comes to me and says, "This report takes a very long time because there are many manual steps.  I believe we could automate these steps using software X, which I've used in a previous role and costs $Y."  By the logic of this maxim, I should ignore the proposed solution AND ignore the initial complaint.

Because they proposed a solution, I should now think it less likely that the report takes a very long time?  Seems totally nonsensical (or I'm not understanding what you're actually saying).

The discussion on Erik Hoel's piece is here:


>a monthly feature of "humans of EA", showing a wide range of people

really like this idea

Hi Fai, I agree with whoever encouraged you to post more.  I always enjoy and appreciate your stuff even when we don't 100% agree. 

The passage below is difficult to parse.  What do you actually mean?  That it was for economic reasons, that it was not for economic reasons, or something else entirely?

>Well, I personally did not have much hope in humanity's moral progress, until I recently got moderately convinced that it’s less likely than not that we abolished slavery mainly for economic reasons. And in case you think that it is impossible to have moral progress without economic reasons. I tend to disagree, and Will Macaskill also. He wrote in What We Owe The Future that the view that it was economic incentives caused by new technologies that cause slavery to be abolished, is now out of fashion in academia. He thinks that it was pretty much the triumph of the abolitionists. So there's a reason to think that moral progress is a genuine alternative to technologically forced social progress.


Expanding our exploitation of animals is a moral step backward.  This does not seem like the kind of project EA people or organizations should be supporting.

>100 such ideas

here's another with the same vibes

