One reason highly useful projects don't get discovered quickly is that they sit in underexplored spaces. Certain areas are systematically underexplored because of biases in people's search heuristics. Several examples of such biases are:
1) Schlep blindness: a term coined by Paul Graham; difficult, tedious projects are underexplored.
2) Low-status blindness: projects not predicted to bring the project lead prestige are underexplored.
3) High-variance blindness: projects that are unlikely to succeed but have a positive expected value anyway are underexplored.
4) Already invented blindness: projects that cover areas that have already been explored by others are assumed to have been competently explored.
5) Not obviously scalable blindness: projects without an obvious route to scaling are underexplored.
Are there other biases in what EAs pay attention to?
I believe this list is useful because a project checking many of these boxes is *some* evidence that it is worth pursuing: few others will be interested in it, giving you a comparative advantage.
Several of these might be summed up under the heading "high risk." There is a notion that high-risk projects are exactly what philanthropy (as opposed to governments) ought to be funding.
One area I think hits many of these: global income inequality.
I don't blame governments for not pursuing such things. But I've never thought of philanthropy, or of how others think of philanthropy, as being about pursuing high-risk altruism. I've always thought of philanthropy as wealthy people with big hearts trying to help others in a way that tugs at their heartstrings: patronizing something they're passionate about, such as research to cure a particular disease or works of fine art they enjoy, or giving to signal their magnanimity, i.e., giving for the sake of conspicuity.
How common do you think this notion is that philanthropy...