Have you asked GPI and FHI's macrostrategy team whether they have suggestions for kinds of prioritization research (if any) that you could usefully do? This is a difficult kind of research to do, and LEAN/SHIC/Peter don't have a track record of generating important prioritization considerations in the same way as these other organizations.
The criteria I used for making these grants were as follows: (1) have clear “room for more funding” ... (2) have a clear risk of not meeting their funding goal... (3) clear a bar of being “impactful enough”... represent outstanding opportunities that I think are better than the community average.
I am very uninformed about organizations... working on existential risk and far future. My impression, however, is that OpenPhil has done a good job filling up the funding gaps in this area and that there are very few organizations that would meet the criteria I’m using for these recommendations.
I think this says about all that needs to be said about whether this kind of search procedure is likely to yield optimal donation targets!
The problem is that some EAs would have the amount of life in the universe reduced to zero permanently. (And don't downvote this unless you personally know this to be false - it is unfortunately true)
If not, then it is a necessary example, plain and simple.
But it is not necessary - as you can see elsewhere in this thread, I raised an issue without providing an example at all.
"An issue"? Austen was referring to problems where an organization affiliates with particular organizations that cause terror risk, which you don't seem to have discussed anywhere. For this particular issue, FRI is an illustrative and irreplaceable example, although perhaps you could suggest an alternative way of raising this concern?
He wrote a single sentence pointing out that the parent comment was giving FRI an unfair and unnecessary treatment. I don't see what's "ill founded" about that.
What's ill-founded is that if you want to point out a problem where people affiliate with NU orgs that promote values which increase risk of terror, then it's obviously necessary to name the orgs. Calling it "unnecessary" to treat that org is then a blatant non-sequitur, whether you call it an argument or an assertion is up to you.
Why is it more important now than in normal discourse? If someone decides to be deliberately obtuse and disrespectful, isn't that the best time to revert to tribalism and ignore what they have to say?
Our ability to discern good arguments even when we don't like them is what sets us apart from the post-fact age we're increasingly surrounded by. It's important to focus on these things when people are being tribal, because that's when it's hard. If you only engage with facts when it's easy, then you're going to end up mistaken about many of the most important issues.
I really don't like how you are accusing people without evidence of intentionally promoting violence. This is borderline libel. I agree that someone could take their ideology and use it to justify violence, but I see no reason to believe that they are intentionally trying to "entice" such actions.
Indeed, we must focus on the battles we can win. There are two traps. One is making false accusations: currently, few negative utilitarians are promoting terrorism, and we should not make accusations that would suggest otherwise. The other is stirring up controversy: telling negative utilitarians that they are terrorists could inflame them into actually behaving in a more hostile manner. It is like when people say that naming "radical Islamic terrorism" is necessary to solve the problem. Perhaps, but it would be more useful to engage cooperatively with the religion of Islam to show that it is a religion of peace, and the same for utilitarianism.
The safe position, which we should expect EA leaders to vigilantly uphold, is not to promote values whose adoption would lead to large-scale terrorism. This is the hill that we should choose to die on. Specifically, if negative utilitarians believe in cooperation, and they believe that value-spreading is important, then they should be cooperative in the values that they spread. And this does not allow for spreading values that would lead to actions that are overwhelmingly repulsive to the vast majority of ethicists and the general population on an astronomical scale. "EA leaders" must include CEA.
This would be bad, if true, given that it is essentially the same complaint that was previously levelled by Alexey Guzey against Julia Wise of community health.