For example: what is the expected effect on existential risk of donating to one of GiveWell's top charities?
I've asked myself this question several times over the last few years, but I've never thought it through carefully. I've always just assumed that, at the very least, such donations would not increase existential risk.
Have any analyses been done on this?
Following Brian Tomasik's thinking, I believe one of the central issues for existential risk is international stability and cooperation, particularly for dealing with AI arms races and similar dynamics. To probe this question, I asked something along those lines on Reddit and got an interesting (and not particularly optimistic) answer:
https://www.reddit.com/r/IRstudies/comments/3jk0ks/is_the_economic_development_of_the_global_south/
One might argue that passing quickly through the unstable period of economic development would hasten the maturation of the international community, whereas delaying development would merely postpone the same problems of instability and competition. I don't know. I wish we had more international relations experts in the EA community.
I've been thinking lately that nuclear war is probably a more pressing x-risk than AI, both now and for the near term. Nuclear weapons already exist, and the American-Russian relationship has been slowly deteriorating for years, while we are (likely) decades away from needing to solve the global coordination problems of an AI race.
I am not asserting that AI coordination isn't critically important. I am asserting that if we nuke ourselves first, AI coordination probably won't matter.