This post will be direct because I think directness on important topics is valuable. I sincerely hope that my directness is not read as mockery or disdain towards any group, such as people who care about AI risk or religious people, as that is not at all my intent. Rather my goal is to create space for discussion about the overlap between religion and EA.
–
A man walks up to you and says “God is coming to earth. I don’t know when exactly, maybe in 100 or 200 years, maybe more, but maybe in 20. We need to be ready, because if we are not ready then when god comes we will all die, or worse, we could have hell on earth. However, if we have prepared adequately then we will experience heaven on earth. Our descendants might even spread out over the galaxy and our civilization could last until the end of time.”
My claim is that the form of this argument is the same as the form of most arguments for large investments in AI alignment research. I would appreciate hearing if I am wrong about this. I realize when it’s presented as above it might seem glib, but I do think it accurately captures the form of the main claims.
Personally, I put very close to zero weight on arguments of this form. This is mostly due to simple base rate reasoning: humanity has seen many claims of this form, and so far all of them have been wrong. I definitely would not update much based on surveys of experts or elites making the claim within the community or in adjacent communities. To me that seems pretty circular, and in the case of past claims of this form, I think deferring to such people would have led you astray. Regardless, I understand other people either pick different reference classes or have inside-view arguments they find compelling. My goal here is not to argue about the content of these arguments; it's to highlight these similarities in form, which I believe have not been much discussed here.
I’ve always found it interesting how EA recapitulates religious tendencies. Many of us literally pledge our devotion, we tithe, many of us eat special diets, we attend mass gatherings of believers to discuss our community’s ethical concerns, we have clear elites who produce key texts that we discuss in small groups, etc. Seen this way, maybe it is not so surprising that a segment of us wants to prepare for a messiah. It is fairly common for religious communities to produce ideas of this form.
–
I would like to thank Nathan Young for feedback on this. He is responsible for the parts of the post that you liked and not responsible for the parts that you did not like.
Hey! I liked certain parts of this post and not others. I appreciate the thoughtfulness with which you critique EA in this post.
On your first point about the AI messiah:
I think the key distinction is that there are many reasons to believe this argument about the dangers of AGI is correct, though. Even if many claims with a similar form are wrong, that doesn't exclude this specific claim from being right.
"Climate scientists keep telling us about how climate change is going to be so disastrous and we need to be prepared. But humanity has seen so many claims of this form and they've all been so wrong!"
The key distinction is that there are good reasons to believe that AGI will be dangerous, and good reasons to support the claim that we are not currently prepared. Without addressing that chain of logic directly, I don't find this argument convincing.
On your second point about the EA religious tendencies:
Because religious communities are among the most common kinds of community, there are obviously going to be parallels between them and EA.
Some of these analogies hold; others not so much. We, too, want to build community, network, and learn from each other. I'd love for you to point to specific examples of things EAs do, from conferences to running EA university groups, that are ineffective or unnecessary.
On the broader point that EA may be becoming too groupthink-y, which I think may be warranted:
I think a key distinction is that EA has a healthy level of debate, disagreement, and skepticism, while religions tend to demand blind faith in something unprovable. I personally find this ongoing debate about how to do the most good the most valuable thing in the community, and I hope that spirit never dies.
Keep on critiquing EA; I think such critiques are extremely valuable. Thanks for writing this.
I'm pretty confused here. On the one hand, I think it's probably good to have less epistemic deference and more independent thinking in EA. On the other, if I take your statements literally and extend them, I think they're probably drawing the boundaries of "religious" way too broadly, in mostly-unhelpful ways.