I would argue that while it is a 'forked' group, it unfortunately retains many of the basic assumptions found in EA. Its presentation is explicitly bullish on AI, but that is only superficially different from the implicitly bullish EA community. For example, Bostrom recently said in an interview that people (presumably the alignment group) have overrated the need to regulate, and that we now need LESS regulation, not more. This aligns with the view that much of the current 'negative' AI discourse is nothing more than criti-hype (https://sts-news.medium.com/youre-doing-it-wrong-notes-on-criticism-and-technology-hype-18b08b4307e5). I won't even get into the heavy religious themes of transhumanism and the 'church of the singularity' that are so pervasive in both communities.