But technology that powerful could just as easily be directed toward curing aging or creating unprecedented material abundance for billions of people as it could be directed toward destroying us.
This seems to imply that alignment is just very easy, but a core part of Yudkowsky's argument is that it is not. Curing aging is a specific, narrow goal, whereas destroying humanity is a side effect of almost any goal pursued at a large enough scale. The later paragraphs argue more in favor of current alignment techniques working well, but even if you believe that, "just as easily" is still a massive overstatement.