After a year of work, my paper "The Human Biological Advantage Over AI" was recently published by the journal AI & Society:
https://link.springer.com/article/10.1007/s00146-024-02112-w
The paper makes the case that we need to be able to hold two ideas in mind at the same time:
* Yes, AI will become better than humans at almost everything.
* However, the one area in which we will always retain preeminence is the most important of all, and it is protected by a complexity limit that will prevent AI from ever surpassing us there.
I believe the paper provides a powerful intellectual foundation for the position of those concerned about existential risk (x-risk). It is one thing to say that AI should not replace us simply because we are self-interested. It is much stronger to say that AI should not replace us because it will always be profoundly limited: we will always be able to do more than it can, and the things we can uniquely do are the most important of all.
Comments very welcome.
Go humans!