EAs and longtermists rightly place a lot of emphasis on aligning powerful AI systems with human values. But I have always wondered, if super-human AI starts doing the bidding of some subset of humans, what is the governance equilibrium? When considering this question, two sub-questions stand out to me.
1) Does transformational AI most likely result in the end of democracy? After all, if all useful work is done by AI, most of the leverage held by the average person disappears. They can no longer strike, and protests or revolts may be entirely futile in the face of drone-based weapons and crowd control systems.
2) Is a unipolar or multipolar world more likely? The most powerful AI systems might be developed by American tech companies, the Chinese government, or some other actor. If a multipolar world is possible, how high is the risk of war? In general, a state's willingness to use WMDs or take other drastic measures increases when it feels it is facing an existential threat. If, hypothetically, a victory by a Chinese state AI system would imply complete and indefinite subjugation of the American state AI system, and vice versa, wouldn't the risk of conflict be extraordinarily high?
I'm curious to hear what the EA community has been thinking about these topics, and whether anyone has tried to estimate the likelihood of different governance outcomes in a world with aligned AI.
I'm not saying that this is the only option, but since the 1800s we have let the market choose which ideas thrive, and which services or products get rewarded.
The hard problem of predicting the future of AI is that we have no preexisting model for it: once an AGI is let loose for distribution, we cannot see where it will lead us. This is a black swan event in Nassim Taleb's sense, something world-altering, yet we do not know for now how transformative it will be for the future and beyond.
These are hard questions, which is why alignment should be achieved first, so that we do not have to worry about how an AGI will act and respond in the real world, or about who controls the governance of its code base and infrastructure.