philbert schmittendorf

I broadly agree with the thesis that AI safety needs to incorporate sociopolitical thinking rather than shying away from it. No matter what we do on the technical side, the governance side will always run up against moral and political questions that have existed for ages. That said, I have some specific points to raise about this post, most of which are misgivings:

  • The claim that environmentalism caused climate change is... a reach. Saying that first-wave environmentalism's opposition to nuclear energy slowed our mitigation of climate change is slightly more defensible, but at the time, (1) nuclear waste was still a less solved problem, (2) fossil fuel companies were actively running disinformation campaigns to ensure their continued dominance of the energy market, and (3) the early environmentalist movement was far from large enough to effect a systemic change to the entire planet and be held responsible for it. Even without its opposition to nuclear, there still would likely not have been enough nuclear plants to overtake fossil fuels in some miraculous, instantaneous energy transition, for the same reasons that doesn't happen now: nuclear energy is capital-intensive, and it takes a long time in human-years to break even.
  • Also, I am suspicious of framing "opposition to geoengineering" as bad -- this, to me, is a red flag that someone has not done their homework on uncertainties in the responses of the climate system to large-scale interventions like albedo modification. Geoengineering the planet wrong is absolutely an X-risk.
  • The "Left wants control, Right wants decentralization" dichotomy here seems not only narrowly focused on Republican-Democrat dynamics, but also wholly incorrect about which political ideologies actually lead one to support one versus the other. Many people on the left argue uncompromisingly for decentralization out of concern that imperfect actors will cement power asymmetries through AI. Much of the regulation the center-left advocates is aimed at preventing power accumulation by private entities, which could be viewed as a safeguard against the emergence of some well-resourced corporate network-state more powerful than many nation-states. I think Matrice nailed it above: we are all looking at the same category-level, abstract concerns -- decentralization versus control, closed-source versus open-source, competition versus cooperation, etc. -- but once we start talking about object-level politics (Republican versus Democratic policy, bureaucracy and regulation as they actually exist in the U.S.), it feels, for those of us on the left, like we have just gotten off the bus to crazy town. The simplification "left == big government, right == small government" is not just misleading; it is entirely the wrong axis for analyzing what left and right are (hence the existence of the still oversimplified, but one degree less one-dimensional, Political Compass). It seems important for all of us to step outside our echo chambers and determine precisely what our concerns and priorities are, so we can identify where we align and can act in unison.
  • I sense a pretty clear Silicon Valley-style right-wing echo-chamber aura from this post in general, and although I will always assume best intentions, I find it worrying that there are more nods to network-states and ideas that tread close to the neoreactionary ideas of Curtis Yarvin and Nick Land than there are to post-capitalist or left-wing visions of the future, such as AI enabling decentralized planned economies or more direct democracies. We need to be careful that our discussions of AI safety as it relates to politics do not end up reflecting, for instance, a lack of diversity of opinion within the Bay Area tech community doing the discussing.


Some points I liked / agree with:

  • I agree we need to rethink political structures in a much bigger, broader way where AI is concerned. If we bring AGI into the picture, entirely new kinds of economies become possible, so it doesn't make much sense to focus on Republican/Democrat dynamics except for pragmatic near-term actions. I actually think we should be focusing far more on the best ways to handle these things in the abstract, to avoid the muddy waters of current events and the political milieu.
  • AI as a policy and moral advisor is among my favorite ideas. I would love to see an AI that, having taken in the entire corpus of available information independent of echo chambers, can provide an evidence-based case for one policy over another. I do worry about who might build such an AI, though: if it were not open-source and interpretable, it could be rigged to provide a veneer of rigorous justification for whatever draconian policy a government prefers to pass.

Nonexistence is preferable to intense suffering, and I think there are enough S-risks among the array of possible futures ahead of us that we should prioritize reducing S-risks over X-risks, except when reducing X-risks is instrumental to reducing S-risks. So, to be specific, I would only agree with this to the extent that "value" == lack of suffering -- it is not that I think we should abandon building toward a utopia that might never come to pass because we wipe ourselves out first, just that preventing dystopia is vastly more important.

A quick counterargument from the alt-protein side: while $100k to an animal welfare nonprofit might alleviate $100k worth of suffering, it isn't going to lead to a state change unless it facilitates a permanent intervention that meat producers have no incentive to reverse. The same amount of money directed toward innovation in cultivated meat is progress toward a potential nonlinear tipping point that could fully displace factory farming, and I don't think we should take it as a guarantee that alt-protein technologies will break through and disrupt the meat market without the right amount of wind in their sails.