Totalitarianism

Discuss the topic on this page. This is the place to ask questions and propose changes.

I think it'd be good to rename this tag "authoritarianism". I have the impression that EAs/longtermists have often focused more on totalitarianism than on authoritarianism, or have used the terms as if they were somewhat interchangeable. But it seems to me that totalitarianism is best considered a subtype of authoritarianism, and that other types of authoritarian regime also have the potential to cause problems in similar ways. So I think it'd be best to default to the more inclusive term "authoritarianism", except when there is a specific reason to focus on totalitarianism in particular.

(I elaborate on this a bit here)

I don't have a strong opinion on this one. I may have a slight preference for "totalitarianism", since, to me, paradigmatically totalitarian regimes better resemble the scenarios that most worry EAs and longtermists. I will copy below the rough draft I had for this article, in case it helps decide whether the name should be changed (please excuse the lack of proper formatting).

---

** Characteristics
Benito Mussolini famously characterized totalitarianism as "all within the state, nothing outside the state, none against the state." [fn:1] Contemporary scholars have listed several distinctive features of totalitarian regimes. These features include a radical official ideology, usually exclusionary and future-oriented; a single party, typically led by one man; a monopoly of the means of both persuasion and coercion; a centrally planned economy, in which most professional activities are part of the state; and extreme politicization and widespread use of terror in all spheres of life (Friedrich & Brzezinski 1965: 22; Aron 1965: ch. 15; Holmes 2001). Totalitarian regimes are estimated to have been responsible for the deaths of over 125 million people in the 20th century (Bernholz 2000: 568). To this tragic loss of life must be added the major loss in quality of life experienced by those living under such regimes.

** Robust totalitarianism as a catastrophic and existential risk
Because of its scale, the threat of robust totalitarianism constitutes a [[*Global catastrophic risk][global catastrophic risk]]. If the totalitarian regime has the potential to be both global and stable, it could also constitute an [[*Existential risk][existential risk]]—specifically a risk of an unrecoverable [[*Dystopia][dystopia]].

Advances in [[*Artificial intelligence][artificial intelligence]] in areas such as lie detection, social persuasion and deception, autonomous weapons, and ubiquitous surveillance could entrench existing totalitarian regimes. These developments may also cause democracies to slide into totalitarianism (Dafoe 2018: sect. 4.1). On the other hand, AI could conceivably destabilize totalitarian systems or protect against their emergence (Adamczewski 2019: sect. 'Robust totalitarianism'). To date, no detailed analysis exists of the potential impact of artificial intelligence on the risk of robust totalitarianism, and the literature on robust totalitarianism in general is itself very small (Caplan 2008). Research in this area thus appears to be of high expected value (Koehler 2020: sect. 'Risks of stable totalitarianism').

Another possibility is to have separate articles on each concept. I don't know if this is a good idea—just mentioning it as something to keep in mind.

"to me, paradigmatically totalitarian regimes better resemble the scenarios that most worry EAs and longtermists"

I think it's very plausible that the main ways in which totalitarian regimes would be bad for the long-term future would also be mostly shared by authoritarian regimes, and essentially come down to making premature or bad lock-in more likely and a long reflection less likely. Both systems seem less conducive than, e.g., democratic societies to developing new ideas and correcting errors (including moral catastrophes) over time.

But I don't think there's been any detailed writeup arguing either for the position I'm describing or for the one you suggest. So I guess this is a tricky case where there's been very little work on a topic, such that the wiki kind of has to take its own stance, or where the only way not to take a stance is to stick closely to the tiny amount of existing work and thereby (in my view) replicate its oversights.

---

I guess maybe if I publish my research agenda and a small, quickly written post outlining associated thoughts, then the entry could reference that and draw on its scope and ideas?

Or, in situations like this, would it make more sense for the wiki to deviate from Wikipedia by containing some "original research" (via me adding or changing some paragraphs based not on published work I've seen but on my own thoughts, though I think various people have had those same thoughts themselves)?

Obviously a third option is just to stick closer to the "totalitarianism" framing and not put much emphasis on what a single person (me) is saying here.

---

It seems like the EA Wiki is in a bit of a different situation from Wikipedia here, since (1) a substantial fraction of the users of the Wiki will themselves be a substantial fraction of the people generating the research that some articles are based on, and (2) a substantial fraction of the sources cited would count as "self-published" sources of the sort Wikipedia doesn't generally approve of citing. So it seems worth thinking about whether and how we might want to deviate from the "no original research" policy Wikipedia has.

I think the two topics receive a small enough fraction of total EA attention, and overlap sufficiently in what they are and why they matter, that it's probably not currently worth having a separate entry for each.

I think another option is an article on authoritarianism that actually mostly discusses totalitarianism, but makes it clear that totalitarianism is a subtype and that other types might matter too.