Peter Thiel & Eric Weinstein discuss global catastrophic risks, including biosecurity and AI alignment, starting at around the 2:33:00 mark of Thiel's interview on Weinstein's new podcast.
tl;dl – Thiel thinks GCRs are a concern, but he's even more worried about political violence, especially violence perpetrated by strong states. He thinks catastrophic political violence is much more likely than GCRs like AI misalignment.
He has some story about political violence becoming more likely in the absence of economic growth, which is why he's worried about present stagnation. (Not 100% sure I'm representing that correctly.)
Also there's an interesting bit about transparency & how transparency often becomes weaponized when put into practice, soon after the GCR discussion.
Economic growth likely isn't stagnating; it just looks that way due to some catch-up growth effects:
https://rhsfinancial.com/2019/01/economic-growth-speeding-up-or-slowing/
Seems like there's dispute about this, at least from Russ Roberts' perspective:
https://www.policyed.org/numbers-game/hows-middle-class-doing/video
https://www.policyed.org/numbers-game/paradox-household-income/video
I think how the 'middle class' (a relative measure) of the USA is doing is fairly uninteresting overall. I think most meaningful progress at the grand scale (decades to centuries) is a matter of how fast the bottom is getting pulled up and how high the very top end (bleeding-edge researchers) can go. Shuffling in the middle results in much wailing and gnashing of teeth but doesn't move the needle much. The middle's main impact is just voting for dumb stuff that harms the top and bottom.
Great point.
I like the Russ Roberts videos as demonstrations of how complicated macro is, and of how malleable macroeconomic data can be.
Robin Hanson's latest (a) is related.
Given the stakes, it's a bit surprising that "has risk of war secularly declined or are we just in a local minimum?" hasn't received more attention from EA.
Holden looked at this (a) a few years ago and concluded:
If I recall correctly, Pinker also spent some time noting that violence appears to have been moving toward more of a power-law distribution since the early 20th century (fewer episodes, but each episode much more severe).
"War aversion" seems like a plausible x-risk reduction focus area in its own right (it sorta bridges AI risk, biosecurity, and nuclear security).
This chart really conveys the concern at a glance:
(source) (a)
... what if the curve swings upward again?
Hacker News comments about the interview, including several by Thiel skeptics.
Also Nintil has some good notes (a). (Notes at bottom of post.)