Peter Thiel & Eric Weinstein discuss global catastrophic risks, including biosecurity and AI alignment, starting at around the 2:33:00 mark of Thiel's interview on Weinstein's new podcast.

tl;dl – Thiel thinks GCRs are a concern, but is also very worried about political violence / violence perpetrated by strong states. He thinks catastrophic political violence is much more likely than GCRs like AI misalignment.

He sketches an argument that political violence becomes more likely when economic growth stalls, and so is worried about the present stagnation. (Not 100% sure I'm representing that correctly.)

Also there's an interesting bit about transparency & how transparency often becomes weaponized when put into practice, soon after the GCR discussion.


Economic growth likely isn't stagnating; it just looks that way due to catch-up growth effects:

I think how the 'middle class' (a relative measure) of the USA is doing is fairly uninteresting overall. I think most meaningful progress at the grand scale (decades to centuries) comes down to how fast the bottom is getting pulled up and how high the very top end (bleeding-edge researchers) can go. Shuffling in the middle results in much wailing and gnashing of teeth but doesn't move the needle much. Its main impact is voting for dumb stuff that harms the top and the bottom.

Great point.

I like the Russ Roberts videos as demonstrations of how complicated macro is / how malleable macroeconomic data is.

Thiel thinks GCRs are a concern, but is also very worried about political violence / violence perpetrated by strong states.

Robin Hanson's latest (a) is related.

Given the stakes, it's a bit surprising that "has risk of war secularly declined or are we just in a local minimum?" hasn't received more attention from EA.

Holden looked at this (a) a few years ago and concluded:

I conclude that [The Better Angels of Our Nature's] big-picture point stands overall, but my analysis complicates the picture, implying that declines in deaths from everyday violence have been significantly (though probably not fully) offset by higher risks of large-scale, extreme sources of violence such as world wars and oppressive regimes.

If I recall correctly, Pinker also spent some time noting that violence appears to be moving toward more of a power-law distribution since the early 20th century (fewer episodes, but each episode much more severe).

"War aversion" seems like a plausible x-risk reduction focus area in its own right (it sorta bridges AI risk, biosecurity, and nuclear security).

This chart really conveys the concern at a glance:

[chart not reproduced here]

(source) (a)

... what if the curve swings upward again?

Hacker News comments about the interview, including several by Thiel skeptics.

Also Nintil has some good notes (a). (Notes at bottom of post.)
