Some quotes from the latest episode of my podcast, The Inside View. You can access the audio and video via the link above.
On Forecasting Nuclear Risk:
"For example, Russia is attacking Ukraine right now, but they're not nuking Ukraine. It's purely with conventional weaponry. So it's certainly possible that a war might stay conventional. Just to throw a number out there is like a 40% chance that it would be nuclear. And then, once you have a nuclear war, it's also not a guarantee that they would hit London. They may only want to hit military targets and avoid civilian targets mainly out of fear of retaliation. And that's called counterforce targeting where you eliminate military targets, versus counter-value targeting where you try to go after huge cities. And so it's not a guarantee that a nuclear war would lead to hitting major popular centers. In fact, I would expect that it would be preferred to be avoided, so maybe like 60% or something. And then you can multiply 0.6 times 0.4 times 0.5, I mean 0.05. And then you would get like a 1% chance or so of there being a nuclear war that targets London. Though, I think actually on reflection, I would put it at under 1%, probably closer to one-third or one-fifth of a percent."
"It definitely is difficult to predict irrational behavior by definition. And in fact, if you're a geopolitical actor, even if you are perfectly rational, you may want to pretend to be irrational, that's kind of the so-called madman theory. That I think even Richard Nixon tried to do at one point. Where you really make it look like that you'd be ready to nuke anyone at any point. And as a result, people really, really try not to anger you and you get more of what you want. So he could be rational to appear irrational, or Putin could just be irrational. And so I guess that gets more into game theory type scenarios"
"I would be a lot more afraid of accidental or intentional nuclear war in the 1980s than I am now. Having seen us successfully navigate the 80's, having seen a ton of nuclear disarmament, having seen stronger norms against nuclear weapons as a geopolitical tool. Of course, that doesn't mean there's no risk. I mean, we've been trying to assess the risk on this podcast and in other articles. And I definitely miss the relative place of the 2010s for example, or at least the early 2010s before Crimea. And so I think things still just don't look nearly as bad as 1980 or 1960."
On EA Funding Scalable Non-profits:
"the basic summary of the idea is that, if there is a less cost-effective opportunity, but it's more scalable, like it can take on more money, it actually frequently can be better to fund that than to fund a more cost-effective, but less scalable opportunity. The reason being that there's a ton of money available to fund stuff and there's analysis costs in figuring out what to fund and how to fund it. And if you can spend the same amount of analysis to come up with these really scalable opportunities, assuming they meet some cost-effectiveness bar, it'll just use up the entire EA portfolio much faster. And this assumes that using up the portfolio faster is preferable to saving it in a bank count or something like that. So that motivates a lot of my current focus on highly scalable projects as well as growing Rethink Priorities as quickly as is feasible."
"Another important aspect is things other than money that projects might take up, such as highly talented people that are in short supply or something. So a highly scalable project might also be something where you can do a lot of good even without a lot of highly talented Effective Altruists. Using other scarce resources in addition to capital is just about making the portfolio do more, without having all of our resources just kind of sitting and waiting."
"there are important returns from scalable organizations too, such as, they use up more capital, so you're putting more capital to work instead of saving it. And then also there can frequently be economies of scale. Large companies can just have resources to do more things, you can collaborate with more people, do more things, have bigger things, run things more efficiently through consolidation. There's certainly returns to scale as well. Though, I do worry about bureaucracy and other ways that big organizations might be slow and not work as well. So I think you need to intentionally design with that in mind, as well as have some more nimble opportunities as well."
"I'm definitely trying to intentionally grow Rethink Priorities to be quite large. Because I think that there's a lot of important research questions to answer, and a lot of people that could be good researchers if given the proper training and opportunities. And so I'd like to grow Rethink Priorities to take on more early career researchers and mentor them and scale them up to do important questions."
Note: the probabilities in the above quotes and in the podcast are the result of armchair forecasting. Please do not quote Peter on this. (I want to give some space for my guests to give intuitions about their estimates without having to worry about being extra careful.)