
This is a small write-up of when I applied for a PhD in Risk Analysis 1.5 years ago. I can elaborate in the comments!

I believed doing a PhD in risk analysis would teach me a lot of useful skills to apply to existential risks, and it might allow me to work directly on important topics. I worked as a Research Associate on the qualitative side of systemic risk for half a year. I ended up not doing the PhD because I could not find a suitable place, nor do I think pure research is the best fit for me. However, I still believe more EAs should study something along the lines of risk analysis, and it's an especially valuable career path for people with an engineering background.

Why I think risk analysis is useful:

EA researchers rely a lot on quantification, but use a limited range of methods (simple Excel sheets or Guesstimate models). My impression is also that most EAs don't understand these methods enough to judge when they are useful or not (my past self included). Risk analysis expands this toolkit tremendously, and teaches stuff like the proper use of priors, underlying assumptions of different models, and common mistakes in risk models.
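As a toy illustration of what moving beyond point-estimate spreadsheets looks like: the sketch below propagates uncertainty through a cost-effectiveness ratio by Monte Carlo, roughly what a tool like Guesstimate does under the hood. All the numbers (the lognormal priors on cost and impact) are made up purely for illustration.

```python
import math
import random

random.seed(0)

def cost_effectiveness_samples(n=10_000):
    """Propagate uncertainty through a cost/impact ratio by Monte Carlo,
    instead of dividing two point estimates in a spreadsheet."""
    samples = []
    for _ in range(n):
        # Illustrative priors (assumed, not from any real model):
        # cost centred on $100k, impact centred on 50 outcomes.
        cost = random.lognormvariate(math.log(100_000), 0.5)   # total cost, $
        impact = random.lognormvariate(math.log(50), 0.8)      # outcomes achieved
        samples.append(cost / impact)
    return samples

samples = sorted(cost_effectiveness_samples())
median = samples[len(samples) // 2]
p90 = samples[int(len(samples) * 0.9)]
# The ratio distribution is heavily skewed, so the 90th percentile can be
# several times the median -- information a single point estimate hides.
```

The point is not the specific numbers but the shape of the output: a distribution over cost-effectiveness, which makes it much harder to fool yourself with a single "best guess" cell.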

The field of Risk Analysis

Risk analysis is a pretty small field, and most of it focuses on risks of limited scope and risks that are easier to quantify than the risks EAs commonly look at. There is a Society for Risk Analysis (SRA), which publishes Risk Analysis (the main journal of the field). I found most of their study topics not so interesting, but it was useful to get an overview of the field, and there were some useful contacts to make (1). The EA-aligned org GCRI is active and well-established in SRA, but no other EA orgs are.

Topics & advisers

I hoped to work on GCR/x-risk directly, which substantially reduced my options. It would have been useful to just invest in learning a method very well, but I was not motivated to research something not directly relevant. I think it's generally difficult to make an academic career as a general x-risk researcher, and it's easier to research one specific risk. However, I believe this leaves open a number of cross-cutting issues.

I have a shortlist of potential supervisors I considered/contacted/was in conversation with, including in public policy and philosophy. I can provide this list privately on request.

Best grad programs:

The best background for grad school seems to be mathematics or, more specifically, engineering. (I did not have this, which excluded a lot of options.) The following two programs seemed most promising, although I only investigated PRGS in depth:


(1) For example, I had a nice conversation with the famous psychology researcher Paul Slovic, who now does research into the psychology involved in mass atrocities.

Aww yes, people writing about their life and career experiences! Posts of this type seem to have one of the best ratios of "how useful people find this" to "how hard it is to write" -- you share things you know better than anyone else, and other people can frequently draw lessons from them.

I have a concept of paradigm error that I find helpful.

A paradigm error is the error of approaching a problem through the wrong, or an unhelpful, paradigm. For example, trying to quantify the cost-effectiveness of a longtermist intervention under deep uncertainty.

Paradigm errors are hard to recognise, because we evaluate solutions from our own paradigm. They are best uncovered by people outside of our direct network. However, it is more difficult to productively communicate with people from different paradigms as they use different language.

It is related to what I see as

  • parameter errors (= the value of parameters being inaccurate)
  • model errors (= wrong model structure or wrong/missing parameters)

Paradigm errors are one level higher: they are the wrong type of model.
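The distinction between parameter and model errors can be made concrete with a toy curve fit (my example, not the comment author's): the true process below is quadratic, so a quadratic model with a wrong coefficient is a parameter error, while a linear model is a model error that no amount of parameter tuning can fix.

```python
# Toy data: the true process is y = x^2.
xs = [x / 10 for x in range(-20, 21)]
ys = [x ** 2 for x in xs]

def sse(predictions):
    """Sum of squared errors against the true ys."""
    return sum((p - y) ** 2 for p, y in zip(predictions, ys))

# Parameter error: right model family (quadratic), wrong coefficient value.
parameter_error = sse([0.8 * x ** 2 for x in xs])

# Model error: wrong model family (linear), even at the least-squares slope.
best_slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
model_error = sse([best_slope * x for x in xs])

# model_error stays large no matter how we tune the slope; a paradigm
# error would be one level up again: doing any curve-fitting at all
# when the problem isn't a curve-fitting problem.
```

Here the model error dominates the parameter error, which mirrors the hierarchy in the comment: each level of error is harder to fix from inside the level below it.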

Relevance to EA

I think a sometimes-valid criticism of EA is that it approaches problems with a paradigm that is not well-suited for the problem it is trying to solve.

I think I call this "the wrong frame".

"I think you are framing that incorrectly etc"

E.g. in the UK there is often discussion of whether LGBT lifestyles should be taught in school, and at what age. This framing makes them seem weird and risky. But it is the wrong frame: LGBT lifestyles are typical behaviour (for instance, there are more LGBT people than members of many major world religions). Instead the question is: at what age should you discuss, say, relationships in school? There is already an answer here; I guess children learn about "mummies and daddies" almost immediately. Hence, at the same time you talk about mummies and daddies, you talk about mummies and mummies, and single dads, and everything else.

By framing the question differently the answer becomes much clearer. In many cases I think the issue with bad frames (or models) is a category error.

I like this. I think I use the wrong models when trying to solve challenges in my life.

Update to my Long Covid report:

UPDATE NOV 2022: turns out the forecast was wrong: incidence (new cases) is decreasing, severity of new cases is decreasing, and significant numbers of people in the <1 year category are recovering. I now expect prevalence to stagnate or decrease for a while, and then grow slowly over the next few years.

I still believe the other sections to be roughly correct, including long-term immune damage from COVID for 'fully recovered' people.

I'm predicting a 10-25% probability that Russia will use a weapon of mass destruction (likely nuclear) before 2024. This is based on only a few hours of thinking about it with little background knowledge.

Russian pro-war propagandists are hinting at the use of nuclear weapons, according to the latest episode of the BBC podcast Ukrainecast, "What will Putin do next?".

There's a general sense that, in light of recent losses, something needs to change. My limited understanding sees 4 options:

  1. Continue on the current course despite mounting criticism. Try to make Ukrainians' lives difficult by targeting their infrastructure, limit losses until winter, and try to reorganize during winter. This seems a pretty good option for now, even though I doubt Russia can really shore up its deep-seated weaknesses. They can probably prepare to dig in, and threaten and punish soldiers for fleeing. This wouldn't go well for either party long-term, but Russia might bet on outlasting/undermining Western support. Probability: 40%?

  2. Negotiation: I don't think Putin seriously wants this, as even the status quo could be construed as a loss, and Ukraine will have a strong bargaining position and demand a lot. Undesirable option. Maybe 10%? 20%? (Metaculus predicts 8% before 2023.)

  3. Full-scale mobilisation of the population and the economy. This is risky for Putin: there is supposedly a large anti-war sentiment in Russian culture, a legacy of the enormous losses during the Second World War. People don't like to join a poorly equipped, poorly managed, and losing army, even for a good cause. This may still be chosen if Putin is misinformed and badly misreads the public's sentiment. I have no idea how this would develop internally. I doubt it will make a big difference in the course of the war, except by prolonging it a bit. Maybe 25%? Maybe 50% if Putin underestimates public resistance.

  4. Escalation by other means: I don't know how many options Russia has: chemical weapons, an electromagnetic pulse, a single tactical nuclear strike on the battlefield for deterrence, multiple nuclear strikes for strategic reasons, a population strike for deterrence. In Putin's mind, I can see this as preferable: it offers a potential military advantage and carries limited risk of destabilising his internal power base. I don't know how the international community would respond, nor how Putin thinks it would respond. In my (uninformed) view, only China can make a real difference here, as the West already has stringent sanctions in place. I don't know how China would respond; they wouldn't like it, but I think the West won't really punish China for its support in the short term. On this inside view, 10-25% seems reasonable. I'm setting the point estimate at 15%.
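One quick coherence check on estimates like these: if the four options are meant to be roughly exhaustive and mutually exclusive, their point estimates should sum to about 1. The sketch below just adds up the numbers stated above, taking 15% as the midpoint of the 10-20% negotiation range.

```python
# Point estimates from the four options above; negotiation uses the
# midpoint of the stated 10-20% range.
options = {
    "continue current course": 0.40,
    "negotiation": 0.15,
    "full-scale mobilisation": 0.25,
    "escalation by other means": 0.15,
}

total = sum(options.values())
# total is 0.95: close enough to 1 for rough inside-view estimates,
# leaving ~5% for outcomes not on the list.
```

When the total drifts far from 1, it usually means the options overlap, something is missing, or the individual estimates are inflated.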


I'm still willing to bet on this.

Ah sorry I'm not going to do that, mix of reasons. Thanks for offering it though :)

Any chance you'd wanna bet on this + find a suitable escrow (ideally crypto)? I could bet non-trivial sum on opposite side, imo odds less than 15%

Large study: every reinfection with COVID increases the risk of death, of acquiring other diseases, and of long COVID.

We are going to see a lot more issues with COVID still, including massive amounts of long COVID.

This will affect economies worldwide, as well as EAs personally.