Nathan Young

Product Management @ Forecasting Consultancy
15,019 karma · Joined May 2019 · Working (0-5 years) · London, UK



I create prediction markets and forecasting questions on AI risk and biorisk. I also work part-time at a prediction market.

I use my connections on Twitter to raise the profile of these predictions and increase the chance that decision-makers discuss these issues.

How others can help me

Talking to those in forecasting to improve my forecasting question generation tool

Writing forecasting questions on EA topics.

Meeting EAs I become lifelong friends with.

How I can help others

Connecting them to other EAs.

Writing forecasting questions on Metaculus.

Talking to them about forecasting.


Moving In Step With One Another


Topic contributions

Interesting take. I don't like it. 

Perhaps because I like saying overrated/underrated.

But also because overrated/underrated is a quick way to provide information. "Forecasting is underrated by the population at large" is much easier to think of than "forecasting is probably rated 4/10 by the population at large and should be rated 6/10".

Over/underrated requires about 3 mental queries: "Is it better or worse than my ingroup thinks?" "Is it better or worse than the population at large thinks?" "Am I gonna have to be clear about what I mean?"

Scoring the current and desired status of something requires about 20 queries: "Is 4 fair?" "Is 5 fair?" "What axis am I rating on?" "Popularity?" "If I score it a 4 will people think I'm crazy?"...

In some sense you're right that % forecasts are more useful than "more likely/less likely" and sizes are better than "bigger/smaller", but when dealing with intangibles like status I think it's pretty costly to calculate some status number, so I do the cheaper thing.


Also, would you prefer people used over/underrated less, or would you prefer that the people who use over/underrated spoke less? I would guess that some chunk of those 50ish karma are from people who don't like the vibe rather than from some epistemic objection. And if that's the case, I think we should have a different discussion.

I guess I think that might come from a frustration with jargon or with rationalists in general. And I'm pretty happy to try to broaden my answer beyond over/underrated, just as I would if someone asked me how big a star was and I said "bigger than an elephant". But it's worth noting it's a bandwidth thing, often used because giving exact sizes in status is hard. Perhaps we should have numbers and words for it, but we don't.

Yeah, the voting on these posts feels pretty bizarre. Though I try not to worry about that; it usually comes out in the wash to something that seems right.

Well, I have spent like an hour on your post and only just found the policy proposals. Why not put them at the top? Or under a heading?

Also, I don't really see what your policies have to do with ending poverty: even if successful, these would be taken up in the West and there would still be huge amounts of poverty.

I agree that many people will downvote your piece without reading it (though many will upvote for the same reason, and it seems there is some of both going on here), but I really did try to read it, and it was soooooo long and very unclear. Maybe my thoughts don't matter to you, but if you want my advice: clear writing means the audience takes away what you intended from the piece. I don't think you've succeeded with me, despite my spending 30-60 minutes on it.

I have thought about this a bit and chatted to people (eg thanks @titotal), and I think there is some missing mood in my responses: people feel like they don't want to have to battle this stuff all the time, and that the arguments are often long and complicated but wrong.

Eg I care about truth and so do Jehovah's Witnesses, but I don't think it's worthwhile to let them into my house: I can predict that the argument isn't going to change my mind or theirs, but it will cost a load of time and perhaps emotional energy.

This doesn't fully change my mind, per se; I still think censoring is the wrong call, but perhaps I lower my bar on what would count as a bad situation: eg if there were a post like this every week, sitting at 5 karma, with huge arguments about journals that most of the community thinks are rubbish. Or if sources were debunked and then replaced with similar but equally poor sources. I sense this happens a lot in IQ debates, and I don't really have time for that personally. I would have even less energy if I felt the upshot of these discussions was a set of policy proposals that seemed abhorrent to me / felt like a discussion of my value as a person.

Unsure what the answer is here, but seems meaningful to note the change.

These seem pretty reasonable questions to me. 

There are very strong consequentialist reasons for acting with integrity


we should be a lot more benevolent and a lot more intensely truth-seeking than common-sense morality suggests

It concerns me a bit that when legal risk appears, suddenly everyone gets very pragmatic in a way that I am not sure feels the same as integrity or truth-seeking. It feels a bit similar to how pragmatic we all were around FTX during the boom. In crises we seem to get a bit worse at truth-seeking and integrity, though I guess many communities do. (Sometimes it feels like in a crisis you get to pick just one thing, and I am not convinced the thing the EA community picks is integrity or truth-seeking.)

Also, I don't really trust my own judgement here, but while EA may feel more decentralised, a lot of the orgs feel even more centralised around OpenPhil, which feels a bit harder to contact and is doing more work internally. This is their prerogative, I guess, but still.

I am sure being a figurehead of EA has had a lot of benefits (not all of which I guess you wanted), but I strongly sense it has had a lot of really large costs. Thank you for your work. You're a really talented communicator and networker, and at this point probably a skilled board member, so I hope that doesn't get lost in all this.

It seems there was a lot of information floating around, but no one saw it as their responsibility to check whether SBF was fine, and there was no central person for information to be given to. Is that correct?

Has anything been done to change this going forward? 

Do you think the legal advice was correct? Or does it seem possible to you that it was wrong?

If it was worth spending X million on community building, it feels like it may have been worth risking X/5 on lawsuits to avoid quite a lot of frustration.

It seems like when there is a crisis, the rationalists perhaps talk too much (the SSC NYT thing, perhaps), while EA elites clam up and suddenly go all "due diligence". Not sure that's the right call either (not that I would do better).

It resolved to my personal credence, so you shouldn't take it more seriously than "Nathan thinks it unlikely that...".

This doesn't feel like a great response to me.
