Nathan Young

Project manager/Director @ Frostwork (web app agency)
17524 karma · Joined · Working (6-15 years) · London, UK
nathanpmyoung.com

Bio

Participation
4

Builds web apps (e.g. viewpoints.xyz) and makes forecasts. Currently I have spare capacity.

How others can help me

Talking to those in forecasting to improve my forecasting question generation tool

Writing forecasting questions on EA topics.

Meeting EAs I become lifelong friends with.

How I can help others

Connecting them to other EAs.

Writing forecasting questions on metaculus.

Talking to them about forecasting.

Sequences
1

Moving In Step With One Another

Comments
2550

Topic contributions
20

Interesting take. I don't like it. 

Perhaps because I like saying overrated/underrated.

But also because overrated/underrated is a quick way to provide information. "Forecasting is underrated by the population at large" is much easier to think of than "forecasting is probably rated 4/10 by the population at large and should be rated 6/10".

Over/underrated requires about three mental queries: "Is it better or worse than my ingroup thinks?" "Is it better or worse than the population at large thinks?" "Am I gonna have to be clear about what I mean?"

Scoring the current and desired status of something requires about 20 queries: "Is 4 fair?" "Is 5 fair?" "What axis am I rating on?" "Popularity?" "If I score it a 4, will people think I'm crazy?"...

Like in some sense you're right that % forecasts are more useful than "more likely/less likely" and sizes are better than "bigger/smaller", but when dealing with intangibles like status I think it's pretty costly to calculate some status number, so I do the cheaper thing.


Also would you prefer people used over/underrated less or would you prefer the people who use over/underrated spoke less? Because I would guess that some chunk of those 50ish karma are from people who don't like the vibe rather than some epistemic thing. And if that's the case, I think we should have a different discussion.

I guess I think that might come from a frustration around jargon or rationalists in general. And I'm pretty happy to try and broaden my answer from over/underrated - just as I would if someone asked me how big a star was and I said "bigger than an elephant". But it's worth noting it's a bandwidth thing and often used because giving exact sizes in status is hard. Perhaps we shouldn't have numbers and words for it, but we don't.

Sure but a really illegible and hard to search one.

I guess lots of money will be given. Seems reasonable to think about the impacts of that. Happy to bet.

This is an annoying feature of search:

Sure, seems plausible. 

I guess I kind of like @William_MacAskill's piece, or at least as much of it as I remember.

My recollection is roughly this: 

  • Yes, it's strange to have lots more money.
  • Perhaps we're spending it badly.
  • But also seeking not to spend enough money might be a bad thing, too.
  • Frugal EA had something to recommend it.
  • But more impact probably requires more resources. 

This seems good, though I guess it feels like a missing piece is: 

  • Are we sure this money was obtained ethically?
  • How much will obtaining this money in bad ways hurt us?

Also, looking back, @trammell's takes have aged very well:

  • It is unlikely we are in the most important time in history
  • If not, it is good to save money for that time

Had Phil been listened to, then perhaps much of the FTX money would have been put aside, and things could have gone quite differently. 

So my non-EA friends point out that EAs have incentives to suck up to any group that is about to become rich. This seems like something I haven't seen a solid path through:

  • It is much more effective to deal with the people who have the most money.
  • It is hard to retain one's virtue while doing so. 

Having known, and had conflict with, a number of wealthy people, I can say it is hard to retain one's sense of integrity in the face of life-changing funds. I've talked to SBF, and even after the crash I felt a gravity: I didn't want to insult him lest he one day return to the heights of his influence. Sometimes that made me too cautious; sometimes, in avoiding caution, I was reckless.

I guess in some sense the problem is that finding ways through uncomfortable situations requires sitting in discomfort, and I don't find EA to have a lot of internal battery for that kind of thing. Have we really resolved most of the various crises in a way that created harmony between those who disagreed? I'm not sure we have. So it's hard to be optimistic here. 

Naaaah, seems cheems. Seems worth trying. If we can't then fair enough. But it doesn't feel to me like we've tried.

Edit, for specificity: I think that shrimp QALYs and human QALYs have some exchange rate; we just don't have a good handle on it yet. And I think that if we'd decided that difficult things weren't worth doing, we wouldn't have done a lot of the things we've already done.

Also, hey Elliot, I hope you're doing well.

Reading Will's post about the future of EA (here) I think that there is an option also to "hang around and see what happens". It seems valuable to have multiple similar communities. For a while I was more involved in EA, then more in rationalism. I can imagine being more involved in EA again.

A better earth would build a second Suez Canal, to ensure that we don't suffer trillions in damage if the first one gets blocked. Likewise, having two "think carefully about things" movements seems fine.

It hasn't always felt like this "two is better than one" feeling is mutual. I guess the rationalist in me feels slighted by EA discourse around rationalist orgs, and by EA funders' treatment of them, over the years. But maybe we can let that go and instead be glad that, should something go wrong with rationalism, EA will still be around.

I do not see 14 charity ranking tools. I don't really think I see 2? What, other than asking Claude/ChatGPT/Gemini, are you suggesting?

Could you give a concise explanation of what giving circles are?
