Stephen Clare

Focus of the IPCC Assessment Reports Has Shifted to Lower Temperatures

Major kudos to both of you for this bet. I'll probably refer to this thread in future as a great example of respectful, productive disagreement!

The real state of climate solutions - want to help?

You might be interested in reading some existing discussion of Drawdown, and its limitations, in the comments here.

New 80k problem profile - Climate change

I think you'll find answers to those questions in section 1 of John and Johannes's recent post on climate projections. IIRC the answers are yes, and those numbers correspond to RCP4.5.

New 80k problem profile - Climate change

I think this comment demonstrates the importance of quantifying probabilities. e.g. you write:

Could agriculture cope with projected warming? Possibly, maybe probably. Can it do so while supply chains, global power relations and financial systems are disrupted or in crisis? That's a much harder prospect.

I can imagine either kinda agreeing with this comment, or completely disagreeing, depending on how we're each defining "possibly", "probably", and "much harder".

For what it's worth, I also think it's probable that agriculture will cope with projected warming. In fact, I think it's extremely likely that, even conditional on geopolitical disruptions, the effects of technological change will swamp any negative effects of warming. To operationalize, I'd say something like: there's a 90% chance that global agricultural productivity will be higher in 50 years than it is today.[1]

Note that this is true at the global level. I do expect regional food crises due to droughts. On the whole, I again believe with high confidence (again, like 90%) that the famine death rate in the 21st century will be lower than it was in the 20th century. But of course it won't be zero. I'd support initiatives like hugely increasing ODA and reforming the World Food Program (which is literally the worst).

  1. I haven't modelled this out, and I'd expect that probability would change +/- 10 p.p. if I spent another 15 minutes thinking about it.

Where are the cool places to live where there is still *no* EA community? Bonus points if there is unlikely to be one in the future

That's true, good point. Depending on what they're looking for, I can actually see myself encouraging more people to try this out.

Where are the cool places to live where there is still *no* EA community? Bonus points if there is unlikely to be one in the future

If you like the location you're currently in, it seems pretty worth it to try to hang out with other people in your current community first. Join a sports team or games club or something. If you're worried about incentives, then ask a friend for accountability. Say you'll pay them $20 if you don't actually go to the event and ask them to follow up on it.

I'm a bit worried you're underestimating how difficult it would be to move to an entirely different continent on your own. Life as an expat can be expensive and alienating.

EA and the current funding situation

Can you give an example of communication that you feel suggests "only AI safety matters"?

Can we agree on a better name than 'near-termist'? "Not-longermist"? "Not-full-longtermist"?

I don't think a good name for this exists, and I don't think we need one. It's usually better to talk about the specific cause areas than to try to lump all of them together as not-longtermism.

As you mention, there are lots of different reasons one might choose not to identify as a longtermist, including both moral and practical considerations.

But more importantly, I just don't think that longtermist vs not-longtermist is sufficiently important to justify grouping all the other causes into one group.

Trying to find a word for all the clusters other than longtermism is like trying to find a word that describes all cats that aren't black, but isn't "not-black cats".

One way of thinking about these EA schools of thought is as clusters of causes in a multi-dimensional space. One of the dimensions along which these causes vary is longtermism vs. not-longtermism. But there are many other dimensions, including animal-focused vs. people-focused, high-certainty vs. low-certainty, etc. Not-longtermist causes all vary along these dimensions, too. Finding a simple label for a category that includes animal welfare, poverty alleviation, metascience, YIMBYism, mental health, and community building is going to be weird and hard.

"Not-longtermism" would just be everything outside of some small circle in this space. Not a natural category.

It's because there are so many other dimensions that we can end up with people working on AI safety and people working on chicken welfare in the same movement. I think that's cool. I really like that the EA space has enough dimensions that a really diverse set of causes can all count as EA. Focusing so much on the longtermism vs. not-longtermism dimension under-emphasizes this.
