willbradshaw's Comments

CU & extreme suffering

Assuming "within one current human" also implies that maximum suffering and happiness are defined as something like "the most unhappy/happy a human has ever been in history up to this point", I would guess that the correct amount of time to spend at the worst suffering in exchange for 50 years of the best wellbeing would probably be much less than 50 years. Maybe 1-5 years? But my confidence intervals are very wide here.

Climate Change Is Neglected By EA

> In fact you suggested below some good arguments for this

Two things:

  1. I made those up on the spur of the moment. Possibly you're just being polite, but I would be very suspicious if all three turned out to be good arguments supporting work on climate change. I said immediately below that I don't especially believe any of them in the case of climate change.
  2. More importantly, the whole point of coming up with those arguments was that they didn't depend on claims about neglectedness! None of those are arguments that climate change is neglected, they are potential shapes of arguments for why you might want to prioritise it despite it not being neglected.

I feel like we're still not connecting regarding the basic definition of neglectedness. You seem to be mixing it up with scale and tractability in a way that isn't helpful to precise communication.

Are there historical examples of excess panic during pandemics killing a lot of people?

That is interesting. My general model is that pre-modern Europeans didn't need much of an excuse to start killing Jews, so if true this would be a substantial update for me.

There are various things I could come up with that might start explaining the difference, but I'd want to actually read the paper first.

Climate Change Is Neglected By EA

> That does sound about right to me.

Could you elaborate a bit on why? This doesn't sound insane to me, but it is a pretty big disagreement with 80,000 Hours, and I am more sympathetic to 80K's position on this.

> My claim is that EA currently (1) downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts) and (2) downplays the value of working on climate change (e.g. low neglectedness, low tractability). If you agree that (1, 2) are true, then EA is misleading its members about climate change and biasing them to work on other issues.
>
> Perhaps I have misunderstood your argument, but I think you're saying that (1, 2) don't matter because lots of people already care about climate change, so EA doesn't need to influence more people to work on climate change. I would argue that regardless of how many people already care about climate change, EA should seek to communicate accurately about the impact and importance of work on different cause areas.

My claim is that the fact that so many (smart, capable) people care about climate change work directly causes it to have lower expected value (on the margin). The "impact and importance of work on different cause areas" intimately depends on how many (smart, capable) people are already working or planning to work in those areas, so trying to communicate that impact and importance without taking into account "how many people already care" is fundamentally misguided.

The claim that climate change is a major PR issue for EA, if true, is evidence that EA's position on climate change is (in at least this one respect) correct.

Any good organizations fighting racism?

If your primary target is specifically institutional racism in developed countries, then I agree, which is why I suggested it.

I'm not sure if that's the right thing to prioritise, though. If your goal is to reduce disparities between ethnic groups globally, or even to tackle harm from ethnic discrimination globally, I'd guess you can do better elsewhere, in particular in the developing world.

Are there historical examples of excess panic during pandemics killing a lot of people?

This (not wanting to lose credibility by being perceived to overreact) was my thought as well.

I'm not claiming this is the case, but I think if a public health person said "we're worried about causing panic" when they actually meant "we're worried about being seen to overreact", I would consider that quite dishonest.

Effective Altruism is a Big Tent

Could someone maybe do something about the formatting on this post? It seems like it's valuable to keep around and link to, but currently I find it pretty hard to read (all bold, big spaces between paragraphs, some paragraphs that are plausibly blockquotes but I can't really tell).

Any good organizations fighting racism?

Personally, I would want to get a more operationalised definition of "fighting racism" before going deeper on this question.

But, guessing as to what counts and what doesn't, I'd suggest checking out OpenPhil's criminal justice reform grantees.

Climate Change Is Neglected By EA

Yeah, I think many groups struggle with the exact boundary between "marketing" and "deception". Though EAs are in general very truthful, different EAs will still differ both in where they put that boundary and in their actual evaluation of climate change, so their final evaluations of the morality of devoting more attention to climate change for marketing purposes will differ quite a lot.

I was arguing elsewhere in this post for more of a strict "say what you believe" policy, but out of curiosity, would you still have that reaction (to the gateway/PR argument) if the EA in question thought that climate change was, like, pretty good (not the top cause, but decent)? To me that seems a lot more ethical and a lot less patronising.

Climate Change Is Neglected By EA

It is worth noting that a lot of core EAs have pivoted from global poverty to X-risk, a major shift in priorities, without ever changing their position on climate change (something that a priori seems important from both perspectives). This isn't necessarily wrong, but does seem a bit suspicious.

Given the fact that climate change is somewhat GCR/X-risky, it wouldn't surprise me if it were more valuable on the margin than anti-malaria work. But both the X-risk people and the global poverty people seem sceptical about climate change work; that intersection is somewhat surprising, and I think it is a major part of my own scepticism.

Like, if you have two groups of people, and one group says "we should definitely prioritise A and B, but not C or D, and probably not E either", and the other group says "we should definitely prioritise C and D, but not A or B, and probably not E either", it doesn't seem like it's looking good for E.

But I might be reading that all wrong, and everyone thinks that climate change is, like, the fourth best cause, and as a result it should get more points even though nobody thinks it's top? This sounds like one of those moral uncertainty questions.