
Suppose we're sometime in the (near-ish) future. The longtermist project hasn't fulfilled 2020's expectations. Where did we go wrong? What scenarios (and with what probabilities) may have led to this?

I hope this question isn't strictly isomorphic to asking about objections to long-termism.


Neither of the possibilities below seems like something that would be easy to recognise even once we're in some (near-ish) future; I hope this isn't begging the question, as it isn't intended to be. I've put credences on each (I'm glad you asked for them), but they are very uncertain.

One possibility is that we were just wrong about the whole long-termism thing. Given how much disagreement in philosophy there seems to be about basically everything, it seems prudent to give this idea non-trivial credence, even if you find arguments for long-termism very convincing. I'd maybe give a 10% probability to long-termism just being wrong.

More significant seems to be the chance that long-termism was right, but that trying to intervene directly in the long-term future, by taking actions expected to have consequences only in the long term, was a bad strategy, and that instead we should have been (approximate credences in parentheses):

  • Investing money to be spent in the future (10%)
  • Investing in the future by growing the EA community (25%)
  • Doing the most good possible in the short term for the developing world/animals, as this turns out to shape the future more positively than directly trying to (20%)
"I'd maybe give a 10% probability to long-termism just being wrong."

What could you observe that would cause you to think that longtermism is wrong? (I ask out of interest; I think it's a subtle question.)

alex lawsen
A really convincing argument from a philosopher or group of philosophers I respected would probably do it, especially if it caused prominent longtermists to change their minds. I've no idea what this argument would be, because if I could think of the argument myself it would already have changed my mind.
Eli Rose
Makes sense!

What about a scenario where long-termism turns out to be right, but there is some sort of community-level value drift which results in long-term cause areas becoming neglected, perhaps as a result of the community growing too quickly or some intra-community interest groups becoming too powerful? I wouldn't say this is very likely (maybe 5%), but we should consider the base rate of this type of thing happening.


I realise that this outcome might be subsumed in the points raised above. Specifically, it might be that instead of directly trying to int... (read more)

Great comment. I count only 65 percentage points - is the other third "something else happened"?

Or were you not conditioning on long-termist failure? (That would be scary.)

alex lawsen
I was not conditioning on long-termist failure, but I also don't think my last three points are mutually exclusive, so they shouldn't be naively summed.
Azure
Additionally, is it not likely that those scenarios are correlated?