Given the radical uncertainty of the more distant future, we can’t know how to achieve preferred goals with any kind of certainty over longer time horizons. Our attachment to particular means should therefore be highly tentative, highly uncertain, and radically contingent.
Our specific policy views, though we may rationally believe them to be the best available, will stand only a slight chance of being correct. They ought to stand the highest chance of being correct of all available views, but this chance will not be very high in absolute terms. Compare the choice of one’s politics to betting on the team most favored to win the World Series at the beginning of the season. That team does indeed have the best chance of winning, but most of the time it does not end up being the champion. Most of the time our sports predictions are wrong, even if we are good forecasters on average. So it is with politics and policy.
Our attitudes toward others should therefore be accordingly tolerant. Imagine that your chance of being right is three percent, and your corresponding chance of being wrong is ninety-seven percent. Each opposing view, however, has only a two percent chance of being right, which of course is a bit less than your own chance of being right. Yet there are many such opposing views, so even if yours is the best, you’re probably still wrong. Now imagine that your wrongness will lead to a slower rate of economic growth, a poorer future, and perhaps even the premature end of civilization (not enough science to fend off that asteroid!). That means your political views, though they are the best ones out there, will have grave negative consequences with probability .97 (one minus three percent, the latter being the chance that you are right on the details of the means-end relationships). In this setting, how confident should you really be about the details of your political beliefs? How firm should your dogmatism be about means-end relationships? Not very; better to adopt a tolerant demeanor and really mean it.
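The toy arithmetic in that passage can be made explicit in a few lines of Python. The probabilities are the passage's illustrative numbers, not empirical estimates:

```python
# Illustrative numbers from the passage above, not empirical estimates.
p_mine = 0.03                       # your view: the single most likely to be right
p_rival = 0.02                      # chance any one opposing view is right
p_wrong = 1 - p_mine                # you are still probably wrong: 0.97
n_rivals = (1 - p_mine) / p_rival   # rival views this implies: 48.5

print(f"chance the best available view is wrong: {p_wrong:.2f}")
print(f"implied number of rival views: {n_rivals}")
```

The point of the calculation is that holding the single most probable view is fully compatible with a roughly 97 percent chance of being wrong, since the remaining probability mass is spread across dozens of rivals.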
As a general rule, we should not pat ourselves on the back and feel that we are on the correct side of an issue. We should choose the course that is most likely to be correct, keeping in mind that at the end of the day we are still more likely to be wrong than right. Our particular views, in politics and elsewhere, should be no more certain than our assessments of which team will win the World Series. With this attitude political posturing loses much of its fun, and indeed it ought to be viewed as disreputable or perhaps even as a sign of our own overconfident and delusional nature.
Stubborn Attachments, Chapter 6—Must uncertainty paralyze us?
In strict Bayesian terms, most innovators are not justified in thinking that their new ideas are in fact correct. Most new ideas are wrong, and the creator’s “gut feeling” that he is “onto something” is sometimes as epistemologically dubious as the opinion of the previous scientific consensus. Yet we still want them to promote these new ideas, even if most of those ideas turn out to be wrong.
In this view, the so-called “reasonable” people are selfishly building up their personal reputations at the expense of scientific progress. They are too reasonable to generate new ideas.
To put it another way, there are two kinds of truth-seeking behavior:
- Hold and promote the view which leads to society most likely settling upon truth in the future, or
- Hold and promote the view which is most likely to be correct.
These two strategies coincide less than many people think.
"Are there reasons to be dogmatic?", Marginal Revolution.
I once described myself in a book as a two-thirds utilitarian. Someone asked me if I had changed my views since then, and I think by now I'm down to 63 percent, not two-thirds anymore. I was joking, but also not joking.
That said, I find the sharper versions—you could almost call them the more dogmatic versions of effective altruism—the most effective. If you just smush everything into the big jumble of pluralistic values, it's just another movement doing a bunch of charity, which I'm fine with. But I think there's something sharp and legible about wanting to apply the utilitarian calculation to every decision that I find very useful and invigorating, even though it's not actually my own view.
So I don't think I'm actually rooting for effective altruism to become more like my views. I want it to stay more different from my views. In that sense, I want there to be this greater intellectual diversity. And I think it's been effective precisely because it's been somewhat extreme and somewhat forcing.
And again, a lot of this is a reaction, an understandable reaction, to SBF and those issues. People say, oh, we need to speak up and show we care about all these values, and I agree with that. But I'd actually rather hear people ask: is there some chance that by defrauding everyone and giving away the money for a short period of time, he did more good than harm? I'd much rather have people debating that question. That's the kind of sharp question that's actually needed at the margin, not: “how do we all become a kind of goody two-shoes who embrace all values in response to any bad publicity that pops up?”