Prof. David Thorstad has a new blog post out about how effective altruists sometimes use examples in misleading ways to argue for EA causes. Looking more carefully at those examples often undermines the arguments they are meant to support. I've cross-posted this post here because I think it highlights the need for EAs to develop stronger arguments about existential risks.
Here's the summary from the introduction:
Today, I want to look at the role of examples in discussions by effective altruists. Effective altruists offer many striking examples of small changes that could be made to yield large benefits for current or future people. We are invited to imagine that these diamonds in the rough can be easily mined, if only a courageous reader has the will (and funds) to help.
The problem is that diamonds in the rough are rare. Many of the examples presented by effective altruists are substantially more complex than they appear. Once these examples are unpacked, it is no longer obvious that they support the original point being made. Let’s look at two ways that examples are misused, and passed down across texts, in discussions of biological weapons and bioterrorism.
For example, EAs like Toby Ord often suggest "just enforce the Biological Weapons Convention" as a quick fix to reduce the risks from bioweapons. But in reality, international actors have tried to beef up the BWC before, and many different countries have resisted it:
Many texts written by effective altruists give the impression that mitigating biorisk would be deceptively simple: just give more money to the body in charge of enforcing the Biological Weapons Convention, an international treaty prohibiting the stockpiling of many types of biological weapons....
The thing about diamonds in the rough is that they tend to be rare and hard to find. When someone tells you they are the first one to find a diamond, it is worth digging around and asking whether others have found it first but judged it to be a dud.
In fact, the weakness of the Biological Weapons Convention is extremely well-known and has survived persistent efforts at correction. The problem is not that no willing donor has found a few million dollars to pony up for a good cause. The problem is that there are strong political obstacles to enforcing the Biological Weapons Convention, and these political obstacles reveal clear downsides to increased enforcement that never seem to show up on effective altruists’ ledgers.
This doesn't mean that strengthening the BWC is not worthwhile, but rather that it is much more difficult than many EAs believe.
On Aum Shinrikyo:
It might seem easy to wheel out examples of omnicidal bioterrorists. But oddly enough, effective altruists always seem to wheel out the same example. Aum Shinrikyo is a Japanese doomsday cult that carried out a series of sarin gas attacks in the 1990s, based on a belief in the need to bring about a cleansing Armageddon in which non-believers would be killed.
Many effective altruists, including several who should (and probably do) know better, suggest that Aum Shinrikyo was omnicidal, aiming to destroy all living humans....
...it would be quite difficult to replace the example of Aum Shinrikyo (hence the repeated reliance of different authors on this example).
The problem is that it is simply not true to say that Aum Shinrikyo wanted to bring about human extinction. They wanted to bring about the extermination of nonbelievers: the followers of Aum were very much meant to survive....
Aum Shinrikyo did not merely intend to survive the apocalypse. They intended to rebuild their own communities, then all of Japan, and then the entire world.
One can quibble with this counterargument. For example, I would rejoin that even if a terrorist group like Aum Shinrikyo intended to kill everyone but themselves, they might overshoot and kill off humanity anyway. However, the fact that the only example EAs tend to cite of a supposedly omnicidal terrorist group was not actually omnicidal suggests that genuinely omnicidal terrorist groups are not a credible existential threat.