I want to draw attention to a tension effective altruists have not dealt with:
- Almost all of our written output rests on the strong assumption that economic growth and technological advancement are good things.
- Many intellectuals think this is actually unclear.
- We invent dangerous new technologies sooner, while society remains unwise, immature and unable to use them safely. Put another way, dangerous new technologies advance from possibility to reality more quickly, giving us less time to evaluate and limit their risks. For more on this, see section 9.4; 2, 3.
- We become richer, which allows for e.g. more destructive conflicts (poorer countries have weaker and less destructive armies).
- Producing more wealth is currently doing more harm than good (e.g. via climate change, other environmental destruction, or the spread of factory farming and selfish materialism).
- These views violate common sense for most people.
- The arguments in their favour are hard to explain quickly.
- Over the last 200 years growth seems to have been a force for good, so suggesting that something that was good in the past will not continue to be good in the future makes you look ignorant or deluded.
- They involve speculation about the direction of future technologies that most people find unpersuasive and unrigorous.
- They can have offensive implications, such as the idea that it would be better for people in poverty today to remain poor, and that the things most people do to improve the world aren't working or are even making things worse.
- Our ability to further raise economic growth or technological advancement is small anyway: billions of people are already pursuing those goals, so we are only a tiny fraction of the total effort.
- Projects focussed on reducing poverty also raise average global intelligence, education, income, governance, patience, and so on. These 'quality' effects may well dominate.
- Other modelling suggests the overall effect is very unclear (e.g. wars seem to occur less frequently when economic growth is strong; faster growth lowers the number of years spent in any particular state of development, lowering so-called 'state risk', as sketched after this list; some technologies clearly lower existing risks, e.g. we could now divert an asteroid away from Earth).
- Imagine that you somehow knew that economic growth, or technological advancement, was merely neutral on average. This is controversial, but some smart people believe it. Would your project nonetheless be one of those that is 'better than average', and therefore a force for good?
- Some things that have been suggested to look good on the 'differential technological development' test include:
- making people more cosmopolitan, kind and cautious;
- improving the ability to coordinate countries and avoid e.g. prisoner's dilemmas;
- increasing wisdom (especially the ability to foresee and solve future problems and conflicts);
- predominantly reducing pressing existing risks, such as climate change;
- predominantly empowering the people with the best values.
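To make the 'state risk' point above concrete, here is a minimal sketch of the arithmetic, assuming, purely for illustration, a constant annual catastrophe risk p while the world sits at a given level of development (an assumption not made explicit above):

```latex
% Cumulative 'state risk' from spending T years at a given level of
% development, assuming (for illustration only) a constant annual
% catastrophe risk p at that level:
R(T) = 1 - (1 - p)^{T}
```

Since R(T) increases with T, anything that shortens the time spent at a risky level of development (such as faster growth) lowers this term: for instance, with p = 0.001 per year, cutting the period from 100 to 50 years reduces R from roughly 9.5% to 4.9%. This is only one term in the overall calculation, which is part of why the net effect is so unclear.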
Also related, on Facebook: https://www.facebook.com/yudkowsky/posts/10151665252179228
Eliezer recently posted advice for central banks on how to accelerate economic growth. I'm not sure if that means he has changed his mind. (Maybe he's deliberately giving them bad advice.)