andiehansen

12 karma · Joined Sep 2019

Bio

I'm pursuing an economics degree at the University of Alberta, Canada. I've started a small EA club and facilitated EA Virtual Programs several times.

Comments (2)

As an EA group facilitator, I've taken part in many nuanced discussions about the tradeoffs between prioritizing long-term and short-term causes.

Even though I consider myself a longtermist, I now have a better understanding of, and respect for, the concerns that near-term-focused EAs raise. Allow me to share a few of them.

  1. The world has finite resources, so resources directed to long-term causes cannot also be put towards short-term causes. If the EA community were 100% focused on the very long term, for example, then solvable near-term problems affecting millions or billions of people would likely receive less attention and fewer resources, even if they were easy to solve. This matters more as EA grows and has an increasingly outsized influence on where resources are directed. As this post says, marginal reasoning becomes less valid as EA gets larger.
  2. Some long-term EA cause areas may increase the risk of negative outcomes in the near term. For example, people working on AI safety often collaborate with, and even contribute to, capabilities research. AI is already a very disruptive technology and will likely become even more so as its capabilities grow.
  3. People who think "x-risk is all that matters" may be discounting other kinds of risks, such as s-risks (suffering risks) arising from dystopian futures. If we prioritize x-risks while allowing global catastrophic risks (GCRs) to increase (that is, risks that don't wipe out humanity but greatly set back civilization), s-risks also increase, because it is very hard to maintain well-functioning institutions and governments in a world crippled by war, famine, and other problems.

These and other concerns have updated me towards preferring a "balanced portfolio" of resources spread across EA causes reflecting different worldviews, even though my inside view prefers certain causes over others.

See this similar question for other ways to coordinate. As for me, I'm a Canadian in Alberta interested in helping out, whether financially or with figuring out the process. Please reach out and let me know what you have in mind.