Discuss the wiki-tag on this page. Here is the place to ask questions and propose changes.
Maybe in future this entry should draw a bit on discussion (within or outside EA) of "unintended consequences" of the kinds described here.
I am confused as to how this relates to trajectory changes (https://forum.effectivealtruism.org/tag/trajectory-changes). When Beckstead (2013) talks about ripple effects, I understand him to be talking about trajectory changes, i.e., a certain class of interventions which might be very effective for longtermists compared to x-risk mitigation. Independently of this, and of whether one agrees with longtermism, it might still be relevant to think about info hazards and replaceability (the bullet points). I would suggest that the first paragraph be moved to trajectory changes instead. Sorry if I have overlooked something.
The first sentence of this article had been:
Indirect long-term effects (also called flow-through effects (Karnofsky 2013; Karnofsky et al. 2013; Shulman 2013; Wiblin 2016), ripple effects (Beckstead 2013; Whittlestone 2017), knock-on effects (Gaensbauer 2016; Greaves 2016; Snowden 2017) and cascading effects) are effects on the long-run future from interventions targeted at the short-term.
But many of the terms in brackets were not necessarily limited to effects on the long-run future from interventions targeted at the short-term. E.g., I think some or all of those terms could've also been used to describe things like unintended effects in the coming decades of bednet distribution, such as (maybe) more meat consumption, more greenhouse gas emissions, more economic growth, or more innovation.
The sentence also packed a lot of information into brackets midway through.
So I've now split it into two and tweaked it to be more consistent with the idea that those other terms might not be about a totally identical concept.
Thank you, that looks good.
I am copying below the original contents of the 'Future considerations' article, which we decided to delete for being redundant, in case some of it should be incorporated here, or into some other article.
The value of any action taken today will depend on what happens in the future. This is of course true in a trivial sense. For instance, if we were to discover that there was a meteor hurtling for Earth, and that humanity had only a few years of life left, then this would decrease the expected value of work on climate change.
However, future events can also determine the value of present actions in more subtle ways. First, some actions taken to address present-day problems may turn out to have long-term indirect effects that dwarf their short-term impact. For example, work that lessens the burden of disease in the developing world could have an economic impact that compounds across generations.
Second, the value of progress on many present-day problems will depend on how the severity of the problems and the attention they receive evolve over time. If synthetic meat will make factory farming disappear, for instance, then this could lessen the value of present efforts to end factory farming.
Third, it is possible that some of the highest-value actions available today, such as actions to combat climate change, are ones that will not have any payoff until significantly in the future.
Considerations related to the long-term future and new transformative technologies may be particularly decision-relevant.