All of James_Clough's Comments + Replies

I agree that the 'people counteract your action' vs 'people don't' axis and the 'systemic' vs 'atomic' axis are different - but I think that there's a strong correlation between the two. Of course any intervention could have people working to counteract it, but I think these counter-actions are much more likely for systemic-type interventions.

This is because many systemic interventions have the property that, if a large majority of people agreed the intervention was a good idea, it would be easy to accomplish, or would have already been accomplished. This …

In many cases a big concern with systemic change is that, especially when political, it involves playing zero-sum or even negative-sum games. For example, if I think that some international legal reform X is useful, but you think it would be detrimental, we might both donate money to campaigns fighting for our side of the issue and cancel each other out, meaning the money is wasted. It would have been better for us to realise this before donating to the political campaigns and give our money elsewhere.
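To make the "cancelling out" point concrete, here's a toy numerical sketch. The budgets, the assumption that opposing campaign donations cancel dollar-for-dollar, and the value-per-dollar figure for an uncontested charity are all illustrative assumptions, not part of the original argument:

```python
# Toy illustration: two donors with opposing views on reform X,
# each controlling a budget of $1,000 (assumed figures).

BUDGET = 1_000          # dollars each donor can give
CHARITY_VALUE = 2.0     # assumed value units per dollar at an uncontested charity

def campaign_outcome(pro_dollars: float, anti_dollars: float) -> float:
    """Net push toward reform X, assuming opposing donations cancel dollar-for-dollar."""
    return pro_dollars - anti_dollars

# Scenario 1: both donate to opposing campaigns on X.
net_shift = campaign_outcome(BUDGET, BUDGET)   # 0 -- the $2,000 achieves nothing
value_scenario_1 = abs(net_shift)              # 0 units of change, by either side's lights

# Scenario 2: moral trade -- both agree to skip the fight and give elsewhere.
value_scenario_2 = 2 * BUDGET * CHARITY_VALUE  # 4,000 value units, uncontested

print(f"Campaign fight: net policy shift = {net_shift}, value created = {value_scenario_1}")
print(f"Moral trade:    value created = {value_scenario_2}")
```

Under these (deliberately simple) assumptions, the fight burns $2,000 for no change either side wants, while the trade produces value both sides can endorse.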

Note this is not the same as just saying that people might …

MichaelPlant · 6y
I don't think this is quite right. The distinction you seem to be drawing on is 'people counteract your action' vs 'people don't', rather than 'systemic' vs 'atomic'. An example of two atomic interventions counteracting each other would be saving lives and family planning to reduce population size: the latter wants there to be fewer people overall, while the former keeps more people alive. Hence there's a natural tension there (although both could be good under certain circumstances and views).

It's true we need to consider whether people will counteract us. However, the scenario you suggest, where it would be better for us, who are for legal reform X, to engage in a moral trade with those who are against us, with both sides agreeing to do something else, actually requires that we can get the other side to agree. If we can't get the other side to agree to a moral trade, we need to think "what is my counterfactual impact given they'll fight me?" vs "what is the counterfactual impact of other stuff I could do?"

You're right to point out that it could be the case that if you do X, people will try to make not-X happen, whereas if you hadn't tried to do X, they would have done Y instead, where Y is a positive outcome. But that could apply to both systemic and atomic interventions: if I spend money saving lives, someone concerned about overpopulation could marginally step up their donations to thwart me.