Yeah, absolutely. There's so much noise, chaos, and uncertainty in the world that I sometimes like the (arguably depressing) frame that the EA project is trying to increase your chance of doing good from 51% to 52%: this is totally worth fighting for, but we should also be clear on how hard it is to know the long-term effects of any action.
Hmm, I'm not sure I have a very considered answer to this question, beyond the main argument: I think it's much harder for people to see animals as having rights or moral value since they look different, are a different species, and often behave in unfamiliar ways that make us more likely to discount their capacity to feel and think (e.g. fish don't talk, scream, or visibly emote).
On some level I think the answer is always the same, regardless of the headwinds or tailwinds: you do what you can with your limited resources to improve the world as much as you can. In some sense, slowing the growth of factory farming in a world where it is growing is equivalent to reducing the number of animals raised in a world where it is stagnant: in both worlds there's a reduction in suffering. I wrote a creative piece on this exact topic here, if that is at all appealing.
I also think that, on the factory farming front, we focus too much on the scale of the entire problem and not enough on how good the wins are in and of themselves.
Hi Sam, I'm finding it hard to respond to your request because, IMO, the scenarios are too vague. To use your basketball metaphor, a specific player is something I can integrate meaningfully into a prediction, but "executing the strategy flawlessly" is much more nebulous. Do you have specific ideas in mind for what scenario 3 might look like? How much increased funding is there? I think a good conditional prediction needs a condition we could clearly decide whether or not we achieved: "raised an extra $50m for the movement" has a clear yes/no, whereas "achieve maximum coordination and efficiency" seems very subjective to me.
Thanks for the answers. It sounds like a big crux between us is that I am sadly much more cynical about (a) how much optimism can shift probabilities: I think it can make a difference, but I don't think it can move them from 10% to 70%; and (b) our chances of ending factory farming by 2060, where I'd probably put the number at around 1-5%.
One thing that strikes me as interesting, when I think about my own experience and my impression of the people around me, is that it can be hard to tell what my own reasons are when I distance myself from EA. I might describe myself as EA-adjacent, and this could be some combination of:
And, as humans often do, I might just tell myself a story that is more flattering than what is actually happening. I might tell myself that this is a very strategic choice to persuade this person to care about AI Safety, or for my long-term career prospects, or to protect my organisation from future scandals, when in reality EA's low(ish) status in some circles right now might be doing the heavy lifting.