The idea of “value drift” strikes me as resting on the naive assumption that people have altruistic values.
I don’t think that’s how people work. I think people follow their local incentives, and a line can always be traced from their actions to some kind of personal benefit.
This is not a cynical idea. It is a tremendously hopeful idea. It all adds up to normality. The fact that we see people act altruistically all the time means that it is possible to align selfish interests with the “good”, and that our environments are already shaped in such a way. It suggests that good outcomes are simply a matter of creating and upholding the right incentive structures. Heaven will follow by itself.
So given that people follow their selfish incentives, why do they drift away from the EA community?
Here’s another idea: motivation is always relative. People aren’t propelled toward something in proportion to its objective value. They’re only as motivated as the next best alternative is bad. If your plan B becomes better, your plan A is suddenly not as interesting anymore, even if its “objective value” is still the same. People might try to explain their behavior, lamenting that they’ve changed. Maybe they just got better options.
How does this relate to value drift? The naive model might be that you have some variables in your head. Each one of them gives some numeric value to some virtuous and lofty good, like “global health” and “the long term future” and “freedom of speech” and whatnot.
I’d like to propose that the model is more like this: the variables are there, but they don’t point to virtuous and lofty goods. They point to things about you. Power. Survival. Prestige. Odds of procreation. The values are highly stable, and motivation only really changes as the environment does.
And there’s absolutely positively nothing bad about this whatsoever. It all adds up to normality. In fact these “degenerate” values even add up to something as noble as EA. Greed is good, as long as it’s properly channeled.
Value drift isn’t some inexplicable shift in attitude. It’s a shift in perceived incentives. Correctly perceived or not.
One might have gotten involved in EA on the heuristic that maximizing impact will lead to the highest prestige. They might have learned in EA that this heuristic doesn’t always work. Maybe they weren’t praised enough. Maybe they found that there was too much competition to be noticed. Maybe they found that the world doesn’t necessarily reward good intentions, so they dropped them.
Maybe they left because there wasn’t much more to learn. Maybe they left because they felt threatened by weird political ideas. Maybe they felt censored. Maybe they found a different social environment that was much more rewarding. Maybe they felt like the establishment of EA wasn’t taking them seriously.
I don’t have a simple answer, but the current concept of “value drift” is very much a pet peeve of mine. We have to account for people’s selfish sides. As long as we’re in denial about that, we won’t get as much out of their altruistic sides.
So I propose we call it incentive drift instead.