I didn't write about them because, as opposed to 80k:
This is a good point, sorry for getting back to it so late.
One idea I cut from the post: I think scope insensitivity means we should be suspicious of our gut intuitions in situations dealing with lots of people, so I think that’s another point in favor of accepting the RC. My main goal with this point was to suggest this central idea: “sometimes trust your ethical framework in situations where you expect your intuition to be wrong.”
That being said, the rest of your point still stands.
From my (new since you asked this) reply to Kirmani’s comment:
I’m advocating for updating in the general direction of trusting your small-scale intuition when you notice a conflict between your large-scale intuition and your small-scale intuition.
Honestly, it’s a pretty specific argument/recommendation, so I’m having trouble thinking of another example that adds something. Maybe the difference between how I feel about my dog vs farmed animals, or near vs far people. If you’d like, or if it would help you or someone else, I can spend some more time thinking of one.
I’m advocating for updating in the general direction of trusting your small-scale intuition when you notice a conflict between your large-scale intuition and your small-scale intuition.
Specifically:
In response to “Shut Up and Divide:”
I think you should favor caring more (shut up and multiply) over caring less (shut up and divide) because your intuitive sense of caring evolved when your sphere of influence was small. A tribe might have at most a few hundred people, which happens to be ~where your naive intuition stops scaling linearly.
So it seems like your default behavior should be extended to your new circumstances, rather than your new circumstances being shrunk to fit your default state.
(Although I think SUAD might be useful for not getting trapped in...)
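To make the scaling claim concrete, here’s a minimal Python sketch of that picture. The 300-person knee and the logarithmic tail are illustrative assumptions standing in for “a few hundred people,” not anything measured:

```python
import math

TRIBE_SCALE = 300  # assumed point where naive intuition stops scaling linearly

def felt_caring(n: int) -> float:
    """Gut-level caring: tracks headcount up to tribe scale, then flattens."""
    if n <= TRIBE_SCALE:
        return float(n)
    # Beyond tribe scale, assume only logarithmic growth (scope insensitivity).
    return TRIBE_SCALE * (1 + math.log(n / TRIBE_SCALE))

def multiplied_caring(n: int) -> float:
    """'Shut up and multiply': every additional person counts fully."""
    return float(n)

for n in (10, 300, 10_000, 1_000_000):
    print(f"n={n:>9,}  felt={felt_caring(n):>12,.0f}  "
          f"multiplied={multiplied_caring(n):>12,.0f}")
```

The gap between the two columns at large n is the argument in miniature: extending your default behavior to your new circumstances means following the linear line, not the flattened one.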
Your argument proves too much:
FWIW, this has worked for me too. I got hired this summer (as a college freshman) because I was impressed by and interested in some GPT-3 stuff that Peter Wildeford was doing on Twitter and wanted to try it myself. Those tweets got me hired!
TLDR: Tweet about interesting stuff and reply to people you think are smart!
My justification is pretty simple:

1. I like being happy and not having malaria and eating food.
2. I appear to be fundamentally similar to other people.
3. Therefore, other people probably want to be happy and not have malaria and have food to eat.
4. I don’t appear to be special, so my interests shouldn’t be prioritized more than my fair share.
5. Therefore I should help other people more than I help myself, because there are more of them and they need more help.
I'd pay you $10 for it.