Vulnerable EAs also want to follow only good norms while disposing of the bad ones!
If you offer people the heuristic "figure out whether a norm is reasonable and only obey it if it is," they will often fail.
You mention clear-cut examples, but oftentimes they will be very grey, or they will seem grey from the inside. There may be several strong arguments why the norm isn't a good one; the bad actor will be earnest, apologetic, and trying to let you have your norm even though they don't believe in it. They may seem like a nice, reasonable person trying to do the right thing in an awkward situation.
Following every norm would be quite bad. Socially enforced gendered cosmetics are disgusting and polyamory is pretty nifty.
Nonetheless, we must recognize that the same process that produces "polyamory is pretty nifty" will also produce in many people: "there's no reason I can't have a friendly relationship with my employer rather than an adversarial one" (these are the words they will use to describe the situation while living in their employer's house) and "I can date my boss if we are both ethical about it."
We must not look down on these people as though we'd never fall for it - everyone has things they'd fall for, no matter how smart they are.
My suggestion is to outsource. Google your situation. Read reddit threads. Talk to friends, DM people who have the same job as you (and who you are certain have zero connection to your boss) - chances are they'll be happy to talk to someone in the same position.
A few asides, noting that these are basics and incomplete.
I think censorship would be a bad choice here, because the EA forum hasn't discussed these concepts previously (in any routine way, I'm sure there is a screed or two that could be dug up from a mound of downvotes) and is unlikely to in the future.
I would agree that race/IQ debates on the EA forum are unlikely to produce anything of value. But it's my experience that if you have free discussion rights and one banned topic, that causes more issues than just letting people say their piece and move on.
I'd also agree that EA isn't meant to be a social club for autists - but from a cynical perspective, the blithely curious and alien-brained are also a strategic resource and snubbing them should be avoided when possible.
If people are still sharing takes on race/IQ two weeks from now, I think that would be a measurable enough detraction from the goal of the forum to support the admins telling them to take it elsewhere. But I would be surprised if it were an issue.
The first statement would be viewed positively by most, the second would get a raised eyebrow and a "And what of it?", the third is on thin fucking ice, and the fourth is utterly unspeakable.
2-4 aren't all that different in terms of fact-statements, except that IQ ≠ intelligence, so some accuracy is lost moving to the last. It's just that the first makes it clear which side the speaker is on, the second states an empirical claim, and the next two look like they're... attacking black people, I think?
I would consider the fourth a harmful gloss - but it doesn't state that there is a genetic component to IQ; that's only in the reader's eye. This makes sense in the context of Bostrom posing outrageous but Arguably Technically True things to provoke the reader.
I think people would be mad at this, because they feel like poor people are being attacked and want to defend them. They would think, 'Oh, you're saying that rich people got there by being so smart and industrious, and if some single mom dies of a heart attack at 30 it's a skill issue.' But no one said that.
And this would be uncontested.
If someone says that, you'd probably assume they were pushing an antivax agenda and raise an eyebrow, even if they can produce a legitimate study showing it. (I don't think one exists; I made up that example.) So I am sympathetic to being worried about agenda-pushing that consists of saying selectively true statements.
Man, this shit is exhausting. Maybe CEA has the right idea here: they disavow the man's words without disavowing the man and then go back to their day.
I worry that most people here don't have timelines, just vibes.
And when AI does something scary, they go, "Look, I was espousing doomy vibes and then AI did something that looks doomy! Therefore I am worth paying more attention to!"
Or, "Hm, I was more into global development but the vibes are AI now. Maybe I should pull my old doomist uniform out of the closet."
If that sounds like something you're doing, reader, maybe reconsider?
I feel like this is pretty important. I think this is basically fine if it's a billionaire who thinks CEA needs real estate, and less fine if it is incestuous funding from another EA group.
Guys, what did we just learn about deontological rules? In this case, honesty.
"Do what you think is ethical, don't be constantly changing your behavior for PR concerns" is going to work out to better PR.
As opposed to speaking with Congressmen, is "prepare a scientific report and meet with the NIH director/his advisors" an at-all plausible mechanism for shutting down the specific research grant Soares linked?
Or if not, becoming NIH peer reviewers?
And—despite valiant effort!—we've been able to do approximately nothing.
I apologize for an amateur question but: what all have we tried and why has it failed?