Can you explain what you mean by "contextualizing more"? (What a curiously recursive question...)
I mean it in this sense: making people think you're not part of the outgroup and don't hold objectionable beliefs adjacent to the ones you actually hold, in whatever way is sensible and honest.
Maybe LW is better at using the disagree button, as I find it's pretty common there for unpopular opinions to get lots of upvotes alongside disagree votes. One could use the API to check whether the karma-agreement correlations differ between the sites; a rough sketch is below.
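Here is a minimal sketch of what that check could look like in Python. The GraphQL endpoint URL is real, but the view name (`recentComments`) and the field names (`baseScore`, `extendedScore`, `agreement`) are assumptions about the forum schema and should be verified against the live API before trusting any numbers:

```python
# Rough sketch: estimate the correlation between karma and agreement votes
# on recent LW comments. The endpoint is real; the view name and field
# names below (baseScore, extendedScore, agreement) are assumptions about
# the schema -- check the live API before relying on the output.
import requests
from scipy.stats import pearsonr

ENDPOINT = "https://www.lesswrong.com/graphql"

QUERY = """
{
  comments(input: {terms: {view: "recentComments", limit: 500}}) {
    results {
      baseScore       # overall karma (assumed field name)
      extendedScore   # agreement-vote data, if exposed (assumed field name)
    }
  }
}
"""

resp = requests.post(ENDPOINT, json={"query": QUERY}, timeout=30)
resp.raise_for_status()
comments = resp.json()["data"]["comments"]["results"]

# Keep only comments that actually carry agreement-vote data
# (older comments predate the two-axis voting system).
pairs = [
    (c["baseScore"], c["extendedScore"].get("agreement", 0))
    for c in comments
    if c.get("extendedScore")
]
karma, agreement = zip(*pairs)
r, p = pearsonr(karma, agreement)
print(f"karma vs. agreement: r={r:.2f} (p={p:.3f}, n={len(pairs)})")
```

Running the same query against the EA Forum's endpoint and comparing the two correlation coefficients would answer the question directly.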
IMO the real answer is that veganism is not an essential part of EA philosophy; it just happens to be correlated with EA due to the large number of people in animal advocacy. Most EA vegans and non-vegans alike think their diet is a small portion of their impact compared to their career, and it's not even close: every time you spend an extra $5 finding a restaurant with a vegan option, you could have helped 5,000 shrimp instead. Vegans have other reasons to be vegan, like non-consequentialist ethics, virtue signaling or self-signaling, or just a desire not to eat the actual flesh and body fluids of tortured animals.
If you have a similar emotional reaction to other products, it seems completely valid to boycott them, although as you mention there can be significant practical burdens, both in adjusting one's lifestyle to avoid such products and in judging whether the claims of marginal impact are valid. Being vegan is not obligatory in my culture, and neither should boycotts be, unless the marginal impact of the boycott is larger than that of any other life choice, which is essentially never the case.
I really enjoyed reading this post; thanks for writing it. I think it's important to take space colonization seriously and shift into "near mode", given that, as you say, the first entity to start a Dyson swarm has a high chance of gaining a decisive strategic advantage (DSA) if the outcome isn't already decided by AGI, and it's probably only 10-35 years away.
Assorted thoughts
When 80,000 Hours pivoted to AI, I largely stopped listening to the podcast, figuring that as part of the industry I would already know everything. But I recently found myself driving a lot and consuming more audio content, and the recent episodes, e.g. with Holden, Daniel K, and ASB, are incredibly high quality and contain highly nontrivial, grounded opinions. If they keep this up, I will probably keep listening until the end times.
What inspiring and practical examples!
Maybe a commitment to impact causes EA parents to cooperate to maximize it, which means distributing the parenting workload optimally regardless of what society thinks. In EA, with its many conferences and hardworking, impactful women, it makes sense that the man's opportunity cost is often lower. Elsewhere, couples cooperate to maximize income, but men tend to have higher earning potential, so the woman might often end up doing more childcare anyway.
My sense is that parenting falls on the woman due not only to gender norms but also to higher average interest in childcare and other confounders, so I wonder how much of the effect is driven by things like EAs leaning liberal, questioning social expectations in general, or EA dads somehow being more keen on parenting. It's also unclear whether EA men even contribute more than non-EA men.
I'm reminded a bit of the gender-equality paradox: in the USSR, and maybe also in countries with restrictive gender roles [1], there are higher rates of women in STEM and other male-dominated fields. The idea is that in liberal societies a disparity arises from differences in interest, while certain external factors can reduce disparities on net: in the Soviet case because equality was enforced by the state, in other cases because of economic incentives or a lack of Western stereotypes. So the EA mindset is perhaps one of these external factors, not to imply it's anything like Soviet central planning.
[1] The research on this seems disputed.
I think the "most topics" thing is ambiguous. There are some topics on which mainstream experts tend to be correct and some on which they're wrong, and although expertise is valuable on topics experts think about, they might be wrong on most topics central to EA. [1]
In the real world, assuming we have more than five minutes to think about a question, we shouldn't blindly "defer" to experts or immediately "embrace contrarian views"; rather, we should draw on their expertise and reject it when appropriate. Since this wasn't an option in the poll, my guess is that many respondents just expressed how much they like being contrarian, and since EAs often have to be contrarian on the topics they think about, the poll came out in favor of contrarianism.
[1] Experts can be wrong because they don't think in probabilities, because they lack imagination, because there are obvious political incentives to say one thing over another, and probably for other reasons. Also, many of the central EA questions don't have well-developed scientific fields around them, so many of the "experts" aren't people who have thought about similar questions in a truth-seeking way for many years.