"Problem areas beyond 80,000 Hours' current priorities" mentions "Broadly promoting positive values".
I have some questions:
What are the values that are needed to further EA's interests?
Where (in which cultures or areas of culture at large) are they deficient, or where might they become deficient in the future?
"Problem areas..." mentions "altruism" and "concern for other sentient beings". Maybe those are the two values EA is most essentially concerned with. If so, what supporting values are needed to maximize them?
This is an interesting question.
One possible value is something like intrinsically valuing Truth, or Better Reasoning. Perhaps also something like Productivity/Maximisation. The rationality community is perhaps a good example of a group that promotes such values explicitly.
Promoting instrumental values feels somewhat double-edged: it can cause all sorts of trouble if they are misinterpreted, or if their promotion is too successful.
What do you think are the important values?
I'm basically an outsider to EA, but "from afar", I would guess that some of EA's values are 1) being against politicization, 2) being for working and building rather than fighting and exposing ("exposing" being "saying the unhealthy truth for truth's sake", I guess), 3) valuing knowing and self-improvement (your point), and 4) concern for effectiveness (Gordon's point). And of course, the value of altruism itself.
These seem relatively safe to promote (unless I'm missing something).
Altruism is composed of 1) other...