The page "Problem areas beyond 80,000 Hours' current priorities" mentions "Broadly promoting positive values".
I have some questions:
What are the values that are needed to further EA's interests?
Where (in which cultures or areas of culture at large) are they deficient, or where might they become deficient in the future?
The same page mentions "altruism" and "concern for other sentient beings". Maybe those are the two values that EA is most essentially concerned with. If so, what supporting values are needed to maximize them?
I think a healthy dose of moral uncertainty (and normative uncertainty in general) is really important to have, because it seems pretty easy for any ethical or social movement to become fanatical or to develop a radical element, and end up doing damage to itself, its members, or society at large. ("The road to hell is paved with good intentions" and all that.)
A large part of what I found attractive about EA is that its leaders emphasize normative uncertainty so much in their writings (starting with Nick Bostrom back in 2009), but perhaps it isn't "proselytized" as much as it should be in day-to-day discussion.