Wiki Contributions


Objections to Value-Alignment between Effective Altruists

I doubt organisations would attend the forums if it did not influence their decision-making afterwards. It is exactly the type of meeting I would love to see more transparency around.

Objections to Value-Alignment between Effective Altruists

I think I should have stated more clearly that I don't see these tendencies as abnormal. I see them as maladaptive given the goal EA has. When thinking about the question of whether fandom is a good feature for epistemic health, I don't care too much about whether fandom tendencies exists in other communities. I know that it's the norm (same with hierarchy and homogeneity).

It can be quite effective to have such a community structure in situations where you want to change the minds of many people quickly. You can simply try to change the mind of the one whom others look up to (e.g. Toby Ord / Y. Bengio) and expect that other members will likely follow (see the models in 'The Misinformation Age' by C. O'Connor & J. Weatherall). I imagine a process of belief formation that does not rely on central stars will converge less quickly, but I'd have to look into that. This is the kind of research which I hope this article makes palatable to EAs.

My guess is that there is not only a sweet spot of cognitive diversity but also a sweet spot of how much a community should respect its central stars. Too much reverence and you lose feedback mechanisms; too little and belief formation will be slow and confused, and you lose the reward mechanism of reputation. I expect that in any community there will always be individuals who deserve more respect and admiration than others, because they have done more or better work on behalf of everyone else. But I would love for EAs to examine where the effective sweet spot lies and how one can influence the level of fandom culture (e.g. Will's recent podcast episode on 80k was doing a good job, I think) so that the end result is a healthy epistemic community.

Objections to Value-Alignment between Effective Altruists

Such a great comment — I agree with most of what you say; thank you for writing this up. I'm curious about a formal mechanism for communal belief formation and belief dissemination. What would this look like? Would it be net good in comparison to the baseline?

Objections to Value-Alignment between Effective Altruists

1. I never spoke specifically of corporate advocates, so while I agree with you that other motives are often at play, my point here was that one reason some advocates support traditional diversity is that they have reason to believe it tracks different views on the world. That is not mutually exclusive with the reasons you outline, and this article is not about corporate motivation in any case.

2. As you quote, I state that this list is 'non-exhaustive'. If the prominent EAs who are not on this list agree that reverence is not good for a community's epistemic health, then they should not even want to be on it. After publishing this article I was also notified of prominent female EAs who might have made this list, but since I only listed individuals whom I directly experienced being talked about in a revered manner, they are not listed. My experience won't generalise to all experiences. My two points here are: there are revered individuals, and they are mostly male. I agree there are likely a few revered women, but I would be surprised if they are numerous enough to balance out the male bias.

3. Fair point. I find it hard to tell how much things have changed and simply wanted to point out some evidence I came across while writing.