It'd be cool if the forum had a commenting feature similar to Google Docs, where comments and subcomments are attached directly to sentences in the post. Readers would then be able to opt in to see the discussion for each point on the side while reading the main post. Users could also choose to hide the feature to reduce distractions.
For comments that respond directly to particular points in the post, this feature would be more efficient (for both reading and writing) than the current standard, since commenters wouldn't have to spend words specifying exactly what they're responding to.
Forum suggestion: an option to publish your post as "anonymous" or blank, which then reverts to reveal your real forum name after a week. This would be an opt-in feature that lets new and established authors get less biased feedback on their posts, and lets readers read the posts with less bias from how they feel about the author.
At the moment, information cascades amplify the number of votes established authors get based on their reputation. This has both good (readers are more likely to read good posts) and bad (readers are less likely to read unusual perspectives, and good newbie authors have a harder time getting rewarded for their work) consequences. The anonymous posting feature would redistribute the benefits of cascades more evenly.
I don't think the net benefit is obvious in this case, but it could be worth exploring and testing.
This is a feature we've been considering for a while! Thanks for sharing the idea and getting some upvotes as additional evidence.
I can't promise this will show up at any particular time, but it is a matter of active discussion, for the reasons you outlined and to give people a third option alongside publishing something on their own account and publishing on a second, pseudonymous account.
Correct me if I'm wrong, but I think in Christianity, there's a lot of respect and positive affect for the "ordinary believer". Christians who identify as "ordinary Christians" feel good about themselves for that fact. You don't have to be among the brightest stars of the community in order to feel like you belong.
I think in EA, we're extremely kind, but we somehow have less of this. Like, unless you have two PhDs by the age of 25 and you're able to hold your own in a conversation about AI alignment theory with the top researchers in the world... you sadly have to "settle" for menial labour with impact hardly worth talking about. I'm overstating it, of course, but am I wrong?
I'm not saying ambition is bad. I think shooting for the stars is a great way to learn your limits. But I also notice a lot of people suffering under intellectual pressure, and I think we could collectively be more effective (and just feel better) if we had more... room for "ordinary folk dignity"?
My experience as a non-PhD who dropped out of EA things for two years before returning is that I felt welcome and accepted when I started showing up in EA spaces again. And now that I've been at CEA for three years, I still spend a lot of my time talking to and helping out people who are just getting started and don't have any great credentials or accomplishments; I hope that I'm not putting pressure on them when I do this.
That said, every person's experience is unique, and some people have certainly felt this kind of pressure, whether self-imposed as a result of perceived community norms or thrust upon them by people who were rude or dismissive at some point. And that's clearly awful — people shouldn't be made to feel this way in general, and it's especially galling to hear about it sometimes happening within EA.
My impression is that few of these rude or dismissive people are themselves highly invested in the community, but my impression may be skewed by the relationships I've built with various highly invested people in the job I now have.
Lots of people with pretty normal backgrounds have clearly had enormous impact (too many examples to list!). And within the EA spaces I frequent, there's a lot of interest and excitement about people sharing their stories of joining the movement, even if those people don't have any special credentials. The most prominent example of this might be Giving What We Can.
I don't understand the "menial labor" point; the most common jobs for people in the broader EA community are very white-collar (programmers, lawyers, teachers...). What did you mean by that?
Personally, the way I view "ordinary folk dignity" in EA is through something I call "the airplane test". If I sat next to someone on an airplane and saw them reading Doing Good Better, and they seemed excited about EA when I talked to them, I'd be very happy to have met them, even if they didn't have any special ambitions beyond finding a good charity and making occasional donations. There aren't many people in the world who share our unusual collection of values; every new person is precious.
Nono, I'm not trying to point to a problem of EAs trying to make others feel unwelcome or dumb. I think EA is extremely kind, and almost universally tries hard to make people feel welcome. I'm just pointing to the existence of an unusually strong intellectual pressure, perhaps combined with lots of focus on world-saving heroes and talk about "what should talented people do?"
I think ambition is good, but I think we can find ways of encouraging ambition while also mitigating at least some of the debilitating intelligence-dysphoria many in our community suffer from.
I'm writing this in reaction to conversations with three of my friends who suffer under the intellectual pressure they feel. (Note that the following quotes are all about the intellectual pressure they get from EA, and not just pressure from academic life in general.)

Friend1: "EA makes me feel real dumb XD i think i feel out of place by being less intelligent"
Friend2: "I’m not worried that I’m not smart, but I am worried that I am not smart enough to meet a certain threshold that is required for me to do the things I want to do. ... I think I have very low odds of achieving things I deeply want to achieve. I think that is at least partially responsible for me being as extremely uncomfortable about my intelligence as I am, and not being able to snap out of it."
Me: "Do you ever refrain from trying to contribute intellectually because you worry about taking up more attention than it's worth?"
Friend3: "hmm, not really for that reason. because I'm afraid my contribution will be wrong or make me look stupid. wrong in a way that reflects negatively on me -- stupid errors, revealing intellectual or character weakness."
Some of this is a natural and unavoidable result of the large focus EA places on intellectual labour, but I think it's worse than it needs to be. I think some effort to instil some "ordinary EA dignity" into our culture wouldn't hurt. I might have a skewed sample, however.
And to respond to your question about what I meant by "menial labour": I was being poetic. I just mean that I feel EA places a lot of focus on the very highest-status jobs, and I've heard friends despairing at having to "settle" for anything less. I sense this type of writing might not be the norm for EA shortform, but I wasn't sure.
(I no longer endorse this post.)
A way of reframing the idea "we are no longer funding-constrained" is "we are bottlenecked by people who can find new cost-effective opportunities to spend money on". If this is true, we should plausibly stop donating to funds that can't give out money fast enough anyway, and instead give to the orgs/people/causes we personally estimate need more money now. Maybe we should adjust upward how relevant we think personal information is to our altruistic spending decisions.
Is this right? And are there any good public summaries of the collective wisdom fund managers have acquired over the years? If we're bottlenecked by people who can find new giving opportunities, it would be great to promote the related skills, and I'd love to read those summaries.
FWIW, I think personal information is very relevant to giving decisions, but I also think the meme "EA is no longer funding-constrained" perhaps lacks nuance that's especially relevant for people with values or perspectives that differ substantially from major funders.
Hey, I really like this reframing! I'm not sure what you meant to say in the second and third sentences though :/