In the day I would be reminded of those men and women,
Brave, setting up signals across vast distances,
Considering a nameless way of living, of almost unimagined values.
"and I was surprised to find I had ideas and perspectives that were unique/might not have surfaced in conversation had I not been there."
I think this is one of the reasons EAG (or other ways of informally conversing with regular EAs on EA-related things) can be extremely valuable for people. It lets you get epistemic and emotional feedback on how capable you are compared to a random EAG-sampled slice of the community. People who might have been underconfident (like you) update towards thinking they might be usefwl. That said, I think you're unusually capable, and that a lot of other people will update towards feeling like they're too dumb for EA.
But the value of increased confidence in people like you seems higher than the possible harm caused by people whose confidence drops. And there are reasons to expect online EA material to be a lot more intimidating due to being way more filtered for high-status (incl. smart), so exposure to low-filtered informal conversations at EAG probably increases confidence in people who haven't had a lot of low-filtered informal exposure yet (so if that describes you, reader, you should definitely consider going). Personally, I have a history of feeling like everything I discover and learn is just a form of "catching up" to what everyone else already knows, so talking to people about my ideas has increased my confidence a lot.
I'm really sorry I downvoted... I love the tone, I love the intention, but I worry about the message. Yes, less ambition and more love would probably make us suffer less. But I would rather try to encourage ambition by emphasising love for the ambitious failures. I'm trying to be ambitious, and I want to know that I can spiritually fall back on goodwill from the community because we all know we couldn't achieve anything without people willing to risk failing.
Some (controversial) reasons I'm surprisingly optimistic about the community:
1) It's already bubbly, both geographically and in its social networks, and it explores various paradigms.
2) The social status gradient is aligned with deference at the lower levels, and differentiation at the higher levels (to some extent). And as long as testimonial evidence/deference flows downwards (where it's likely to improve opinions), and the top level tries to avoid conforming, there's a status push towards exploration and confidence in independent impressions.
3) As long as deference is mostly unidirectional (downwards in social status) there are fewer loops/information cascades (less double-counting of evidence), and epistemic bubbles are harder to form and easier to pop (from above). And social status isn't that hard to attain for conscientious smart people, I think, so smart people aren't stuck at the bottom where their opinions are under-utilised? Idk.
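To make the double-counting point in 3) concrete, here's a toy sketch (my own made-up model, not Laputa, and much cruder): agents naively add the beliefs of whoever they defer to on top of their own evidence. With one-way deference, a single unit of evidence gets counted once and beliefs stabilise; with a two-way deference loop, the same evidence echoes back and forth and confidence inflates without bound.

```python
def pool_beliefs(rounds, defers_to, evidence):
    """Naive testimonial pooling: each round, every agent's belief becomes
    their own evidence plus the current beliefs of everyone they defer to.
    (Toy model: no discounting of testimony, so loops double-count.)"""
    belief = list(evidence)
    for _ in range(rounds):
        belief = [
            evidence[i] + sum(belief[j] for j in defers_to.get(i, []))
            for i in range(len(evidence))
        ]
    return belief

# Agent 0 observes 1 unit of (log-odds) evidence; agent 1 observes nothing.
one_way = pool_beliefs(10, {1: [0]}, [1.0, 0.0])         # 1 defers to 0 only
loop = pool_beliefs(10, {0: [1], 1: [0]}, [1.0, 0.0])    # mutual deference

print(one_way)  # evidence counted exactly once: [1.0, 1.0]
print(loop)     # same evidence re-counted every round: beliefs keep growing
```

Of course, real agents discount testimony somewhat, so the blow-up is less dramatic in practice, but the qualitative point stands: loops re-count evidence, one-way flows don't.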
Probably more should go here, but I forget. The community could definitely be better, and it's worth exploring how to optimise it (any clever norms we can spread about trust functions?), so I'm not sure we disagree except you happen to look like the grumpy one because I started the chain by speaking optimistically. :3
Thanks<3
Well, I've been thinking about these things precisely in order to make top-level posts, but then my priorities shifted because I ended up thinking that the EA epistemic community was doing fine without my interventions, and all that remained in my toolkit was cool ideas that weren't necessarily usefwl. I might reconsider it. :p
Keep in mind that in my own framework, I'm an Explorer, not an Expert. Not safe to defer to.
This question is studied in veritistic social epistemology. I recommend playing around with the Laputa network epistemology simulation to get some practical feedback on how its model is similar and dissimilar to your model of how the real-world community behaves. Here are some of my independent impressions on the topic:
Honestly, my take on the EA community is that it's surprisingly healthy. It wouldn't be terrible if EA kept doing whatever it's doing right now. I think it ranks unreasonably high in the possible ways of arranging epistemic communities. :p
I like this term for it! It's better than calling it the "Daddy-is-a-doctor problem".
Oh. It does mitigate most of the problem as far as I can tell. Good point Oo
Oh, this is wonderfwl. But to be clear, Occlumency wouldn't be the front page. It would be one of several ways to sort posts when you go to /all posts. Oldie goldies is a great idea for the frontpage, though!
I have no idea how feasible it is. But I made this post because I personally would like to search for posts like that to patch the most important holes in my EA Forum knowledge. Thanks for all the forum work you've done, the result is already amazing! <3
FWIW, I think personal information is very relevant to giving decisions, but I also think the meme "EA is no longer funding-constrained" perhaps lacks nuance that's especially relevant for people with values or perspectives that differ substantially from major funders.
Relevant: https://forum.effectivealtruism.org/posts/GFkzLx7uKSK8zaBE3/we-need-more-nuance-regarding-funding-gaps