Community
Posts about the EA community and projects that focus on the EA community

Quick takes

5 · 2d
GWWC Anniversary Week Update: We're in the middle of celebrating Giving What We Can and the 10% Pledge's 15th anniversary! Thanks to everyone who has posted their thoughts, pledge stories, or hopes for the future on social media so far and/or contributed to our EA Forum thread.

We've also been posting pledge-focused content on our blog all week (and a bit before) and wanted to highlight a couple of great posts to check out:
- The "Progressive Pledge" by Phillip Popien and Alana HF (a unique way to gradually increase your pledge percentage that takes into account the decreasing marginal utility of money)
- The Virtues of Virtue Signaling by Martin Jacobson (an in-depth look at public giving: why it's sometimes difficult or discouraged, and why maybe it shouldn't be)

Our Effective Giving Global Coordinator and Incubator Luke Moore also posted a great piece on how Peter Singer's ideas transformed his life!

Of course, don't let these more in-depth examples dissuade you from posting your quick thoughts on what the Pledge has meant to you; even just a few sentences is great! :) Can't wait to share the compilation of anniversary week posts and thoughts at the end of the week!
0 · 3d
Learning about EA has revealed two distinct drives within me, both aimed at the same outcome: belonging. On one hand, I want to share thoughts, hunches, and instincts based on little more than experience, in an attempt to start discussions and hear others' thoughts. On the other hand, I want my thoughts to be at least logical or rational enough that sharing them creates little friction for those receiving them. When trying to write for the EA Forum, it feels like I'm hosting a party for guests whose expectations I'm unfamiliar with. I don't want to out myself as not belonging, but I have to risk that in order to a) improve my thoughts and b) better find out where I belong. The desire to belong within EA seems like a me problem; instinct tells me it's less EA's job to make me feel welcome than it is my job to know myself with more clarity (and thus have more confidence in the value of my hunches and instincts, even if they do get downvoted to oblivion as I fear they might).
6 · 7d
Is there a maximum effective membership size for EA? @Joey 🔸 spoke at EAGx last night, and one of my biggest takeaways was the (maybe controversial) take that more projects should decline money. This resonates with my experience: constraint is a powerful driver of creativity, and less constraint does not necessarily produce more creativity (or positive output). Does the EA movement, in terms of number of people, have a similar dynamic within society? At what growth rate does expanding membership stop being optimal and become sub-optimal? Zillions of factors to consider, of course, but... something maybe fun to ponder.
2 · 7d
Compassion fatigue should be focused on less. I had it hammered into me during training as a crisis supporter, and I still burnt out. Now I train others, have seen it hammered into them, and still watch many of them burn out. I think we need to shift at least 60% of the focus on compassion fatigue to compassion satisfaction: the warm feeling you receive when you give something meaningful to someone. If you're 'doing good work', I think that feeling (and its absence) ought to be spoken about much more.
3 · 12d
People in EA end up optimizing for EA credentials so they can virtue signal to grantmakers, but grantmakers would probably prefer that people scope out non-EA opportunities, because that lets us introduce the concerns we have to people outside EA.
8 · 16d
"What is malevolence? On the nature, measurement, and distribution of dark traits" was posted two weeks ago (and I recommend it). The post discusses a questionnaire that tries to measure the levels of 'dark traits' in the respondent. I'm curious about the results[1] of EAs[2] on that questionnaire, if anyone wants to volunteer theirs. There are short and long versions (16 and 70 questions).
1. ^ Or responses to the questions themselves.
2. ^ I also posted the same quick take to LessWrong, asking about rationalists.
10 · 16d · 2
I think that, eventually, working on changing the EA introductory program will be important. It is an extremely good thing to do well, and I think it could be improved. I'm running a six-week version right now, and I'll see if I feel the same way at the end.
23 · 19d · 7
I think that EA outreach can be net positive in a lot of circumstances, but there is one version of it that always makes me cringe. That version is the targeting of really young people (for this quicktake, I will say anyone under 20). This would basically include any high school targeting and most early-stage college targeting. I think I do not like it for two reasons: 1) it feels a bit like targeting the young/naive in a way I wish we would not have to do, given the quality of our ideas, and 2) these folks are typically far from making a real impact, and there is lots of time for them to lose interest or get lost along the way. Interestingly, this stands in contrast to my personal experience—I found EA when I was in my early 20s and would have benefited significantly from hearing about it in my teenage years.

Posts in this space are about: Community · Effective altruism lifestyle