Mini EA Forum Update
We’ve updated our new user onboarding flow! You can see more details on GitHub here.
In addition to making it way prettier, we’re trying out adding some optional steps, including:
1. You can select topics you’re interested in, to make your frontpage more relevant to you.
   1. You can also click the “Customize feed” button on the frontpage - see details here.
2. You can choose some authors to subscribe to. You will be notified when an author you are subscribed to publishes a post.
   1. You can also subscribe from any user’s profile page.
3. You’re prompted to fill in some profile information to give other users context on who you are.
   1. You can also edit your profile here.
I hope that these additional optional steps help new users get more out of the Forum. We will continue to iterate on this flow based on usage and feedback - feel free to reply to this quick take with your thoughts!
My biggest takeaway from EA so far has been that the difference in expected moral value between the consensus choice and its alternative(s) can be vastly larger than I had previously thought.
I used to think that "common sense" would get me far when it came to moral choices. I even thought that the difference in expected moral value between the "common sense" choice and any alternatives was negligible, so much so that I made a deliberate decision not to invest time into thinking about my own values or ethics.
EA radically changed my opinion. I now hold the view that the consensus view is frequently wrong, even when the stakes are high, and that it is possible to make dramatically better moral decisions by approaching them with rationality and a better-informed ethical framework.
Sometimes I come across people who are familiar with EA ideas but don't particularly engage with them or the community. I often feel surprised, and I think the above is a big part of why. Perhaps more emphasis could be placed on this expected moral value gap in EA outreach?
My overall impression is that the CEA community health team (CHT from now on) is well intentioned but sometimes understaffed and other times downright incompetent. It's hard for me to be impartial here, and I understand that their failures are more salient to me than their successes. Yet I endorse the need for change, at the very least including 1) removing people from the CHT who serve as advisors to any EA funds or hold other positions with conflicts of interest, 2) hiring credentialed HR and mental health specialists, 3) publicly clarifying the team's role and mandate.
My impression is that the most valuable function the CHT provides is supporting community building teams across the world, from advising community builders to preventing problematic community builders from receiving support. If this is the case, I think it would be best to rebrand the CHT as a CEA HR department, and for CEA to properly hire the community builders who are now supported as grantees, an arrangement one could argue amounts to employee misclassification.
I would not be comfortable discussing these issues openly out of concern for the people affected, but here are some horror stories:
1. A CHT staff member pressured a community builder to put up with and include a community member with whom they weren't comfortable interacting.
2. A CHT staff member pressured a community builder not to press charges against a community member by whom they felt harassed.
3. After the police put a restraining order in place in this last case, the CHT refused to liaise with the EA Global team to deny access to the restrained person, even knowing that the affected community builder would be attending the event.
4. My overall sense is that the CHT is not very mindful of the needs of community builders in other contexts. Two very promising professionals I've mentored have disengaged from EA, and rejected a grant, in large part because of how they were treated by the CHT.
5. My impression is that the CHT staff underm
GET AMBITIOUS SLOWLY
Most approaches to increasing agency and ambition focus on telling people to dream big and not be intimidated by large projects. I'm sure that works for some people, but it feels really flat for me, and I consider myself one of the lucky ones. The worst-case scenario is that big inspiring speeches get you really pumped up to Solve Big Problems, but you lack the tools to meaningfully follow up.
Faced with big dreams but unclear ability to enact them, people have a few options.
* try anyway and fail badly, probably too badly for it to even be an educational failure.
* fake it, probably without knowing they're doing so
* learned helplessness, possibly systemic depression
* be heading towards failure, but too many people are counting on you, so someone steps in and rescues you. They consider this net negative and prefer the world where you'd never started to the one where they had to rescue you.
* discover more skills than they knew they had; feel great, accomplish great things, learn a lot.
The first three are all very costly, especially if you repeat the cycle a few times.
My preferred version is ambition snowball or "get ambitious slowly". Pick something big enough to feel challenging but not much more, accomplish it, and then use the skills and confidence you learn to tackle a marginally bigger challenge. This takes longer than immediately going for the brass ring and succeeding on the first try, but I claim it is ultimately faster and has higher EV than repeated failures.
I claim EA's emphasis on doing The Most Important Thing has pushed people into premature ambition, and everyone is poorer for it. Certainly I would have been better off hearing this 10 years ago.
What size of challenge is the right size? I've thought about this a lot and don't have a great answer. You can see how things feel in your gut, or compare to past projects. My few rules:
* stick to problems where failure will at least be informative. If you can't track reality well eno
People often propose HR departments as antidotes to some of the harm that's done by inappropriate working practices in EA. The usual response is that small organisations often have quite informal HR arrangements even outside of EA, which does seem kinda true.
Another response is that it sometimes seems like people have an overly rosy picture of HR departments. If your corporate culture sucks then your HR department will defend and uphold your sucky corporate culture. Abusive employers will use their HR departments as an instrument of their abuse.
Perhaps the idea is to bring more mainstream HR practices or expertise into EA employers, rather than merely going through the motions of creating the department. But I think mainstream HR comes primarily from the private sector and is primarily about protecting the employer, often against the employee. They often cast themselves in a role of being there to help you, but a common piece of folk wisdom is "HR is not your friend". I think frankly that a lot of mainstream HR culture is at worst dishonest and manipulative, and I'd be really sad to see us uncritically importing more of that.
1. CEA has a public dashboard which contains metrics on most of our projects!
2. We’ve just launched a few updates:
   1. The EA Forum, EAG, and EAGx data now refresh daily.
      1. This is most relevant for the Forum data, to help answer questions like “what is happening with the Forum ~right now”.
      2. All other sections are updated on a ~monthly cadence.
   2. We’ve added data on our community building grants program to the Groups section.
Feedback is appreciated as always!
Thank you! And a few reflections on recognition.
A few days ago, while I sat at the desk in my summer cabin, an unexpected storm swept in. It was a really bad storm, and when it subsided, a big tree had fallen, blocking the road to the little neighborhood where the cabin lies. Some of my neighbors, who are quite senior, needed to get past the tree and could not move it, so I decided to help. I went out with a chainsaw and quad bike, and soon the road was clear.
The entire exercise took me about two hours, and it was an overall pretty pleasurable experience, getting a break from work and being out in nature working with my body. However, afterward, I was showered with gratitude, as if I had done something truly praiseworthy. Several neighbors came to thank me, telling me what a very nice young man I was, some even brought small gifts, and I heard people talking about what I had done for days afterward.
This got me thinking.
My first thought: These are very nice people, and it is obviously kind of them to come and thank me. But it seems a little off - when I tell them what I do every day, what I dedicate my life to, most of them nod politely and move on to talk about the weather. It seems bad and unfair that when we do something immediately visible and easy to grasp, recognition and gratitude come pouring in, but when we engage in work that is indirect, more abstract, and potentially with far-reaching consequences, the acknowledgment isn’t as forthcoming.
My second thought: But wait a minute. Here I am, sitting brooding over the behavior of others. Am I any better? What have I done to express gratitude to all of the amazing people out there in the world working on what they think is the most important thing without getting any recognition? Not much.
My third thought: I should do something about this.
To all of you, from the bottom of my heart - thanks! The tasks you dedicate yourselves to might not garner instant applause or make the evening news. You might n
Application forms for EA jobs often give an estimate for how long you should expect it to take; often these estimates are *wildly* too low, in my experience (and others I know have said this too). This is bad because it makes the estimates unhelpful for planning, and because it probably makes people feel bad about themselves, or worry that they're unusually slow, when they take longer than the estimate.
Imo, if something involves any sort of writing from scratch, you should expect applicants to take at least an hour, and possibly more. (For context, I've seen application forms which say 'this application should take 10 minutes' and more commonly ones estimating 20 minutes or 30 minutes).
It doesn’t take long to type 300 words if you already know what you’re going to say and don’t particularly care about polish (I wrote this post in less than an hour, probably). But job application questions (even ‘basic’ ones like ‘why do you want this job?’ and ‘why would you be a good fit?’) take more time. You may feel intuitively that you’d be a good fit for the job, but take a while to articulate why. You have to think about how your skills might help with the job, perhaps cross-referencing with the job description. And you have to express everything in appropriately formal and clear language.
Job applications are also very high-stakes, and many people find them difficult or ‘ugh-y’, which means applicants are likely to take longer to do them than they “should”, due to being stuck or procrastinating.
Maybe hirers give these time estimates because they don’t want applicants to spend too long on the first-stage form (for most of them, it won’t pay off, after all!). This respect for people’s time is laudable. But if someone really wants the job, they *will* feel motivated to put effort into the application form.
There’s a kind of coordination problem here too. Let's imagine there's an application for a job that I really want, and on the form it says 'this application should take you appr