I ran the Forum for three years. I'm no longer an active moderator, but I still provide advice to the team in some cases.
I'm a Communications Officer at Open Philanthropy. Before that, I worked at CEA, on the Forum and other projects. I also started Yale's student EA group, and I spend a few hours a month advising a small, un-Googleable private foundation that makes EA-adjacent donations.
Outside of EA, I play Magic: The Gathering at a semi-professional level and donate half my winnings (more than $50k in 2020) to charity.
Before my first job in EA, I was a tutor, a freelance writer, a tech support agent, and a music journalist. I blog, and keep a public list of my donations, at aarongertler.net.
...they would be less likely to hire someone on the basis of their religion because it would imply they were less good at their job.
Some feedback on this post: this part was confusing. I assume that what this person said was something like "I think a religious person would probably be harder to work with because of X", or "I think a religious person would be less likely to have trait Y", rather than "religious people are worse at jobs".
The specifics aren't very important here, since the reasons not to discriminate against people for traits unrelated to their qualifications[1] are collectively overwhelming. But the lack of specifics made me think to myself: "Is that actually what they said?" It also made it hard to understand the context of your counterarguments, since there were no specific claims for them to address.
Religion can sometimes be a relevant qualification, of course; if my childhood synagogue hired a Christian rabbi, I'd have some questions. But I assume that's not what the anecdotal person was thinking about.
I don't think these things are "lumped in" with each other as often as it might seem. Within EA, people typically use "global health and development" as an umbrella term when they want to cover work in both areas; it's understandable that this would look like conflating the two.
But "global health" and "global development" are often discussed separately as well.
(Confusingly, much of the development discussion happens within the progress studies community, which overlaps heavily with EA in terms of both ideas and the people involved, but has its own publications, Twitter threads, and so on. This means the conversations often involve people in EA but happen outside EA spaces.)
I still wish there were more discussion of growth in EA and EA-adjacent spaces, relative to conversations about health topics, but I think the gap is less wide than it appears.
As with many statements people make about people in EA, I think you've identified something that is true about humans in general.
I think it applies less to the average person in EA than to the average human. I think people in EA are more morally scrupulous and prone to feeling guilty/insufficiently moral than the average person, and I suspect you would agree with me given other things you've written. (But let me know if that's wrong!)
I find statements of the type "sometimes we are X" to be largely uninformative when "X" is a part of human nature.
Compare "sometimes people in EA are materialistic and want to buy too many nice things for themselves; EA has a materialism problem" — I'm sure there are people in EA like this, and perhaps this condition could be a "problem" for them. But I don't think people would learn very much about EA from the aforementioned statements, because they are also true of almost every group of people.
FYI, Open Philanthropy recently regranted $40 million to the Gates Foundation's TB work, so I wouldn't say that EA "doesn't recommend" TB interventions.
However, I don't know whether there are GiveWell-competitive options for individual donors in TB, or whether the people who chose the OP regrant would recommend Gates Philanthropy Partners as an option for individuals. (I don't see a way to target donations within GPP, so you may just be donating to their entire portfolio, which is presumably less cost-effective than their TB-focused work on average.)
I work at Open Phil, but this comment doesn't necessarily reflect Open Phil's views.
Thanks for writing this! I've been wondering about these numbers for a while, and it's nice to see that retention is higher than I feared (for such a weighty commitment with no serious enforcement).
This is also a good reminder for me to update the last few years of my GWWC records next time I donate, since I've become part of the problem :-(
I could have created an entire second project. Instead, I spent a lot of time polishing some supplemental materials that almost no one read.
This is a perfect description of what happened when I tried to write a blog for a few years. I spent endless time agonizing over minor wording choices when I could have been writing new content, sharing my content in more places, or doing any number of more productive things. I hadn't connected that failure to "perfectionism" before, but reading this post in 2016 would have been really beneficial to me.
That's exactly what I mean!
"I think religious people are less likely to have trait Y" was one form I thought that comment might have taken, and it turns out "trait Y" was "intelligence".
Now that I've heard this detail, it's easier to understand what misguided ideas were going through the speaker's mind. I'm less confused now.
"Religious people are bad at jobs" sounds to me like "chewing gum is dangerous" — my reaction is "What are you talking about? That sounds wrong, and also... huh?"
By comparison, "religious people are less intelligent" sounds to me like "chewing gum is poisonous" — it's easier to parse that statement, and compare it to my experience of the world, because it's more specific.
*****
As an aside: I spend a lot of time on Twitter. My former job was running the EA Forum. I would never assume that any group has zero members who say offensive things, including EA.