Calum

Recruiter @ GiveWell
98 karma · Working (0-5 years)
varia.substack.com

Bio

I'm a recruiter at GiveWell. Previously, I taught high school math at a charter school in Tennessee. I learned about EA in 2014 when I stumbled on Scott Alexander's blog.

Any writing on this account is personal unless clearly stated otherwise. If I answer questions in the comments on GiveWell job posts, you can assume I'm answering in my GiveWell capacity. If you'd like to see more of my personal writing, I write a Substack blog called Varia.

Posts
2

Comments
4

Hi Marna, to answer your questions (and sorry for the delay; I didn't see your post until today):

  • In the last two years, for junior hiring rounds across our teams, we've...
    • received in the range of thousands of applications per round. Application volume is usually somewhat lower for more senior roles.
    • made 3 to 5 hires per round. This isn't always apparent from our people page because new junior hires sometimes take on different job titles at the beginning of their employment or shortly thereafter.
  • We last hired Research Analysts in 2022, but the job duties and team placement of that role were different from those of the role of the same name advertised above. We last opened a Content Editor hiring round (the most similar role to this one) in 2023. From that round we made 4 hires, all of whom are on our staff today.
  • We don't plan to share information about passthrough rates for each stage of the hiring process, except to say that passthrough rates generally increase with each stage.

Hi Chinwe, yes! From the JD:

We may employ staff internationally on a case-by-case basis. We require that employees either work in time zones shared by the continental U.S. or, if working outside those time zones, be willing to work hours that are compatible with regular meetings scheduled around U.S. time zones.

We currently have many non-US employees, and we're happy to bring on more.

Bruno, it was really great to meet you at EAG!! Thank you for the work you're doing in Brazil :D 

To extend your comment about lower standards for EA criticism, I thought the remainder of Venkatasubramanian's quote was quite interesting:

"...Terminator, blah blah blah,’” Venkatasubramanian said. “I think it’s important to ask, what is the basis for these claims? What is the likelihood of these claims coming to pass? And how certain are we about all this?

The EA community has spilled heaps of words on each of those questions. It's interesting that the portrayal in the article is so off-base, because a few minutes of Googling and reading EA content could have disabused the reporter of the notion that EA has an unserious, careless bent toward long-term AI risk.

On the other hand, most of the popular press coverage of EA is negative. If you Google "effective altruism AI," the first result is this Wired article with a very negative take on EA and AI. There are a few top-level results from 80K and EA.org, but most of the remaining first-page results are skeptical or negative takes on EA.

So it could be that the reporter, the outlet, or both have a level of antipathy toward EA that precludes due diligence. Or they could be attempting basic due diligence but mainly reading sources that take a very negative view of EA.

Either way, EA's public image (specifically regarding AI) is not ideal. I like your suggestion about making a greater effort to visibly signal cooperativeness!