Aaron Gertler

I moderate the Forum, and I'd be happy to review your next post.

I'm a full-time content writer at CEA. I started Yale's student EA group, and I've also volunteered for CFAR and MIRI. I spend a few hours a month advising a small, un-Googleable private foundation that makes EA-adjacent donations. I also play Magic: The Gathering on a semi-professional level and donate half my winnings (more than $50k in 2020) to charity.

Before joining CEA, I was a tutor, a freelance writer, a tech support agent, and a music journalist. I blog, and keep a public list of my donations, at aarongertler.net.

Sequences

The Farm Animal Welfare Newsletter
Replacing Guilt
Part 8: Putting it into Practice
Part 7: Increasing the Accuracy of Our Judgments
Part 6: Emerging Technologies
Part 5: Existential Risk
Part 4: Longtermism
Part 3: Expanding Our Compassion
Part 2: Differences in Impact

Comments

JJ Hepburn's Shortform

I thought this was a great Shortform post!

  • The book for hiring managers I've seen referenced most often is Who. If you're not sure what "resources" to look at, that's probably a good starting point.
  • "Apply too often rather than not often enough": I often tell people this, because:
    • Some people tend to underestimate their qualifications or suitability. You might be one of them!
    • (What JJ said about getting practice)
    • Even if you don't get the job, you might get a referral to other jobs if you do well during the process (I was hired this way, and I've helped at least one other person get hired this way)
    • EA-aligned orgs are generally quite open to feedback; if you find a specific process confusing or overly time-consuming, you can tell the org this; I think they'll be much more likely than most orgs to make changes in response (improving the experience for other applicants)
  • I'm not sure about the "application drafting" approach, but I recommend something similar: If a job interests you, look at an org's website (or LinkedIn) to find people who have that job or similar jobs. Look at what they did earlier in their careers. Consider sending a brief, polite email with a question or two, or asking for a quick call. Sometimes, people will just give you great advice for free. 
    • And even if no one responds, you've still gotten a much better sense for how these career paths operate in the real world (which isn't always as restrictive as the stories we tell ourselves about getting a job).
Open Thread: June 2021

Welcome, Caleb! I'm always excited to see people with unusual specialties on the Forum; every bit of expertise matters.

Shouldn't 'Effective Altruism' be capitalized?

Given that the navigation bar text refers to an article with a capitalized title, I think its current capitalization is correct and consistent. It's possible that "action" in "Take action" should be capitalized, though; I'll give that some thought.

Shouldn't 'Effective Altruism' be capitalized?

I've made it official CEA policy that we always use lower case.

My reasoning:

  1. Like others have said here, I think of EA as a philosophy/system of thought, along the lines of "liberalism" or "utilitarianism", rather than as a formalized religion or political institution, like "Christianity" or "the Democratic Party".
  2. I agree that there are specific values and practices that are inherent to EA, but it's difficult to point to any one thing that would "qualify" a person or organization as officially being "part of EA" vs. not.
    1. There is such a thing as a "registered Democrat", but "registered EA community member" isn't really a thing. You can set up a profile on the EA Hub, but so can literally anyone; this doesn't confer any official privileges.
    2. See why I don't like the term "effective altruist" (including my reply to Michael Aird's comment, which pulls out the difference between that question and the capitalization question).
  3. Regarding institutions, I consider many organizations to be some degree of "EA-aligned" even if they have nothing whatsoever to do with our movement. I think of this alignment as a spectrum, rather than a binary thing where an org does or doesn't "count".
    1. For example, consider a global health charity that gets serious consideration from GiveWell but doesn't quite pass the bar to be a "Standout Charity". Do the charity's employees have "EA jobs"?
      1. Their work is aligned with EA's mission, and presumably involves what most people in EA would consider a "promising cause area". Given this, I see the "EA jobs" question as beside the point.

In your example of the person who uses Charity Navigator, I still think the binary distinction isn't helpful:

  • Is this person trying to do more good? Yes.
  • Is the strategy they're using likely to help them do substantially more good than they would otherwise? Probably not.
  • Is their personal philosophy generally in line with EA? Impossible to tell from this single statement.
  • Would they fit in well at an EA meetup? Depends on how open they are to considering different ideas.
  • Should they be hired for a role at a very EA-aligned organization? Depends on their skills and other elements of fit; their confusion about overhead is just one small fact about them. I wouldn't want them writing curricula about effective giving, but they might be a great accountant.

...and so on.

***

I got off-topic at the end there, but to return to my main point: 

I think that capitalizing "effective altruism" makes it seem more like a binary thing (you count or you don't, you have this identity or you don't) and less like a spectrum (many people are aligned with EA to some degree, but no one is a perfect exemplar of every EA principle). But given how complicated the above questions can get, I think "spectrum" is a better fit than "binary".

Buck's Shortform

I think that this may make sense / probably makes sense for receiving payment for book reviews. But I think I'd be opposed to discouraging people from just posting book summaries/reviews/notes in general unless they do this. 

Yep, agreed. If someone is creating e.g. an EAIF-funded book review, I want it to feel very "solid", like I can really trust what they're saying and what the author is saying. 

But I also want Forum users to feel comfortable writing less time-intensive content (like your book notes). That's why we encourage epistemic statuses, have Shortform as an option, etc.

(Though it helps if, even for a shorter set of notes, someone can add a note about their process. As an example: "Copying over the most interesting bits and my immediate impressions. I haven't fact-checked anything, looked for other perspectives, etc.")

Buck's Shortform

I worry sometimes that EAs aren’t sufficiently interested in learning facts about the world that aren’t directly related to EA stuff.

I share this concern, and I think a culture with more book reviews is a great way to achieve that (I've been happy to see all of Michael Aird's book summaries for that reason).

CEA briefly considered paying for book reviews (I was asked to write this review as a test of that idea). IIRC, the goal at the time was more about getting more engagement from people on the periphery of EA by creating EA-related content they'd find interesting for other reasons. But book reviews as a push toward levelling up more involved people // changing EA culture is a different angle, and one I like a lot.

One suggestion: I'd want the epistemic spot checks, or something similar, to be mandatory. Many interesting books fail the basic test of "is the author routinely saying true things?", and I think a good truth-oriented book review should check for that.

Open Thread: May 2021

This is a good comment! Upvoted for making a reasonable challenge to a point that often goes unchallenged.

There are trade-offs to honesty and cooperation, and sometimes those virtues won't be worth the loss of impact or potential risk. I suspect that Holden!2013 would endorse this; he may come off as fairly absolutist here, but I think you could imagine scenarios where he would, in fact, miss a family event to accomplish some work-related objective (e.g. if a billion-dollar grant were at stake).

I don't know how relevant this fact is to the Gates case, though.

While I don't have the time to respond point-by-point, I'll share some related thoughts:

  • My initial comment was meant to be descriptive rather than prescriptive: in my experience, most people in EA seem to be aligned with Holden's view. Whether they should be is a different question. 
    • I include myself in the list of those aligned, but like anyone, I have my own sense of what constitutes "standard", and my own rules for when a trade-off is worthwhile or when I've hit the limit of "trying". 
    • Still, I think I ascribe a higher value than most people to "EA being an unusually kind and honest community, even outside its direct impact".
  • I don't understand what would result from an analysis of "what types of unethical behavior could be condoned":
    • Whatever result someone comes up with, their view is unlikely to be widely adopted, even within EA (given differences in people's ethical standards)
    • In cases where someone behaves unethically within the EA community, there are so many small details we'll know about that trying to argue for any kind of general rule seems foolhardy. (Especially since "not condoning" can mean so many different things -- whether someone is fired, whether they speak at a given event, whether a given org decides to fund them...)
    • In cases outside EA (e.g. that of Gates), the opinion of some random people in EA has effectively no impact.

All in all, I'd rather replace questions like "should we condone person/behavior X?" with "should this person X be invited to speak at a conference?" or "should an organization still take grant money from a person who did X?" Or, in a broader sense, "is it acceptable to lie in a situation like X if the likely impact is Y?"

Family Empowerment Media: Results, reflections, and plans after 6+ months

As always, it's a real pleasure to read FEM's writeups. I hope to see another report in ~6 months as I'm figuring out my giving for the year :-)

Editing Festival: Results and Prizes

We definitely want to add wiki contribution data to user profiles (it's a natural accompaniment to the lists of users' posts and comments), though the timeline for that project isn't yet established.

Editing Festival: Results and Prizes

(For people who don't follow the link to the festival announcement post, it seems worth noting that these prizes are "In the form of donations to an EA Funds-eligible charity of your choice.")

I've added this detail to the post for clarity. Thanks!

I'll check in with EA Funds about whether they have any plans to add those options.
