Aaron Gertler

I moderate the Forum, and I'd be happy to review your next post.

I'm a full-time content writer at CEA. I started Yale's student EA group, and I've also volunteered for CFAR and MIRI. I spend a few hours a month advising a small, un-Googleable private foundation that makes EA-adjacent donations. I also play Magic: the Gathering on a semi-professional level and donate half my winnings (more than $50k in 2020) to charity.

Before joining CEA, I was a tutor, a freelance writer, a tech support agent, and a music journalist. I blog, and keep a public list of my donations, at aarongertler.net.

Sequences

Effective Altruism Handbook: Motivation Series

Comments

[Podcast] Rob Wiblin on self-improvement and research ethics

I talked about this with JP (the Forum's lead developer). We think the alternatives available to authors, combined with the table of contents offering an easy way to skip sections, make this feature relatively low-value compared to other things we could work on.

But in the abstract, it's a good idea, and I like what it does for searchability if it lets people avoid linking to external docs. We'll keep an eye on the idea for later.

Hilary Greaves: The collectivist critique of the EA movement

I agree that there are relatively few people in EA looking at anything I'd consider "the impact of collective action," but I also think this makes sense given the reality of EA's size and influence. We are a few thousand people (perhaps 10,000) spread across multiple continents. Working on advice for individuals (or even your nearest government) seems much more likely to bear fruit than figuring out which actions are most promising for large groups of people to take together.

I would be interested to see more work on questions like "what are the best predictors of a viral Change.org petition?", where there's a chance of leveraging large groups who aren't connected to EA at all.

The work that a few EA-aligned people are doing to try to influence a parliamentary vote (using a relatively novel approach) may be of some interest to you, as might this successful ballot initiative, which involved some degree of public advocacy.

vaidehi_agarwalla's Shortform

I don't think the Forum is likely to serve as a good "group discussion platform" at any point in the near future. This isn't about culture so much as form; we don't have Slack's "infinite continuous thread about one topic" feature, which is also present on Facebook and Discord, and that seems like the natural form for an ongoing discussion to take. You can configure many bits of the Forum to feel more discussion-like (e.g. setting all the comment threads you see to be "newest first"), but it feels like a round peg/square hole situation.

On the other hand, Slack seems reasonable for this!

Megaproject Management

Thanks for sharing this! I have a few large projects on the horizon, and while none of them are within five orders of magnitude of the examples mentioned here, I feel like I can still identify some snags to watch out for (in particular, doing more research to ensure the end results won't be "underutilized").

Ending a news habit

I've used a lot of different tools to separate myself from websites I want to spend less time on. One recent "tool" that's been especially helpful is actually a blog post: 7 Months Without Junk Media.

Excerpts:

For the last 7 months I have disconnected from news, social media, videogames, forums, web surfing, and streaming video. 

The positive effects are cumulative. I feel like I have more time. I am more creative. I even feel less conformist. I invent new ideas instead of parroting the news story of the day. This effect is especially cumulative. I get more sensitive to how media brainwashes me as the months of abstinence accumulate.

[...]

Are you afraid that by abstaining from news you might miss out on important events?

No. In the last 7 months, exactly two important newsworthy events happened: COVID-19 and Black Lives Matter. I did not miss out on either of them [...] If anything, ignoring the news has focused my attention on the local action it is within my power to do, such as volunteering for a local neighborhood watch.

I read this post once per week. It hasn't stopped me from using "junk media" completely (that was never my goal), but it has led to my spending less time on Twitter and games, and more time on books. (At least, those things have happened, and I think the post is partly responsible.)

Ends vs. Means: To what extent should we be willing to sacrifice nuance and epistemic humility when communicating with people outside EA?

This is a good question with no clear answer. Overall, I tend to be more pro-marketing than what I perceive as the EA average.

However, I think there are a few good reasons to lean in the direction of "keeping more nuance and epistemic humility" that we might underappreciate (given the more obvious benefits of the other approach):

  1. Many of the world's most successful/thoughtful people will be unusually attracted to movements that are more nuanced and humble. For example, I wouldn't be surprised if Dustin Moskovitz were drawn to GiveWell partly because they weren't as "salesy" as other charities (though I don't know the details of this interaction, so maybe I'm wrong). People with a lot of resources or potential are constantly being approached by people who want to sell them on a new idea; a non-salesy EA could be very appealing in this context.
  2. If some groups within EA try to be more salesy, it could spark internal competition. Right now, I think EA does a pretty good job of being a neutral community, where someone who wants to contribute will get questions like "what are you interested in?", rather than lots of pitches on particular organizations/causes. If marketing becomes more prevalent, we might lose some of that collaborative atmosphere.
  3. Nuance and humility are also marketable! One type of pitch for EA that I think is undervalued: We are the social movement that values honesty more than any other movement. Political parties lie. Populist movements lie. We don't lie, even if the truth isn't exciting.
  4. EA doesn't actually need giant walls of statistics to market itself. As you noted, the Singer/Lindauer argument doesn't discuss statistics or prioritization. But it still does a good job of making one of the central arguments of EA, in a way that can be extended to many other causes. Even if this particular empathetic approach wouldn't work as well for longtermist orgs, pretty much every EA org is driven by a simple moral intuition that can be approached in a simple way, as Singer/Lindauer did with "what if it were your child?"

SiebeRozendal's Shortform

Aww yes, people writing about their life and career experiences! Posts of this type seem to have one of the best ratios of "how useful people find this" to "how hard it is to write" -- you share things you know better than anyone else, and other people can frequently draw lessons from them.

The ten most-viewed posts of 2020

Rather than chasing down a bunch of stats to put into a comment, the best step for us seems to be putting up a "statistics" page that we occasionally update. I'm planning to add something like this in the next ~2 months.

Privacy as a Blind Spot: Are There Long Term Harms in Using Facebook, Google, Slack etc.?

If you came to believe that the risk of data loss through admin error on a self-hosted system were lower than the breach-risk at Google, would that change your view on the convenience-security trade-off?

I don't think it's about the total likelihood of an event so much as the expected impact of that event. And because I have very weak priors about the likelihood of either event, getting any new information would probably change my view about the trade-off in some way.

But changing my view on the trade-off looks more like "I now think EA funders should be open to spending $X per year on this rather than $Y" or "I now think groups with risk profile X should be willing to switch even if their activity drops 10%", rather than coming to believe something more sweeping and conclusive about the entire topic.
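
To make the distinction a bit more concrete, here's a purely illustrative sketch (the symbols are mine, not anything from the original post): for each setup, the thing I care about is roughly

$$\text{Expected harm} \approx P(\text{incident}) \times \text{Cost}(\text{incident})$$

so a setup with a higher chance of an incident can still come out ahead if the typical cost of that incident is much lower, and vice versa.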

The Folly of "EAs Should"

I don't disagree with elements of this stance -- this kind of career advice is probably strongly positive-EV to share in some form with the average medical student. 

But I think there's a strong argument for at least trying to frame advice carefully if you have a good idea of how someone will react to different frames. And messages like "tell people X even if they don't like hearing it" can obscure the importance of framing. I think that how advice sounds to someone can often be decisive in how they react, even if the most important thing is still giving them good advice.
