About 4 months ago I applied for funding from EAIF to do community building in Romania and got rejected. I now realize that my application was indeed a bit premature, since I am relatively new to EA. I have talked to many people since then and got a lot of great feedback that helped me redefine and improve my plans. Recently, however, I learned of a couple of factors that counted against my application that I'm struggling a bit to understand, and I would like your opinion on them.

Disclaimer: For the sake of brevity, I will use language in this post that some might find a bit blunt. I want to make it clear that the people who gave me the feedback were careful to be kind and polite with their words and that I appreciate their honesty and care and hold no personal grudges. I am not posting this here to accuse anyone, but to hear more diverse opinions in hopes that this will help me figure out more clearly how I can improve.

One of the factors that counted against my application was my blog, which was considered to be “too edgy” and to make me look like a "naive utilitarian". This is a bit surprising to me because I have received positive feedback about my blog from other EAs, so I'm wondering if this opinion is really universal. You can find my blog here, and the article that was viewed as most problematic was this one:

UPDATE: I have now updated the article to make it less graphic. You can find the new version here.

To be clear, I argue in favor of meat offsets but against rape offsets, and I only talk about rape at all because the main argument used by vegans so far against my defense of meat offsets has been the analogy with rape. I am perfectly aware that this is a sensitive topic and I wouldn't have brought it up gratuitously just to be edgy. Perhaps I should have made this more explicit. The trigger warning was added later in response to Tobias Leenaert's comments (which were otherwise supportive of my main thesis).

In case you agree that this article is too "edgy", I would be curious to understand your arguments for supporting this view so that I can make better decisions in the future.

  • Is it the fact that I use the word "rape" right in the subtitle? If it were only brought up further into the text, would it have been OK?
  • Is it the fact that I respond to the rape analogy at all? Should I have used a less sensitive example, like maybe "murder offset"?

Another article that caused an uproar in some secular humanist groups was this one:

This article wasn’t explicitly mentioned in the feedback, however, so I won’t focus on it here. I just thought a few concrete examples might give people a better feel for the level of controversy on my blog. To be clear though, if I consider my last 10 articles, I can’t think of any really controversial ones besides these two.

Regarding the perception that my blog promotes “naive utilitarian” views, I am really curious to understand what could prompt this consideration. I've written an entire series of posts explaining my position on utilitarianism and criticizing most of the views that are generally framed as "naive". If you spot anything specific in this article or any other that makes you feel that my view on it is naive, please do let me know because I would really like to address it.

After attending a few conferences and completing the virtual programs, I am now more aware that EAs tend to be very academic and private about their discussions. Most content is posted in obscure specialized forums/blogs like LessWrong, Slate Star Codex, EA Forum, etc. The language is generally quite technical and the content seems to mostly target other members of the community rather than people who have never heard of EA. It is a very different world from that of atheist bloggers, YouTubers, etc, which I'm more familiar with and might have influenced me.

My articles are probably somewhere in between these two worlds. Most of them target the general public and I roam in many different circles. If anything, most of the feedback I get is that my language is too complex and academic. So yeah, I still have a lot of progress to make in terms of defining exactly what my target audience is and how technical I want to be, but at this point I am more worried about sounding too technical than sounding too informal or simplified. This has all made me wonder:

  • Do I sound like a naive utilitarian to EAs because of my unusually informal and non-academic style? Or is it for some other reason?
  • Is it bad for group organizers to have their own blogs and speak publicly about controversial topics?
  • Should group organizers be more low profile and avoid expressing their opinion on divisive issues unrelated to EA?
  • Should EAs avoid having their public persona associated with other, unrelated and potentially controversial causes or lifestyles such as LGBT+, polyamory, sex positivity, decriminalization of drugs and sex-work, etc? Or should EAs have the freedom to express themselves openly about these issues? What should be an ideal trade-off?
  • In order to become a better and less controversial community organizer, is it sufficient if I tone down the level of controversy in my articles from now on? Or is it too much of a liability to have these articles still listed on my blog at all?

I understand that being a full-time group organizer on an EAIF grant brings certain responsibilities and requires some diplomacy and PR skills, and I will be the first to admit that I've had my edgy atheist phase, but this was a long time ago and I am more than willing to tone down the provocativeness of my articles if this is important. I have been doing that for a long time anyway. In fact, the edginess of the atheist movement is one of the very factors that made me gravitate more towards EA in the last few years.

I do think it's important, however, to balance diplomacy with intellectual courage. One of the things that attracted me to the EA movement was the boldness of philosophers such as Peter Singer who, as you probably know, co-founded the Journal of Controversial Ideas and defends unpopular views such as the euthanasia of severely disabled newborns. I feel this commitment to freethinking and intellectual openness is an extremely important value for any movement that hopes to make progress rather than stagnate and become dogmatic.

If you think I'm erring too much on the side of openness at the expense of caution and PR concerns, please feel free to share any tips you might have to help me make better trade-offs in the future. Thank you :)

Comments

I’m definitely not the best person to give feedback on this, but I’ll just briefly share a few thoughts:

  1. I’ve heard that EA grant makers often have relatively little time to review grant applications. This may or may not be true for EAIF, but supposing it is, even yellow flags like that offset article might cause a reviewer to quickly become pessimistic about providing grants (for some of the following reasons).
  2. I would have recommended not using the example of rape; murder offsets probably would have been a better alternative. I only skimmed the post, but it really didn’t help that towards the beginning you make the point, phrased in a seemingly intentionally controversial way, about “[Sometimes rape is permissible… you probably agree deep down. That is, if it is to prevent more rape.]” This ordering (saying “you probably agree” before clarifying “if it were in some twisted trolley problem scenario”) and phrasing (e.g., “deep down…”) are needlessly controversy-inviting. To be honest, to me these kinds of details genuinely do reflect some lack of perspective/room-reading or rhetorical finesse, regardless of whether you ultimately oppose the idea of rape offsets. (It also very much gives me flashbacks to the infamous Robin Hanson post, which really hurt his reputation and reflected a similar lack of perspective…) This may not be such a problem if I am personally evaluating your character, but:
  3. Grant makers are probably justified in being cautious about downside risks, including when it comes to optics risks. “EA grant makers fund writer of blog that callously discusses ‘rape offsets’” might be a very unfair social media characterization, but fairness doesn’t really matter, and I can’t be confident it won’t get pulled into some broader narrative attacking EA overall, however fairly or unfairly. (Speaking as someone who’s never analyzed grant applications) I suspect you would have to have a really good case for potential upside to make it worth spending a few extra hours analyzing those optics risks, and in the end there may (or may not?) be plenty of other people to fund instead.

As for your overall blog, I haven’t read it, but I wouldn’t be surprised if it is otherwise good, and I’m glad to see a blog discussing moral issues. But rape is a topic that needs to be treated with a lot of care and caution, and probably should be avoided when it is just being used to make a point separate from rape.

There is a very severe potential downside if many funders think in this manner, which is that it will discourage people from writing about potentially important ideas. I’m strongly in favor of putting more effort and funding into PR (disclaimer that I’ve worked in media relations in the past), but if we refuse to fund people with diverse, potentially provocative takes, that’s not a worthwhile trade-off, imo. I want EA to be capable of supporting an intellectual environment where we can ask about and discuss hard questions publicly without worrying about being excluded as a result. If that means bad-faith journalists have slightly more material to work with, then so be it.

Hi Harrison, thanks for the detailed feedback. I take your point and I will try to edit the article to make it less "shocking", since there seems to be consensus that I went a bit too far. There are a couple of considerations though that I think might be relevant for me to raise. I'm not trying to excuse myself, but I do think they provide more context and might help people understand why I wrote the article in this style.

  1. The only reason why I brought up rape in the article at all was that the vegans who opposed my argument for meat offsets explicitly used the "rape argument": If meat offsets are permissible then rape offsets are permissible. Rape offsets can't possibly be permissible because rape is too shocking and disgusting. Therefore, meat offsets are not permissible. I could have used another example rather than rape, but the whole point of using the rape analogy is that rape is shocking and evokes moral disgust in most of us. I felt that using another example would be dishonest, and I was afraid it would look like I was evading their argument. Don't you think this is a reasonable concern? How could I avoid the "rape" example without looking like I'm evading their argument?
  2. Sometimes, when discussing moral philosophy, I find it important to evoke some degree of shock or disgust. Again, I write to the general public, and there are many people in the lay audience who casually espouse relativistic views, but if I press them hard enough with sufficiently gory examples, they agree that certain things are "objectively bad". But I guess I have now learned to avoid gender-based violence in my examples. I do think "murder" is not good enough though. Would "torture" or "child decapitation" be OK? Or still too much?
  3. I don't have an official diagnosis but I've been called autistic many times in my life and after reading about the topic I concluded that people might be on to something. I'm a typical nerdy IT guy who struggled with social skills for most of my youth, and I've never been particularly good at reading how people feel, predicting how they're gonna react to certain things, etc. With time, however, I've learned how to mask my weirdness by following certain simple algorithms and I now have a very active social life and I would say I get into conflict less often than the average person. I'm just saying this because I've noticed that people often assume that I use shocking language because I am callous and insensitive and don't care about how people feel, but the truth is that I do care about how they feel, I just fail to predict how they will feel. Sure, at the end of the day the harm caused might be the same, but I do hope people will see this as an attenuating circumstance because a person whose heart is in the right place is more likely to improve their behavior in the future in response to feedback.
  4. Another factor that I think is tricky is cultural differences. So far my experience of EA is that the cultural norms are very much dictated by the US/UK, because this is where most people are and it's only natural that these cultural norms come to dominate. In progressive circles in the US/UK it seems that it has become mainstream to believe that people should be protected from any potential discomfort, that words can be violence, etc. Jonathan Haidt calls this the coddling of the American mind. I don't want to argue here that people should be more resilient. I haven't read enough about this so I prefer to refrain from judgement and remain agnostic. But my point is that in other cultures this phenomenon is not so mainstream. In Romania for example people are pretty callous comparatively and there is some tolerance for this in the culture, even in progressive circles. My girlfriend for example read the article and didn't say anything about it being too graphic or callous. Sure, American culture influences both Brazil (where I'm originally from) and Romania (where I've been living for 8 years), but I think it's a bit unfair to expect people from everywhere to flawlessly respect the sensibilities of the anglosphere without ever making any mistake.
  5. Besides, there is the even more complicated issue of subcultures. I'm into extreme metal and gory horror movies, and people in these communities have a different relationship to violence. We tend to talk about it callously and the less triggerable you are the more metal you are. I've also been active in the secular humanist movement, where many people identify as "free-speech fundamentalists", and there is more tolerance for "offensive content" than in other more mainstream progressive movements. Because of the strong rationalist component of EA, I've always assumed it overlapped a lot with secular humanism, but lately I am realizing that this overlap is a bit smaller than I assumed it to be.

Again, I'm not saying these things to excuse myself; I appreciate the feedback and I will adjust my behavior in response to it. At the end of the day I will always have to adopt one imperfect set of cultural norms or another, so if I want to get more involved in EA I might as well adopt EA norms. I guess I just felt the need to explain where I'm coming from so that people don't leave with the impression that I'm a callous person who doesn't care about how others feel. I made a mistake: I failed to predict that my article would be seen as too callous by EAs, and hopefully with this new data point I can recalibrate my algorithms and minimize the chances that I will make the same mistake in the future. I cannot promise I will never make a mistake again, but I still hope my reputation won't be forever damaged by one honest mistake.

PS: What is the infamous Robin Hanson post? I'm curious :)

I did not find your blog post about moral offsetting offensive or insensitive. Your explanations of the evolutionary reasons why we have such visceral reactions to rape, to me, addressed the moral outrageousness with which rape is associated. Also, you clearly stated your own inability to be friends with a rapist. Philosophical discussions are probably better when they include sensitive issues, so they can have more of an impact on our thought processes.

Also, there was another post on here in which it was mentioned that a community organizer could come off as cult-like, dogmatic, or like they're reading from a script. So, for that reason, it's probably better to not try to censor yourself.

Regarding the content of the post about moral offsetting itself:

The problem I have with the thought experiments in which rape could lead to less rape overall is that there shouldn't be situations where such hard choices are presented. While it is true that ideally we shouldn't have to face such hard decisions, I am probably underestimating the power of situations and overestimating my and others' ability to act.

As someone who is turned off by the idea of moral offsetting, I found that your zombie-rapists thought experiment helped me see the utility of offsetting a bit more clearly. As you said, offsetting the harm to animals is not ideal, but if it is more effective at getting us to a reality that is more ideal for animals, then it is valuable.
