About three hours ago I posted a brief argument about why one of the US presidential candidates is strongly preferable to the other on five criteria of concern to EAs: AI governance, nuclear war prevention, climate change, pandemic preparedness, and overall concern for people living outside the United States. I concluded by urging readers who were American citizens to vote accordingly, and to encourage anyone they might influence to do the same. After about an hour the post had a karma of 23; shortly after that, it was removed from the front page and relegated to 'personal blogposts'. Not surprisingly, at that point it started to attract less attention.

Not having heard of 'personal blogposts', I checked the description, which suggested they were appropriate for 'topics that aren't closely related to EA', 'topics that are difficult to discuss rationally', or 'topics of interest to a small fraction of the forum's readers (e.g. local events)'. Frankly, I can't see how my blogpost fit any of these descriptors. It focused on the candidates' positions on issues of core concern to EAs, and the fact that it had a karma score of 23 after an hour shows it was clearly of interest to the message board's readership. The remaining possibility--unless there were other unstated criteria--is that it was judged to be on a topic that is 'difficult to discuss rationally'.

If so, I think that's a troubling commentary on EA, or on the moderators' conception of EA. My post was clearly partisan, but I don't think any reasonable observer would have called it a rant. This election will almost surely make more difference to most of the causes that EAs hold dear than any other event this year--perhaps any other event this decade. Shouldn't the EA community, of all communities, be able to discuss these issues in a reasonably rational manner? I'd be grateful for a response from the moderators justifying the decision to exclude the post from the front page.

You can read our politics policy here.

Political issues are clearly relevant to improving the world. However, in our experience, we’ve seen that partisan political discussion tends to have a strong polarizing effect on public forums; it consumes a lot of a community’s attention and can lead to emotionally charged arguments. Overall, we think the EA Forum will be healthier, and better-positioned to achieve its goals, if we limit the space given to political topics.

This is weird to me. There are so many instances of posts on this forum having a “strong polarizing effect… [consuming] a lot of the community’s attention, and [leading] to emotionally charged arguments.” The several posts about Nonlinear last year strike me as a glaring example of this.

US presidential candidates’ positions on EA issues are more important to EA--and to our ability to make progress on these issues--than niche interpersonal disputes affecting a handful of people. In short, posts about politics seem to be held to a higher standard than other posts. I do not think this double standard is conducive to healthy discourse, or that it better positions the EA community to achieve its goals.

I agree that this is inconsistent (it looks like Ben's Nonlinear post is on the front page). But my conclusion is that community drama should also be made less visible except to those who opt in, not vice versa. The separate section for community posts was a decent start.

No shade to the mods, but I'm just kind of bearish on mods' ability to fairly determine what issues are "difficult to discuss rationally," just because I think this is really hard and inevitably going to be subject to bias. (The lack of moderation around the Nonlinear posts, Manifest posts, Time article on sexual harassment, and so on makes me think this standard is hard to enforce consistently.) Accordingly, I would favor relying on community voting to determine what posts/comments are valuable and constructive, except in rare cases. (Obviously, this isn't a perfect solution either, but it at least moves away from the arbitrariness of the "difficult to discuss rationally" standard.)

This seems like a question of what the policy is, not of judgement about how to apply it, in my opinion.

The three examples you gave are obviously in the category of "controversial community drama that will draw a lot of attention and strong feelings", and I trust the mods' ability to notice this. The question is whether the default policy should be to make such things personal blog posts. I personally think this would be a good policy, and that anything in this category is difficult to discuss rationally. I do also consider the community pane a weaker form of low visibility, so there's something here already, but I would advocate for a stronger policy.

Another category is "anything about partisan US politics", which I don't think is that hard to identify, is clearly hard to discuss rationally, and which, in my opinion, it is reasonable to make less visible by policy.

I don't trust karma as a mechanism, because if a post is something that people have strong feelings about, and many of those feelings are positive (or at least righteous-anger-style feelings), then it will often get high karma. E.g., I think the Nonlinear posts got a ton of attention, were in my opinion quite unproductive and distracting, and got very high karma; if they had been less visible, I think that would have been good.

Upvoted, but I don't think one could develop and even-handedly enforce a rule on community-health disputes that didn't drive out content that (a) needed to be here, because it was very specifically related to this community or an adjacent one, and (b) called for action by this community. So I think those factors warrant treating community-health dispute content as frontpage content, even though it lets a lot of suboptimal content slip through.

I think you may have a point on "positions on EA issues" narrowly defined -- but that is going to be a tough boundary to enforce. Once someone moves to the implied conclusion of "vote for X," then commenters will understandably feel that all the reasons not to vote for X are fair commentary whether or not they involve "positions on EA issues." [ETA: I say narrowly defined because content about how so-and-so is a fascist, or mentally unstable, or what have you is not exactly in short supply. I have little reason to believe that anyone is going to change their minds about such things from reading discussions on the Forum.]

There's also a cost to having a bunch of partisan political content -- the vast majority of which would swing in one direction for the US -- showing up when people come to EA's flagship public square. We have to work with whoever wins, and tying ourselves to one team or the other more than has already happened poses some considerable costs. There is much, much less broader risk on community-health disputes like Nonlinear (one can simply choose not to read them).

Yeah, just to be clear, I am not arguing that the "topics that are difficult to discuss rationally" standard should be applied to posts about community events, but instead that there shouldn't be a carveout for political issues specifically. I don't think political issues are harder to discuss rationally or less important.

Could you provide examples of political discussions on the EA Forum that appear to have negatively impacted the forum’s environment or impaired its ability to achieve its objectives? While I find this plausible, I’d also expect the EA Forum to be one of the most conducive spaces online for constructive political discourse.

My understanding is that the forum’s primary goal is to support discussions relevant to effective altruism and facilitate the coordination of related projects. Given that politics is highly relevant to these aims, I believe there should be a strong(er) justification for any restrictions on political topics.

I think the repeated guilt-by-association posts, pointing out that someone in EA associated with someone who has some right-wing views, are pretty negative.

Which posts? (you don't need to list them, just briefly describe them so I can find them myself)

While My experience at the controversial Manifest 2024 (and several related posts) were not explicitly about policies or politicians, I think it's largely the underlying political themes that made the discussion so heated.

Manifest was advertised on the Forum, and the controversial speakers were, IIRC, largely advertised and invited guests. Some of the talks were at least adjacent to the objected-to views.

That seems a significantly tighter connection than "someone in EA associated with someone who has some right wing views."

Thanks! Yeah, I thought maybe this was what Larks was referring to. Putting to one side the question of whether that was a valuable discussion or not, I wouldn't put that in the same category as OP's post. The Manifest discussion was about whether an organisation such as Manifest should give a platform to people with views some people consider racist, OP's post is an analysis of the policy platform of a leading candidate in what is arguably the world's most important election. I wouldn't describe the former discussion as 'political' in the same way that I would describe the OP's post. But perhaps others see it differently?

Could you provide examples of political discussions on the EA Forum that appear to have negatively impacted the forum’s environment or impaired its ability to achieve its objectives?

As far as I remember, the political discussions have been quite civilized on the EA Forum. But I think this is because of the policies and culture the EA Forum has. If political discussions were a lot more frequent, the culture and discussion styles could get worse. For example, it might attract EA-adjacent people or even outsiders to fight their political battles on the EA Forum. Maybe this can be solved by hiring additional moderators though.

Also, politics can absorb a lot of attention that would be better spent elsewhere. For example, this post about Trump generated 60 comments, and I am not sure it was worth it.

So you think so far it's mostly been OK? If that's the case, and if it's plausible that high-quality discussions about politics would be valuable, shouldn't we lean towards loosening the policy and seeing what happens? 

Best case, good discussions happen and the forum does a better job of meeting its objective. Worst case, bad discussion happens, but then it should be pretty simple to tighten the policy up and no lasting harm would be done. 

Not sure what to make of it, but one of 80,000 Hours' top recommendations is Government and policy, where it seems like several top careers could require considering which party to work for. I might agree that discussions about whom to vote for are not high priority (although I think Rob Wiblin made a really good point that voting in "swing districts" might be really high value in expectation). That said, there might be a trade-off for many people, perhaps even for EA as a whole, between trying to keep issues that are not yet partisan (like AI, perhaps) non-partisan and using our resources to galvanize a political faction around issues that already are.

The personal blogpost category is pretty clear. On hover it says it covers 'topics that aren't closely related to EA; topics that are difficult to discuss rationally; topics of interest to a small fraction of the forum's readers (e.g. local events)'.

There are very few topics that are as difficult to discuss rationally as US partisan politics. It is very blatantly the kind of topic that tends to destroy the sanity of large swaths of otherwise smart and reasonable people. What topic would be more deserving of "difficult to discuss rationally"?

I am not sure whether I agree with the categorization here, but I don't think there is any hypocrisy or inconsistency in the EA Forum in making this decision.

That might be right -- but then wouldn't it be a major problem for EA if it were unable to discuss rationally one of the most important factors determining whether it achieved its goals? This election is likely to have huge implications not only for how (or whether) the world manages a number of x-risks to a minimally satisfactory extent, but also for many other core EA concerns such as international development, and probably farm animals too (a right-wing politician with a deregulatory agenda, for whom 'animals' is a favourite insult, is scarcely going to have their welfare at heart).

I think "difficult to discuss rationally" and "unable to discuss rationally" are two completely different things, and it's important not to conflate them. It just seems very obviously true that posts on US politics are more likely to lead to drama, fighting, etc. There are definitely EAs who are capable of having productive and civil conversations about politics; I've enjoyed several such conversations, and I find EAs much better at this than most groups. But public online forums are a hard medium for such discussions. And I think the moderating team have correctly labelled such posts as difficult to discuss rationally. Whether you agree with making them less visible is up to you; I personally think it's fairly reasonable.

I disagree that it is “difficult to discuss rationally”. I agree that most discussions of political issues outside this forum are emotionally driven and full of soldier mindset, ad hominem attacks, etc.

But EA forum participants have shown great restraint and depth in discussing a range of sensitive topics (as you acknowledge). I think we could provide a strong example of how this is done.

I guess the filtering should weigh the risk of things getting uncivil against the importance of the topic/area. Hot button social issues and things coming close to personal drama seem to have low importance. Politics seems to be high importance, to me.

It sounds like you agree it's difficult, you just think EA Forum participants will successfully rise to the challenge?

Which, idk, maybe, maybe not; it seems high variance - I'm less optimistic than you. And making things personal blog posts makes them less visible to new forum users (hidden by default, I think?) but not to more familiar users who opt in to seeing personal blog posts, which seems great for higher-quality conversations. So yeah, idk, ultimately the level of filtering here is very mild, and I would guess it's net good.

I need to consider the visibility of the personal blog posts. If they are really ~invisible, one possibility could be combining politics with the community section.

I personally set them to equal visibility with normal posts, so this doesn't matter to me. But I don't know the stats for how many forum users do so. If basically all forum users have them hidden, then I would consider this a stronger form of censorship.

I don't think the benefits would outweigh the enormous costs, no. I think there is space in EA for election discussion, and indeed things like Personal Blogposts are a decent fit for that, as are other spaces that are higher trust (like sessions at EA Global). It's not like this topic is banned; it's just disincentivized, which seems very reasonable to me.

FWIW my impression is that EAG events are not generally considered to be a good fit for election discussion.  

I'm hesitant to support giving the moderators license to decide which discussions of which candidates should get default front-page visibility. There are also possible legal implications to selective elevation of explicitly partisan political content to default visibility in light of EVF US's status as a 501(c)(3) charity and the limitations on partisan political activity that come with that status.

Are there even allegations of selective moderation of political content?

I just commented here on the original post. I mention that we can discuss the issues without discussing the elections themselves, but I think we should not be overly delicate about mentioning links to politicians and parties.

I give some further examples to emphasize the close connection between high-value issues and likely US policies in the coming Trump administration. 

This is what my frontpage looks like: [screenshot]

You probably have customized your feed to show personal blogposts. The default feed has them hidden.

Doesn't look like it. [screenshot of feed settings]

Ah, um, confusingly, the default for personal blogposts is "Hidden", and what you've done is effectively changed your setting for Personal Blogs to "Neutral".

Thanks -- that's odd. The 'elephant' post isn't showing up on mine.
