
There is much controversy over political and cultural disputes, the issues frequently labeled 'culture wars'. The idea of resolving these disputes (researching, debating, and coming to an informed position on them) seems to be frequently, and quite correctly, regarded as intractable and not really worthwhile for serious EAs.

But it might be useful to narrow our scope from all of society down to the local environment immediately surrounding the EA movement. In this context, we have robust, direct information on how political and cultural ideologies are variously beneficial or harmful for the things we understand. Thus, we can (hopefully) get a narrow window of observations that are broadly agreed upon. The scope will be limited, but the reliability will be high.

(There is another thing we can look at, which is the benefits or harms from people's political ideologies within the EA movement, but I won't touch that because it's clearly not going to be robust and uncontroversial.)

Using this framework, first let's list the local impacts of left-wing politics:

Impacts on the EA movement:

  • Perhaps the most common criticism of EA is that the movement does not collectively align with radical anticapitalist politics
  • Other people object to EAs taking careers in high finance and consulting, based on the idea that these careers are a major part of the market economy and therefore are immoral
  • A college group of disability activists disrupted an Effective Altruism event at University of Victoria
  • An autistic rights activist condemned EA by alleging incompatibility between cost-benefit analysis and disability rights
  • Key EA philosopher Peter Singer has been viewed negatively by left-wing academia after taking several steps to promote freedom of speech (Journal of Controversial Ideas, op-ed in defense of Damore)
  • Key EA philosopher Peter Singer was treated with hostility by left-wing people for his argument on sex with severely cognitively disabled adults
  • The EA movement has been viewed negatively by left-wing people due to some overlap with the forum LessWrong and its members, to which they are heavily hostile, partially for political reasons

Impacts on poverty relief:

  • Less support for GiveWell recommendations and similar efforts, in order to focus on political activism

Impacts on animal welfare:

  • Left-wing narratives tend to push more support for animal rights and, to a lesser extent, animal welfare. This part is good, though it is not pursued with nearly the same vigor as left-wing narratives on other matters of social justice

Impacts on existential risk:

  • Useful x-risk researchers, organizations and ideas are frequently viewed negatively by leftists inside and outside academia, due to association with the forum LessWrong and sometimes for direct hostility to their political views
  • Google dissolved its AI ethics board due mainly to hostility against its most right-wing member

Overall this looks bad; almost all impacts are negative. Moving on to right-wing politics:

Impacts on the EA movement:

  • There seems to be some implicit dismissal of EA as being too "blue-tribe" and liberal, leading conservatives to be uninterested in the movement from the outset. This is not easily observed, but seems to be the best explanation for the lack of conservative uptake and interest
  • Peter Singer has been treated with hostility by traditional conservatives for his arguments on after-birth abortion and zoophilia
  • MacAskill's interview with Joe Rogan provoked hostility from viewers because of an offhand comment/joke he made about Britain deserving punishment for Brexit
  • William MacAskill received pushback from right-wing people for his argument in favor of taking refugees

Impacts on poverty relief:

  • There is opposition to GiveWell charities based on the idea that the recipients of GiveWell aid are low-IQ and overpopulating
  • There is some attachment to charities which help one's own country or town, rather than global charities like GiveWell recommendations

Impacts on animal welfare:

  • General opposition, even hostility to animal rights and welfare

Impacts on existential risk:

  • None yet, that I can think of

With right-wing politics, again we see a consistent trend where political culture interferes with the immediate context of Effective Altruism.

Left-wing political culture seems to be a deeper, more pressing source of harm. However, if we are trying to judge how good/bad ideologies are in a general sense, this judgment has less external validity because it depends on where EA and its projects are located in geographic, cultural and economic space.

One might wonder, to be fair: are there any cases where people's political apathy or centrism damages their relationship with Effective Altruism? As far as I can tell, there are none.

So in summary, while broader questions of political culture may be too difficult to answer with any reasonable amount of research labor, we have one cluster of clear data points telling us that relatively moderate political culture (i.e., between the American political mainstreams) or political apathy is most beneficial. This post is not meant to invalidate or debunk broader arguments about political culture writ large. However, it should be particularly interesting to those who prefer robust evidence over naive expected-value estimates.


An implicit problem with this sort of analysis is that it assumes the critiques are wrong, and that the current views of Effective Altruism are correct.

For instance, if we assume that systemic change towards anti-capitalist ideals actually is correct, or that taking refugees does actually have long run bad effects on culture, then the criticism of these views and the pressure on the community from political groups to adopt these views is actually a good thing, and provides a net-positive benefit for EA in the long term by providing incentives to adopt the correct views.

My understanding of how EA typically responds to anti-capitalist critiques of EA:

  • EAs are very split on capitalism, but a significant minority aren't fans of it, and the majority think (very) significant reforms/regulations of the free market in some form(s) are justified.
  • The biggest difference on economics between EA and left-wing political movements is that EA sees worldwide market liberalization as a main, or the main, source of increasing quality of life and material standard of living, and of a decrease in absolute global poverty unprecedented in human history, over the last several decades. So EAs are likelier to have confidence in free(r) market principles as fundamentally good than most other left-leaning crowds.
  • Lots of EAs see their participation in EA as the most good they can do with their private/personal efforts, and often they're quite active in politics, often left-wing politics, as part of the good they do as their public/political efforts. So, while effective giving/altruism is the most good one can do with some resources, like one's money, other resources, like one's time, can be put towards efforts aimed at systemic change. Whenever I've seen this pointed out, the distinction has mysteriously always been lost on anti-capitalist critics of EA. If there is a more important and different point they're trying to make, I'm missing it.
  • A lot of EAs make the case that the kind of systemic change they are pursuing is what they think is best. This includes typical EA efforts, like donating to GiveWell-recommended charities. The argument is that these interventions are based on robust empirical evidence and are so demonstrably cost-effective that they improve the well-being of people in undeveloped or developing countries, and their subsequent ability to autonomously pursue systemic change in their own societies. There are also a lot of EAs focused on farm animal welfare who believe it is the most radically important form of systemic change they can focus on. As far as I'm aware, there are no existing significant or prominent public responses to these arguments from a left-wing perspective. Any such sources would be appreciated.
  • A lot of anti-capitalist criticism of EA is about how it approaches the eradication of extreme global poverty. In addition to not addressing EA's arguments for how their current efforts aim at effecting systemic change in the world's poorer/poorest countries, anti-capitalist critics haven't offered up much in the way of concrete, fleshed-out, evidence-based approaches to systemic change that would motivate EA to adopt them.
  • Anti-capitalist critics are much likelier than EA to see the wealth redistributed through private philanthropy as having been accumulated unjustly and/or through exploitative means. Further, they're likelier to see relative wealth inequality within a society as a fundamentally more important problem, and thus see directly redressing it as a fundamentally higher priority, than most of the EA community. Because of these different background assumptions, they're likelier to perceive EA's typical approaches to doing the most good as insufficiently supportive of democracy and egalitarianism. As a social movement, EA is much more like a voluntary community of people who contribute resources privately available to them than it is a collective political effort. A lot of EAs are active in political activity aimed at systemic change, publicly do so as part and parcel of their EA motivations, and not only tolerate but actively encourage public organization and coordination of these efforts among EAs and other advocates/activists. That anti-capitalist critics haven't responded to these points seems to hinge on how they haven't validated the distinction between use of personal/private resources and public/political resources.

There isn't much more EA can do to respond to anti-capitalist critics until anti-capitalist critics broach these subjects. The ball is in their court.

I was trying to figure out why I dislike this post so much, and I think this is why - the assumption that people in EA are correct and everyone else is incorrect, combined with a lack of depth when explaining why certain topics are criticized, and missing several important critiques. (Normally I don't mind incomplete posts, but I think the tone combined with the list not being very good really bothered me.)

the assumption that people in EA are correct and everyone else is incorrect

This is a misunderstanding. Perhaps you might re-read the OP more carefully?

missing several important critiques.

Feel free to add to the list.

I would take your response more seriously if you hadn't told everyone who commented that they had misunderstood your post.

If everyone's missing the point, presumably you should write the point more clearly?

I just assume that EAs are correct about the EA things that we are doing. Of course that is a rational assumption to make. Otherwise you are just throwing yourself into a pit of endless self-doubt. It does not need to be argued that EAs know best about EA, just as it does not need to be argued that climatologists know best about the climate, psychologists know best about psychology and so on.

I think this is only true with a very narrow conception of what the "EA things that we are doing" are. I think EA is correct about the importance of cause prioritization, cause neutrality, paying attention to outcomes, and the general virtues of explicit modelling and being strategic about how you try to improve the world.

That's all I believe constitutes "EA things" in your usage. Funding bednets, or policy reform, or AI risk research, are all contingent on a combination of those core EA ideas that we take for granted with a series of object-level, empirical beliefs, almost none of which EAs are naturally "the experts" on. If the global research community on poverty interventions came to the consensus "actually we think bednets are bad now" then EA orgs would need to listen to that and change course.

"Politicized" questions and values are no different, so we need to be open to feedback and input from external experts, whatever constitutes expertise in the field in question.

I think EA is correct about the importance of cause prioritization, cause neutrality, paying attention to outcomes, and the general virtues of explicit modelling and being strategic about how you try to improve the world

Yes, and these things are explicitly under attack from political actors.

Funding bednets, or policy reform, or AI risk research, are all contingent on a combination of those core EA ideas that we take for granted with a series of object-level, empirical beliefs, almost none of which EAs are naturally "the experts" on

When EAs are not the experts, EAs pay attention to the relevant experts.

"Politicized" questions and values are no different, so we need to be open to feedback and input from external experts

This is not about whether we should be "open to feedback and input". This is about whether politicized stances are harmful or helpful. All the examples in the OP are cases where I am or was, in at least a minimal theoretical sense, "open to feedback and input", but quickly realized that other people were wrong and destructive. And other EAs have also quickly realized that they were being wrong and destructive.

An implicit problem with this sort of analysis is that it assumes the critiques are wrong,

We know them to be wrong in basic logical terms as attacks against EA - none of these things require that EA itself change or die, just cause areas or other ideas within EA. This point has been made repeatedly to the point of consensus.

For instance, if we assume that systemic change towards anti-capitalist ideals actually is correct, or that taking refugees does actually have long run bad effects on culture, then the criticism of these views and the pressure on the community from political groups to adopt these views is actually a good thing, and provides a net-positive benefit for EA in the long term by providing incentives to adopt the correct views

You missed the point of the post. I'm making no judgment on whether e.g. anticapitalism or refugees are good or bad. If you do that, then you're already playing the game of making sweeping judgments about society writ large, which I'm not doing. I'm simply analyzing the direct impact on EA capital.

Internal debate within the EA community is far better at reaching truthful conclusions than whatever this sort of external pressure can accomplish. Empirically, it has not been the case that such external pressure has yielded benefits for EAs' understanding of the world.

Internal debate within the EA community is far better at reaching truthful conclusions than whatever this sort of external pressure can accomplish. Empirically, it has not been the case that such external pressure has yielded benefits for EAs' understanding of the world.

It can be the case that external pressure is helpful in shaping directions EVEN if EA has to reach conclusions internally. I would put forward that this pressure has been helpful to EA already in reaching conclusions and finding new cause areas, and will continue to be helpful to EA in the future.

I haven't seen any examples of cause areas or conclusions that were discovered because of political antipathy towards EA. The limiting factor is robust evidence and analysis of cause areas.

I haven't seen any examples of cause areas or conclusions that were discovered because of political antipathy towards EA.

Veganism is probably a good example here. Institutional decisionmaking might be another. I don't think that political antipathy is the right way to view this, but rather just the general political climate shaping the thinking of EAs. Political antipathy is a consequence of the general system that produces both positive effects on EA thought, and political antipathy towards certain aspects of EA.

Veganism is probably a good example here.

Who has complained that EA is bad because it ignored animals? EAs pursued animal issues of their own volition. Peter Singer has been the major animal rights philosopher in history. Animal interests are not even part of the general political climate.

Institutional decisionmaking might be another.

Looking at 80k Hours' writeup on institutional decision making, I see nothing with notable relevance to people's attacks on EA. EAs have been attacked for not wanting to overthrow capitalism, not wanting to reform international monetary/finance/trade institutions along the lines of global justice, and funding foreign aid that acts as a crutch for governments in the developing world. None of these things have a connection to better institutional decision making other than the mere fact that they pertain to the government's structure and decisions (which is broad enough to be pretty meaningless). 80k Hours is looking at techniques on forecasting and judgment, drawing heavily upon psychology and decision theory. They are talking about things like prediction markets and forecasting that have been popular among EAs for a long time. There are no citations and no inspirations from any criticisms.

The general political climate does not deal with forecasting and prediction markets. The last time it did, prediction markets were derailed because the general political climate created opposition (the Policy Analysis Market in the Bush era).

It's possible I'm wrong. I find it unlikely that veganism wasn't influenced by existing political arguments for veganism. I find it unlikely that a focus on institutional decision making wasn't influenced by the existing political zeitgeist around the problems with democracy and capitalism. I find it unlikely that the global poverty focus wasn't influenced by the existing political zeitgeist around inequality.

All this stuff is in the water supply; the arguments and positions have been refined by different political parties' moral intuitions and battles with the opposition. This causes problems when there's opposition to EA values, sure, but it also provides the backdrop from which EAs reason.

It may be that EAs have somehow thrown off all of the existing arguments, cultural milieu, and basic stances and assumptions that have been honed for the past few generations, but that to me represents more of a failure of EA if true than anything else.

I find it unlikely that veganism wasn't influenced by existing political arguments for veganism.

I find it obvious. What political arguments for veganism even exist? That it causes climate change? Yet EAs give more attention to the suffering impacts than to the climate impacts.

I find it unlikely that a focus on institutional decision making wasn't influenced by the existing political zeitgeist around the problems with democracy and capitalism.

The mere idea that "there are problems with democracy and capitalism" is relatively widespread, not unique to leftism, and therefore doesn't detract from my point that relatively moderate positions (which frequently acknowledge problems with democracy and capitalism) have better impacts on EA than extreme ones. The leftist zeitgeist is notably different and even contradictory with what EAs have put forward, as noted above.

I find it unlikely that the global poverty focus wasn't influenced by the existing political zeitgeist around inequality.

People have focused on poverty as a target of charity for millennia, and people who worry about inequality (as opposed to worrying about poverty) are more resistant to EA ideas and demands.

it also provides the backdrop from which EAs reason.

There is an opportunity cost in not having a better backdrop. Even in a backdrop of political apathy, there would not be less information or fewer ideas (broadly construed) in the public sphere, just different ones, presented differently.

There is an opportunity cost in not having a better backdrop.

Seems plausible.

[Right-wing] Impacts on existential risk:

None yet, that I can think of

I'd add: "A general disbelief in the possibility of AGI/TAI due to theological convictions on the nature of sentience and intelligence."

As someone who actively promotes EA in a school, we definitely do get far more pushback from left-wing (esp. socialist) students than right-wing ones. In fact, from my fundraising experience, I'm 60% confident that conservative students are more financially generous towards EA per capita than liberal ones.

A problem with this post is that its conclusion that the "left" poses more "risk" is based on the number of individual perceived objections from the left. However, even if this were true, this conflates the number of separate issues with a measure of the overall "magnitude" of risk, without taking into account the number of people complaining based on each objection and/or the "intensity"/impact of their complaints, which, as Halffull points out, could in any case even be a positive impact if they're identifying a real problem with EA.

I don't want to be overly pedantic, but there are also inconsistencies in this post which make its conclusion, even about the number of objections, appear stronger than it is. The total number of objections from the left is inflated by the separate listing of several closely related criticisms of Peter Singer (autism rights, disability rights, and others). In contrast, in the "problems with the right wing" section, the similar abortion-related and zoophilia objections are listed in the same point. This inconsistency increases the apparent number of left-wing objections, a number on which the author then bases their conclusion.

I also think that research is lacking, as a recent podcast (80000 Hours? Someone help me if you can remember; it's really hard to search content on podcasts) suggested that the rise of extreme right-wing populist nationalist politics is creating risks in the nuclear warfare space.

Another thing is that the EA survey consistently suggests that most EAs are left-wing. Anecdotally, most of those I know seem to be reformist, centre-left. Both the statistics and my experience suggest that the centre left, perhaps those who are disillusioned with more extreme leftist positions such as proposals for revolutionary communism, may be a significant source of people coming into EA -- often bringing with them motivation, experience of community organising and other useful skills.

I think, for good or bad, EA is much more vulnerable to pressure from the left-wing because the institutions we interface with and the locations where most EAs are based lean that way.

A problem with this post is that its conclusion that the "left" poses more "risk" is based on the number of individual perceived objections from the left. However, even if this were true, this conflates the number of separate issues with a measure of the overall "magnitude" of risk, without taking into account the number of people complaining based on each objection and/or the "intensity"/impact of their complaints

I am aware of the magnitudes of costs and that is what drives the overall judgment. Your assumption that I'm only thinking about the # of points is incorrect. In my experience the severity and intractability of leftist hostility is systematically worse than right-wing hostility, I just did not go into detail about that in the OP. In any case, this judgment is secondary and not central to the post.

I also think that research is lacking, as a recent podcast (80000 Hours? Someone help me if you can remember; it's really hard to search content on podcasts) suggested that the rise of extreme right-wing populist nationalist politics is creating risks in the nuclear warfare space.

I'd say the EA movement does not have a refined, close engagement and understanding of nuclear deterrence/safety in the same way that it does for its top 3 cause priorities. This becomes a more generic judgment on whose political ideology is better in general; we could just as easily judge people on the full variety of political issues. Which of course is a valid project, but beyond the scope of this post.

Another thing is that the EA survey consistently suggests that most EAs are left-wing. Anecdotally, most of those I know seem to be reformist, centre-left. Both the statistics and my experience suggest that the centre left, perhaps those who are disillusioned with more extreme leftist positions such as proposals for revolutionary communism, may be a significant source of people coming into EA

Sure, but that's within the space of relatively moderate politics as I explicitly define it.

Anecdotally, I'd say I know several EAs who have shifted in the last few years from libertarianism or liberalism to conservatism, and some of them have been willing to be vocal about this in EA spaces. However, just as many of them have exited EA because they were fed up with not being taken seriously. I'd estimate that of the dozens of EAs I know personally quite well, and the hundreds I'm more casually familiar with, 10-20% would count as 'conservative,' or at least 'right-of-centre.' Of course, this is a change from what was before apparently zero representation for conservatives in EA. Unfortunately, I can't provide more info, as conservatives in EA are not wont to publicly discuss their political differences with other EAs, because they don't feel like their opinions are taken seriously or respected.

Upvoted for starting an interesting and probing conversation. I do have several nitpicks.

Perhaps the most common criticism of EA is that the movement does not collectively align with radical anticapitalist politics

Maybe I've just stopped paying attention to basic criticisms of EA along these lines, because every time the best responses from EA to these criticisms were produced in an attempt at good-faith debate, the critics apparently weren't interested in a serious dialogue that could actually change EA. Yet in the last couple of years, while the absolute amount of anticapitalism has increased, I've noticed less criticism of EA on the grounds that it's not anticapitalist enough. I think EA has begun to have a cemented reputation as a community that is primarily left-leaning, and certainly welcomes anticapitalist thought, but won't on the whole mobilize towards anticapitalist activism, at least until anticapitalist movements themselves produce effective means of 'systemic change.'

An autistic rights activist condemned EA by alleging incompatibility between cost-benefit analysis and disability rights

I'm skeptical that friction between EA and actors who misunderstand so much has consequences bad enough to worry about, since I don't expect the criticism would be taken seriously enough by anyone else to have much of an impact at all.

Key EA philosopher Peter Singer has been viewed negatively by left-wing academia after taking several steps to promote freedom of speech (Journal of Controversial Ideas, op-ed in defense of Damore)
Key EA philosopher Peter Singer was treated with hostility by left-wing people for his argument on sex with severely cognitively disabled adults
Peter Singer has been treated with hostility by traditional conservatives for his arguments on after-birth abortion and zoophilia

I'm also concerned about the impact of Singer's actions on EA itself, but I'd like to see more focused analysis exploring what the probable impacts of controversies around Singer are.

MacAskill's interview with Joe Rogan provoked hostility from viewers because of an offhand comment/joke he made about Britain deserving punishment for Brexit
William MacAskill received pushback from right-wing people for his argument in favor of taking refugees

Ditto my concerns about controversies surrounding Singer for Will as well, although I am generally much less concerned with Will than Singer.

Useful x-risk researchers, organizations and ideas are frequently viewed negatively by leftists inside and outside academia

I know some x-risk reducers who think a lot of left-wing op-eds are beginning to create a sentiment in some relevant circles that a focus on 'AI alignment as an existential risk' is a pie-in-the-sky, rich techie white guy concern about AI safety, and that more concern should be put on how advances in AI will affect issues of social justice. The concern is that diverting the focus of AI safety efforts away from AGI as an existential risk and toward what are perceived as more parochial concerns could be grossly net negative.

Impacts on existential risk:
None yet, that I can think of

Depending on what one considers an x-risk, popular support for right-wing politicians who pursue counterproductive climate or other anti-environmental policies, or who tend to be more hawkish, jingoistic, and nationalistic in ways that will increase the chances of great-power conflict, negatively impacts x-risk reduction efforts. It's not clear that this has a direct impact on any EA work focused on x-risks, though, which is the kind of impact you meant to assess.

Left-wing political culture seems to be a deeper, more pressing source of harm.

I understand you provided a caveat, but I think this take still misses a lot.

  • If you asked a lot of EAs, I think most of them would say right-wing political culture poses a deeper potential source of harm to EA than left-wing political culture. Left-wing political culture is only a more pressing source of harm because EA is disproportionately left-leaning, so the social networks EAs run in, and thus decision-making in EA, are more likely to be currently impacted by left-wing political culture.
  • It misses nuance in what counts as 'left-wing political culture,' especially in Anglo-American discourse, as the left-wing landscape is rapidly and dramatically shifting. While most EAs are left-leaning, and a significant minority would identify with the socialist/radical/anti-capitalist/far-left basket, a greater number, perhaps a plurality, would identify as centre-left/liberal/neoliberal. From the political right, and from other angles, both these camps are 'left-wing.' Yet they're sufficiently different that when accuracy matters, as it does regarding EA, we should use more precise language to differentiate between centre-left/liberal and radical/anticapitalist/far-left 'left-wing political culture.' For example, in the U.S., it currently seems the 'progressive' political identity can apply to everyone from a neoliberal to a social democrat to a radical anticapitalist. On leftist forums I frequent, liberals are often labelled 'centrists' or 'right-wing,' and are perceived as having more in common with conservatives and moderates than with anti-capitalists.
  • Anecdotally, I would say the grassroots membership of the EA movement is more politically divergent, less moderate, and generally "to the left" of flagship EA organizations/institutions, in that I talk to a lot of EAs who feel EA is generally still too far to the right for their liking, and who actually agree with left-wing critics and wish EA were much more in line with the changes they would demand of us.

I'm skeptical that friction between EA and actors who misunderstand so much has consequences bad enough to worry about, since I don't expect the criticism would be taken seriously enough by anyone else to have much of an impact at all.

Assuming that one cares about their definition of "disability rights" - i.e., disabled people have a right to lots of healthcare and social services, and any de-emphasis for the sake of helping more able people is a violation - their criticism and understanding of EA are correct. In the public eye it's definitely catchy; this sort of suspicion of utilitarian cost-benefit analysis runs deep. Some weeks ago the opinion journalist Dylan Matthews mentioned that he wanted to write an article about it, and I expect that he would give a very kind platform to the detractors.

Depending on what one considers an x-risk, popular support for right-wing politicians who pursue counterproductive climate or other anti-environmental policies, or who tend to be more hawkish, jingoistic, and nationalistic in ways that will increase the chances of great-power conflict, negatively impacts x-risk reduction efforts. It's not clear that this has a direct impact on any EA work focused on x-risks, though, which is the kind of impact you meant to assess.

Right, for that broad sort of thing, I would direct people to my Candidate Scoring System: https://1drv.ms/b/s!At2KcPiXB5rkvRQycEqvwFPVYKHa

This post seems somewhat misleading to me. The main question I have is: how are 'left-wing' and 'right-wing' being defined? The post seems to be defining left-wing as extreme radical anti-capitalism. As another commenter noted, if we define left-wing more broadly to include centre-left political culture, then EA seems quite symbiotic with left-wing culture.

How are we defining right-wing? The post seems to be defining right-wing as some kind of centre(ish)-right political culture. If we define right-wing more broadly to include extreme right-wing culture, which includes extreme nationalist views and an overall hostility to foreign aid, then right-wing political culture seems much less friendly to EA. EDIT: The OP does include these points, but seems to minimize them; i.e., extreme right-wing culture on my understanding does not have "some attachment" to local charities, but thinks that people in "shithole" countries deserve to be in the condition they are, and would actively oppose helping these people through foreign aid.

This post seems to be defining left-wing as extreme anti-capitalism while excluding moderate left-wing culture, and defining right-wing in a way that excludes extreme right-wing views. The result is that left-wing culture appears more hostile than right-wing culture.

If we are comparing extreme left-wing views with less extreme right-wing views, then this seems plausible; but if we compare extreme left-wing culture with extreme right-wing culture, then this is much less clear, and if we compare left-wing culture broadly defined with right-wing culture broadly defined, then EA seems more at home on the left, as shown by the fact that most EAs are left-leaning.

The post seems to be defining left-wing as extreme radical anti-capitalism.

Lots of the examples are not anti-capitalist (at least, not necessarily, though there may be overlap).

The post seems to be defining right-wing as some kind of centre(ish)-right political culture.

No, it just seems to be the case that far-right views are relatively rare in EA's local environment, perhaps rarer more generally, so I haven't got any examples. I'd definitely agree that far-right views would do more damage to the EA movement than more moderate ones (person for person), but it is a bit speculative as there has not been as much interaction. I already stated that the observation that EA suffers more harm from the left wing than from the right wing has more to do with the local context than the ideologies themselves, so I'm not sure what there is left to say.

extreme right-wing culture on my understanding does not have "some attachment" to local charities, but thinks that people in "shithole" countries deserve to be in the condition they are,

I have not seen any reason to believe this, though they do oppose foreign aid for more instrumental reasons.

EA seems more at home on the left

My main point is not about left-wing vs. right-wing; it's about moderation vs. extremism. Between the political mainstreams, center-left and center-right views are equally fine for EA. I don't think that splitting hairs over which kind of extremism is worse is important.

I agree with the commenters that it is worthwhile keeping in mind that some of the political pressure may in fact be correct, but I also feel that this post is valuable because it helps highlight the kinds of pressures that we are subject to.

I agree with the commenters that it is worthwhile keeping in mind that some of the political pressure may in fact be correct,

The motivating ideology of the pressure may be right or wrong, but that's beside the point. The scope of this post is simply to look at the objective costs and benefits on the EA sphere, because that is much more robust than people's opinions on whose ideology is correct in a broader sense.
