James Ozden

3153 karma · Joined Oct 2020

Bio

Currently doing social movement and protest-related research at Social Change Lab, an EA-aligned research organisation I've recently started.

Previously, I completed the 2021 Charity Entrepreneurship Incubation Program. Before that, I was the Director & Strategy Lead at Animal Rebellion and part of the Strategy team at Extinction Rebellion UK, working on movement building for animal advocacy and climate change.

My blog (often EA-related content)

Feel free to reach out on james.ozden [at] hotmail.com or see a bit more about me here

Sequences (1)

The Farm Animal Welfare Newsletter

Comments (194)

How come you think that? Maybe I'm biased from spending lots of time with Charity Entrepreneurship folks, but I feel like I know a bunch of talented and entrepreneurial people who could run projects like the ones mentioned above. If anything, I would say neartermist EA has a better (or at least longer) track record of incubating new projects relative to longtermist EA!

My guess is that this new neartermist-only EA would not have the resources to do a bunch of things which EA currently does--it's not clear to me that it would have an actively maintained custom forum, or EAGs, or EA Funds. James Snowden at Open Phil recently started working on grantmaking for neartermist-focused EA community growth, and so there would be at least one dedicated grantmaker trying to make some of this stuff happen. But most of the infrastructure would be gone.

This paragraph feels pretty over the top. When you say "resources" I assume you mean that neartermist EAs wouldn't have enough money to maintain the Forum, host EAGs, run EA Funds, etc. This doesn't feel that accurate, partly because I don't think those infrastructure examples are particularly resource- or labour-intensive, and partly because I think sufficient money is available to make them happen:

  • Forum: Seems like 1-2 people are working FTE on maintaining the Forum. This doesn't seem like that much at all, and to be frank, I'm sure volunteers could also manage it just fine if necessary (assuming access to the underlying codebase).
  • EA Funds: Again, 1-2 FTE people working on this, so I think this is hardly a significant resource drain, especially since 2.5 of the funds are neartermist.
  • EAGs: Yes, definitely more expensive than the above two bits of infrastructure, but also I know at least one neartermist org is planning a conference (tba) so I don't think this number will fall to 0. More likely it'll be less than it is right now, but one could also reasonably think we currently have more than what is optimally cost-effective.

Overall it seems like you either (1) think neartermist EA has access to very few resources relative to longtermist EA, or (2) think that longtermist EA doesn't have as much direct work to spend money on, so by default it spends a higher % of total funds on movement infrastructure?

For (1): I would be curious to hear more about this, as it seems like, without FTX, the disparities in neartermist and longtermist funding aren't huge (e.g. I think no more than 10x different?). Given that OP / Dustin are the largest funders, and longtermist grantmaking is likely to be around 50% of OP's portfolio, this makes me think differences won't be that large without new longtermist-focused billionaires.

For (2): I think this is largely true, but again I would be surprised if this led to longtermist EA being willing to spend 50x more than neartermist EA (I could imagine a 10x difference). That said, a few million for neartermist EA, which I think is plausible, would cover a lot of core infrastructure.

Funnily enough, I think EA does worse than other communities / movements I'm involved with (grassroots animal advocacy & environmentalism). My partner and other friends (women) have often complained about various sexist issues when attending EA events, e.g. men talking over them, borderline aggressive physical closeness, dismissing their ideas, etc., to the point that they don't want to engage with the community. Experiences like this rarely, if ever, happen in other communities we hang out in. I think there are a few reasons why EA has been worse than other communities in the cases I've seen:

  • I think our experiences differ on animal issues because when groups / movements professionalise, as has been happening over the past decade for animal welfare, the likelihood that men will abuse their positions of power increases dramatically. At the more grassroots level, power imbalances often aren't stark enough to lead to the types of issues that came out in the animal movement a few years back. EA has also been undergoing this professionalisation and consolidation of power, and it seems like the article above highlights the negative consequences of that.
  • As has been noted many times, EA is currently about 70% male, whilst environmentalism/animal advocacy is majority women.  I would be fairly confident that a more balanced gender ratio would mean less misogyny towards women. 
  • Some EAs have a kind of "anti-woke" sentiment to the point where I actually think it could be fairly damaging, e.g. it causes people to think issues related to race, gender, nationality, etc. aren't important at all. I think it would be pretty valuable if everyone read a few core texts on things like racism, sexism, ableism, etc. to actually understand the everyday experiences of people facing various forms of discrimination and bigotry.

Could you clarify your last paragraph I quoted then? I'm genuinely unsure why you used the word “allegedly” if you do believe that far-right ideas have caused large amounts of harm.

I also wasn’t clear on what you meant by second or third-hand in this context, so clarifying that would also help me understand your position better.

I'm confused by this - you say: 

"the Blank Slate doctrine that all individuals must have exactly equal abilities, preferences, & values, and any empirical evidence challenging this doctrine must be instantly slandered)"

But when I look up Blank Slate doctrine on Google - I find nothing remotely related to this claim. Instead I see a lot of something like this:

"According to blank slate theory, the mind is completely blank at birth. From there, education, environment, and experiences – which are external, as well as material and/or immaterial – shape the child's process of development." 

Also it's not clear to me what you mean by "far-left" - do you have more specific labels in mind? I consider myself fairly left-wing but have never heard of this doctrine, and highly doubt that my even more lefty friends would endorse anything like your claim above. 

Examples include every 'idealistic' communist regime that degenerated into a totalitarian nightmare, internal genocide, & surveillance state.

In this, you're only considering far-left regimes that are highly state-controlled, rather than societies that are both libertarian and left-wing (e.g. anarchistic ones). If you look for these examples, you might actually find things are going pretty well internally (e.g. the Zapatistas or Rojava) and that these societies don't seek to eradicate sub-groups of the population, a goal which is pretty uniform across far-right ideologies.

trying to tally up the relative historical harms that allegedly resulted, usually second- or third-hand, from holding certain views [emphasis mine].

Can you clarify what you mean here? I'm trying to be charitable, but it seems like you're trying to cast doubt on the fact that far-right ideologies have caused harm to people, or to diminish the harm that has been caused. I would appreciate you specifying exactly what you meant, as this could easily be interpreted as a pretty reprehensible view.

This seems too simple. The email was going to be made public, which would (and did) cause harm to many people, so an apology could try to mitigate the harm of his words being shared in 2022. His apology largely failed on this point (in my opinion).

Leaving aside some object-level stuff about Bostrom's views, I still think the apology could be much better without any dishonesty on his part. This is somewhat subjective but things that I think could have been better:

  • Don't frame the apology at the beginning as almost purely instrumental, i.e. along the lines of "I will get smeared soon, so I want to get ahead of the game". This makes everything come across as less genuine.
  • "What about eugenics? Do I support eugenics? No, not as the term is commonly understood." - This is just not a useful thing to mention in an apology about racism, or at least, not in this way. Usually, if someone says "Don't think of an elephant" then you do think of an elephant. The consequence is now people are probably more likely to think there is a link between Bostrom and eugenics than if this was written differently.
  • And some other points that Habiba mentioned in her post e.g. "I am deeply uncomfortable with a discussion of race and intelligence failing to acknowledge the historical context of the ideas’ origin and the harm they can and have caused."

In my opinion it just highlights some basic misunderstandings about communication and our society today, which (I think) was proven by the fairly widespread negative backlash to this incident.

Responding to one point only: I think more poorly of Bostrom as a “grand strategist for humanity” if he thinks the apology he wrote was the best one he honestly could have written, given the large backlash and potential impact on his work and EA. Seems like quite a large mistake IMO (especially if the investigation now launched by Oxford leads to him losing his post).

Link to article mentioning the investigation: https://www.thetimes.co.uk/article/blacks-more-stupid-than-whites-wrote-oxford-don-8gsj8l0wf

edit: added the word honestly to be clearer

- this New Yorker piece, with Zoe explaining "My recommendations were not intended to catch a specific risk, precisely because specific risks are hard to predict" but still saying ... "But, yes, would we have been less likely to see this crash if we had incentivized whistle-blowers or diversified the portfolio to be less reliant on a few central donors? I believe so."

To be fair, this seems like a reasonable statement on Zoe's part:

  • If we had incentivised whistle-blowers to come forward about shady things happening at FTX, would we have known about the FTX fraud sooner and been less reliant on FTX funding? Very plausibly yes. She says "likely", which is obviously not particularly specific, but this would fit my definition of likely.
  • If EA had diversified our portfolio to be less reliant on a few central donors, this would also (quite obviously) have meant the crash had less impact on EA overall, so this also seems true.

Basically, as other comments have stated, you do little to actually say why these proposed reforms are, as you initially said, bad or would have no impact. I think if you're going to make a statement like:

"it seems clear proposed reforms would not have prevented or influenced the FTX fiasco" 

You need to actually provide some evidence or reasoning for this, as clearly lots of people don't believe it's clear. Additionally, it feels unfair to call Zoe "not epistemically virtuous" when you're making quite bold claims without any reasoning laid out, and then saying it would be too time-intensive to explain your thinking.

For example, you say here that you're concerned about what democratisation actually looks like, which is a fair point and a useful object-level argument, but this seems more like a question of implementation rather than a sign that the actual idea is necessarily bad.
