
lc

122 karma · Joined Nov 2022

Comments (22)

lc · 1y

(1) I don't think there's any evidence that EA is an inherently male activity, and we shouldn't assume such.

There's at least some evidence, in that it's a tradition currently participated in mostly by men. I don't know exactly what you mean by "inherently" or what brand of evidence you're looking for, but whether the cause of the difference in interest is biological or social or whatever isn't really relevant to the discussion. These sorts of gender ratios seem hard to "correct" when it comes to C.S. departments and Magic: The Gathering tournaments, and my guess is that with EA it will be similar. If someone wants to prove me wrong then I'd welcome the attempt.

(2) Even if EA is a male-leaning activity (which I don't necessarily agree with per above), there's a lot of value in finding ways to involve the remaining ~50% of the population, so surely we'd want to find ways to make it less male-leaning on the margin. 

Well, that depends, doesn't it? If "making EA less male-leaning on the margin" means coming up with fewer WELLBYs, then plausibly "making EA less male" means making EA less able to accomplish its goals. 

Often what I've seen academic departments do to attract women into STEM is to exaggerate the interpersonal aspects of a given profession and downplay the nerdy stuff. This ends up being only moderately harmful because the women take the intro classes, decide they're not interested for reasons completely divorced from social expectations, and then choose something different. But when it comes to a charitable organization, downplaying the male-coded activities can become a self-fulfilling prophecy: you succeed in attracting women (and men) who think weaponizing autism into producing good animal suffering metrics is a waste of time, and soon "Effective Altruists" stop thinking animal suffering metrics are worth funding. That sounds pretty bad to me.

(2) ...Thus writing off the idea of being more inclusive to [wo]men seems needlessly dismissive and reductive and leaves a lot of impact and opportunity on the table.

I certainly didn't write off the idea of being more inclusive. There are obviously ways to reduce the incidence of sexual harassment besides modifying the gender ratios of an org. But if gender ratios were a significant part of why the person I replied to saw more sexual harassment, that would be discouraging for all of the reasons I've outlined thus far.

(3) If you care about achieving impact on existential risk, malaria, and even the Jalisco New Generation Cartel, it would be very helpful to have a healthy, robust, and impactful community to work on these problems. Being more inclusive to non-men would improve EA on all three of these axes and thus paying attention to at least some claims currently referred to as "woke" or "leftist" would improve on all three axes.

Again I agree.

lc · 1y

As has been noted many times, EA is currently about 70% male, whilst environmentalism/animal advocacy is majority women. I would be fairly confident that a more balanced gender ratio would mean less misogyny towards women.

My guess is that EA is currently male because aggressively quantifying and measuring charitable giving is an activity that appeals primarily to men. As long as that remains true, and Effective Altruism remains Effective Altruism in that way, my prediction is that the gender ratio will remain the same, just as most hobbies and social groups maintain similar gender ratios over time even when people work really hard to change them. If this form of harassment is inherent to male-dominated activities then that would be pretty sad.

Some EAs have a kind of "anti-woke" sentiment to the point where I actually think it could be fairly damaging e.g. it causes people to think issues related to race, gender, nationality etc aren't important at all. I think it would be pretty valuable if everyone read a few core texts on things like racism, sexism, ableism, etc. to actually understand the every-day experiences of people facing various forms of discrimination and bigotry. 

I'm pretty sure the standard left-American take on everyday harassment is straightforwardly compatible with believing it's not very important in a world with existential risk and malaria and the Jalisco New Generation Cartel, and that this is a sensible position for EAs to hold even when they're not explicitly "anti-woke".

lc · 1y

I'm skeptical by default of accusations of sexual misconduct that don't name the perpetrator, even when the source is anonymous, in 2023. That seems to include most of the accusations here. 

Which is not to say definitively that the piece is untrue - everything in it could very well be accurate - just that, as it stands, the piece is essentially set up to do maximum damage to EA while limiting EA leadership's ability to take productive action against the offenders, and that makes me at least suspicious that events have been distorted.

lc · 1y

It is literally impossible for a charity to follow the law as you have described it, because doing any good whatsoever under your own name opens you up to claims that you have benefited in some material sense, whether that be financially, professionally, or reputationally. Charities are well known to employ people and directly pay them for services rendered, so without being a UK lawyer I don't know what kind of additional context is required for such a case to be prosecutable in practice, but it's certainly a nonzero amount. Granted this sort of completely unpragmatic interpretation of the law, you are probably breaking similar laws yourself, because there is simply too much law to support following a layman's interpretation of all of it.

This is a reality of the world you live in - the one with dozens of countries with hundreds of thousands of pages of law on the books, many designed explicitly to give as much discretion to prosecutors as possible - and so this "attitude" you describe (where people remain willing to do ethical things that prosecutors will not actually prosecute them for) is the only way to live. That is, of course, unless you're willing to single out particularly high-profile organizations - then you can pretty much accuse anybody of breaking the law in some country or another.

lc · 1y

The joke is that if the form of "data collection" you say is illegal under GDPR were actually illegal, my spreadsheet would be too. Indeed, if recording inferences about the character of people you meet were illegal, any employer making a scoring list of candidates to interview would be breaking the law. This indicts virtually every organization above a certain size.

Similarly, under your strict reading of #1, it's probably also true that every charity in the UK is in violation of UK charity law. Most charities (for example) have employees who are paid materially for their work, pay they don't entirely give back to the charity. It's also essentially impossible to avoid "benefitting" entirely, in some way, from successful charity efforts - for example in the form of boosted professional reputation and prestige among a certain class of people. If your reading of UK charity law were correct, then charity in the UK would be largely illegal.

The obviously nonsensical legal implications of this post, combined with the gravity of the accusations, combined with the offensive suggestion that this is "whistleblowing" in any good-faith sense of the word, led me to respond with a joke instead of engaging with it seriously. I do not regret that choice, the downvotes someone applied to my entire profile notwithstanding.

lc · 1y

It seems really clear that the social network of EA played a huge role in FTX's existence, so it seems like you would agree that the community should play some role, but then for some reason you are then additionally constraining things to the effect of some ill-specified ideology

No, I agree with you now that at the very least EA is highly complicit if not genuinely entirely responsible for causing FTX.

I don't think we actually disagree on anything at this point. I'm just pointing out that, if the community completely disbanded and LessWrong shut down and rationalists stopped talking to each other and trained themselves not to think about things in rationalist terms, and after all that AU-Yudkowsky still decided to rob a bank, then there's a meaningful sense in which the Inevitable Robbery was never "the rationality community's" fault even though AU-Yudkowsky is a quintessential member. At least, it implies a different sort of calculus WRT considering the alternative world without the rationality community.

lc · 1y

I don't think it's reasonable to expect them to anticipate this exact scenario, nor do I think they should be spending lots of time planning for tail risk PR scenarios like these instead of being actually productive.

lc · 1y

It seems really quite beyond a doubt to me that FTX wouldn't have really existed without the EA community existing. Even the early funding for Alameda was downstream of a bunch of EA funders.

Yeah, I guess I'm just wrong then. I'm confused as to why I didn't remember reading the bit about Caroline in particular - it's literally on her Wikipedia page that she was an EA at Stanford.

I mean, if Eliezer robbed a bank, I think I would definitely think the rationality community is responsible for a bank robbery (not "LessWrong", which is a website). That seems like the only consistent position by which the rationality community can be responsible for anything, including good things. If the rationality community is not responsible for Eliezer robbing a bank, then it definitely can't be responsible for any substantial fraction of AI Alignment research either, which is usually more indirectly downstream of the core people in the community.

FWIW I still don't understand this perspective, at all. It seems bizarre. The word "responsible" implies some sort of causal relationship between the ideology and the action; i.e., Eliezer + exposure to/existence of the rationalist community --> robbed bank. Obviously AI Alignment research is downstream of rationalism, because you can at least make an argument that some AI alignment research wouldn't have happened if those researchers hadn't been introduced to the field by LessWrong et al. But just because Eliezer does something doesn't mean rationalism is responsible for it, any more than calculus or the scientific method was "responsible" for Isaac Newton's neuroticisms.

It sounds like the problem is you're using the term "Rationality Community" to mean "all of the humans who make up the rationality community" and I'm using the term "Rationality Community" to refer to the social network. But I prefer my definition, because I'd rather discuss the social network and the ideology than the group of people, because the people would exist regardless, and what we really want to talk about is whether or not the social network is +EV.
