Jenny K E

401 · Joined Mar 2022

Comments (20)

The gender identity question includes options that aren't mutually exclusive; I believe it should either be a checkbox question or should list something along the lines of "cisgender woman, transgender woman, cisgender man, transgender man, nonbinary, other." If you have more questions, feel free to PM me and I'm happy to do my best (as an ally) to answer them.

As someone in a somewhat similar position myself (donating to Givewell, vegetarian, exploring AI safety work), this was nice to read. Diversifying is a good way to hedge against uncertainty and to exploit differing comparative advantages in different aspects of one's life.

Kelsey clarified in a tweet that if someone asks for a conversation to be off the record and she isn't willing to keep it off the record, she explicitly tells them so.

Presumably he made some unfounded assumptions about how sympathetic she'd be and whether she'd publish the screenshots, but never asked her not to.

[ETA: Whoops, realized this is answering a different question than the one the poster actually asked -- they wanted to know what individual community members can do, which I don't address here.]

Some concrete suggestions:

-Mandatory trainings for community organizers. This idea is lifted directly from academia, which often mandates trainings of this sort. The professional versions are often quite dumb and involve really annoying unskippable videos; I think a non-dumb EA version would encourage the community organizer to read the community health guide linked in the above post and then require them to pass a quiz about its contents (though if they can pass the quiz without reading the guide, that's fine -- the point is to check that they understand the guide's contents, not to force them to read it). Better but higher-effort/more costly versions of the quiz would involve short answer questions ("How would you respond to X situation?"), which force people to think harder about their answers but probably require a team of people to read all the responses and check whether they're appropriate; less useful but lower-effort versions would involve multiple choice questions.

-Some institution (the community health team? I dunno) should come up with and institute codes of conduct for EA events and make sure organizers know about them. There'd presumably need to be multiple codes of conduct for different types of events. This ties into the previous bullet, since it's the sort of thing you'd want to make sure organizers understand. This is a bit of a vague and uninformed proposal -- maybe something like this already exists, though if so I don't know about it, which at minimum implies that it ought to be more widely advertised.

-Maybe a centralized page of resources for victims and allies, with advice, separate from the code of conduct? I don't know how useful this would be.

-Every medium/large EA event/group should have a designated community health point person, preferably though not necessarily female, who publicly announces that if someone makes you uncomfortable you can talk to the point person and, with your permission, they'll do what's necessary to help -- and who then follows through if people do report issues to them. They should also remind/inform everyone of Julia Wise's role, and, if someone comes to them with an issue and gives permission to pass it on to her and her team, do that. (You might ask: if this point person is probably just gonna pass things on to Julia Wise, why even have a point person? The answer is that reporting is scary, and it can be easier to report to someone you know who has some context on the situation/group.)

Furthermore, making these things happen has to explicitly be someone's job, or the job of a group of someones. It's much likelier to actually happen in practice if it is someone's specific responsibility than if it's just an idea some people talk about on the Forum.

Something I don't think helps much: trying to tell all EAs that they should improve their behavior and stop being terrible. This won't work because, unfortunately, self-identifying EAs aren't all cooperative, nice individuals who care about not personally harming others. They have no incentive to change just because someone tells them to, and the worse offenders on these sorts of issues are also very unlikely to be the sorts of people who want to read posts like this one about how to do better. That said, I think the more helpful posts on this subject are the ones that include lots of specific examples or advice, especially advice for bystanders.

I think maybe the balance I'd strike here is as follows: we always respect nonintervention requests by victims. That is, if the victim says "I was harmed by X, but I think the consequences of my reporting this should not include consequence Y," then we avoid intervening in ways that will cause Y. This is good practice generally, because you never want to disincentivize reporting by making it so that reporting has consequences the victim doesn't want. Usually the unwanted consequences in question are things like "I'm afraid of backlash if someone tells X that I'm the one who reported them" or "I'm just saying this to help you establish a pattern of bad behavior by X, but I don't want to be involved in this, so don't do anything based on my report alone." But this sort of nonintervention request might also come from victims whose point of view is "I think X is doing really impactful work, and I want my report to at most limit their engagement with EA in certain contexts (e.g., situations where they have significant influence over young EAs), not to limit their involvement in EA generally." In other words, leave the impact considerations to the victim's own choice.

I'm not sure this is the right balance. I wrote it with one specific real example from my own life in mind, and I don't know how well it generalizes. But it does seem to me that any position less victim-friendly than this would probably be worse even from a completely consequentialist perspective, because of the likelihood of driving victims away from EA.

Yes, I agree with what you've written here. "This comes from a place of hurt" was actually meant as hedging/softening, along the lines of: "because you have had bad experiences, it makes sense for your post to be angry and emotionally charged, and it should not be held to the same epistemic standards as a typical EA Forum post on a less personal issue." Sorry that wasn't clear.

My response was based on my impressions from several years being a woman in EA circles, which are that these issues absolutely do exist and affect an unfortunately high number of women to various extents, and also that some of what's described in this post is atypically severe. (Obviously, none of this should ever happen, to any degree of severity, and I really want to see EA get better at preventing it!) Originally, I wasn't clear on the fact that the post was written as a personal report of harm experienced, and that its descriptions of the severity were not intended as universal claims about what is typical. The author has now made a number of edits which make the scope/intent of the post much clearer, thereby obviating much of my comment.  I agree that the idea of "endorsing" someone's report of their own experience is not useful for the reasons you describe, and on further reflection I do want to be more careful in future to respond to reports of harm in ways that don't disincentivize reporting -- that is the last thing I want to do!

Yep, you are totally right about availability bias, and I don't mean to downplay your experience at all -- that's awful and I'd be delighted to see more efforts by EA groups to prevent this sort of thing.

And yeah, if you don't feel like optimizing for argumentative quality that's valid and my comment isn't worth minding in that case! Not your job to fix these issues, and thank you for taking the time to bring awareness.

[Epistemic status: I've done a lot of thinking about these issues previously; I am a female mathematician who has spent several years running mentorship/support groups for women in my academic departments and has also spent a few years in various EA circles.]

I wholeheartedly agree that EA needs to improve with respect to professional/personal life mixing, and that these fuzzy boundaries are especially bad for women. I would love to see more consciousness and effort by EA organizations toward fixing these and related issues. In particular I agree with the following:

> Not having stricter boundaries for work/sex/social in mission focused organizations brings about inefficiency and nepotism [...]. It puts EA at risk of alienating women / others due to reasons that have nothing to do with ideological differences.

However, I can't endorse the post as written, because it makes a lot of claims that I think are wrong or misleading. For example: sure, there are poly women who'd be happier being monogamous, but there are also poly men who'd be happier being monogamous, and my own subjective impression is that these are about equally common. Also, "EA/rationalism and redpill fit like yin and yang" does not characterize my experiences within the EA movement at all. I'm sure there are EAs who are creeps that subscribe to horrible beliefs about gender, but the vast majority of EAs I know are not like that at all. In a similar vein, regarding the claim "many men, often competing with each other, will persuade you to join polyamory using LessWrong style jedi mindtricks while they stand to benefit from the erosion of your boundaries" -- I completely agree that this is absolutely awful if/when it happens, but I also think it is a lot less common than this post makes it sound.

Overall, the post seems to do a mixture of pointing out legitimate problems and making angry overarching accusations that I don't think are true. I believe this post comes from a place of hurt, and I am sincerely sorry that you've had such negative experiences. I really do want the EA community to improve at this, and I want the people who've given you such bad experiences to be appropriately dealt with so that they don't harass others in future. However, I don't think this post as written will help much, because the overarching accusations are likely to turn people off from taking the rest of the post seriously.

[ETA: Wanted to add that the supportiveness and collaborative brainstorming suggested in the thread above by Megan, Keerthana, and Rockwell totally do seem helpful and productive to me, and I am excited to see this happening.]

[Second ETA: This comment was written in response to an earlier version of this post. Since then the author has made several edits which make what I've said here somewhat irrelevant.]

I'd suggest "LEA," which is almost as easy to type as EA.

Thanks so much for writing this. As someone interested in starting to do community building at a university, this was helpful to read, especially the Alice/Bob example and the concrete advice. I do really think that EA could stand to be less big on recruiting HEAs. I think there are tons of people who are interested in EA principles but aren't about to make a career switch, and it's important for those people to feel welcome and like they belong in the community.

I was going to write "I kind of wish this post (or a more concise version) were required reading for community builders," and then I thought better of it and took action instead -- namely, I sent the link as feedback to the EA Student Group Handbook and made an argument that they should incorporate something like this into their guide for student groups.
