Vilfredo's Ghost

461 · Joined Dec 2018



I think this is something that mostly needs to be left up to individual organizations, and the media's framing of "EA has a sexual harassment problem" is really misleading. It should be "Organizations X and Y have a sexual harassment problem"; if people didn't want to name specific orgs then it never should've been published, and if people are going to try to tar others who were uninvolved that should be treated as the dishonest garbage it is. The media coverage and the community debate on this have been like if someone said "Democrats have a sexual harassment problem" and tried to paint Obama as a rapist based on what Clinton did. 

Certainly employers do have an interest in their employees' romantic relationships in the examples you cite, and have a right to limit them. But I don't think you can make a blanket rule that works community-wide; informal power is often more important than formal power, especially in a small community, and if you start limiting relationships wherever there are even informal power dynamics, you get either infinite complexity or a total ban on intra-community relationships, neither of which is healthy. Individual employers should make their own decisions about HR policies, and people can make their own decisions about how much protection they want.


Now, on an individual level, I think a lot of people should be thinking more about how their relationships/hookups limit their ability to do the most good they could do, and should take a hard look at whether being able to sleep with whoever they want is really worth the losses it may cause in their effectiveness. This is true for all the reasons you cited that an employer may have an interest, but ALSO because public perceptions of them/the community may matter. For long-term relationships it goes even beyond that, because you need to think about the sacrifices people sometimes have to make for their SO. Who is supposed to take the career hit if one of you gets a great job offer far away and the other doesn't have anything comparable to or better than their current job available there? For an EA dating a non-EA, the solution is that you insist your career take precedence and do everything in your power to make it up to them somehow. But for an EA dating another EA who is approximately their equal in ability and dedication (and presumably you're dating your equal...), you've created a dilemma that you could have avoided with different relationship choices.


Side note: "Hookups within a military unit" is an interesting example, because those are mostly permitted, and not just in ancient Greece. At least when I was in service, the rule was no sleeping with anyone in your direct chain of command, and no officers sleeping with enlisted even outside the chain of command. Now, maybe this is a bad idea; the military does have a sexual assault problem, and perhaps you'd reduce that by saying no one in the same platoon/company/whatever can sleep together, period. But that's not the established rule.

I think you're just playing into a broader cultural problem here. Too many younger EAs are too invested in getting a job at an EA organization, and/or in having the movement as part of their identity (as distinct from the underlying ideal). If you think the movement has serious flaws that make it a poor means for doing the most good, then you shouldn't be trying to work for an EA org in the first place, and access to those opportunities is irrelevant.


People should not be using the movement for career advancement independent of the goal of doing the most good they can with their careers (and in most cases they couldn't even if they tried, because EA org jobs that are high-status within the movement are not similarly high-status outside it).


I find the EA movement a useful source of ideas and a useful place to find potential collaborators for some of my projects, but I have no interest in working for an EA org because that's not where I expect I'd have the biggest impact. I think the movement as a whole would be more successful, and a lot of younger EAs would be a lot happier, if they approached the movement with this level of detachment. 

I asked for clarification the first time around, in addition to providing copious information about my involvement. There is no further information to provide. At this point they should accept or reject, not ask for further edits. Yes, I'm sure it's burdensome for the reviewing team, but they are creating extra work for themselves by not just making a decision; that's a burden created by their poor work process, not by the task itself.

"Awkward" is pretty mild as far as ways to be emotionally stupid go. If that's all you're running into, then EAs probably have higher-than-average emotional intelligence, though perhaps not as high in relative terms as their more classically defined intelligence.

Seems unlikely for these examples. It's not the scientific discovery that really matters; it's the public health program implementing it, which is a lot more sensitive to pre-existing conditions than discovering a fact about the world is. 

Why not? Smallpox might or might not have died out, but hookworm would still be around.

I think this response is fully accounted for by adjusting editing time based on the importance of the work, as stated in the post. 

If it's only about as important as your normal daily work, and you have to do 5 drafts to make it better than existing work on the topic, it's probably not something you should write at all. Do something that will make a unique contribution on the first draft.

Oh yeah, lots of opportunities in NJ right now. I won my first two bets, but I'm limited by the fact that I didn't plan in advance and didn't have PayPal connected to my bank. My bank isn't allowing me to put in enough money, and PayPal will take several days to get connected. So FYI for anyone trying this: make sure your PayPal account is funded in advance.

I came to the basic idea of EA, long before I found the movement, from a Christian perspective. So I think there's certainly the basis for it in a lot of religions. But I think at that point I was more devout than most Christians, even most of those who go to church every Sunday. This is probably a key factor.

I'm not sure how seriously most people take any of their goals, even the selfish ones. Lack of commitment is a hell of a thing, even more so when mental effort and uncertainty are required. It kind of astounds me how often people say they want something and then don't follow through on even minimal efforts. A friend wanted a job in my field, so I introduced him to a connection in his area; he never met with her. Other friends have run for office, but then not bothered talking to any voters. A relative repeats the same financial mistakes over and over again despite my attempts to help her with financial planning, and despite her swearing up and down each time that next time will be different.

And all of these personal goals are a lot more straightforward to sort out than "how do I do the most good I can do?" I could figure out a plan for any of these examples in an afternoon at most, yet after years of effort I still don't know how to be a maximally effective altruist. Most people, when they can't round uncertainty off to "yes" or "no", seem to conclude that since it's uncertain, all actions are equally good. I recently had a conversation with an acquaintance who accused me of "only thinking in black and white" because I believe with a high degree of confidence that donating to AMF is a better choice than randomly paying for groceries for the person behind you in line, "because maybe they need it and maybe the kindness will ripple through the world and have other effects". And several other people witnessing this debate agreed with him!


So in addition to altruism, I think the key personality traits necessary for someone to be even an alt-EA are an abnormally high level of goal-commitment and an unusually high level of comfort making decisions under uncertainty.
