EA has copped a lot of media criticism lately. Some of it (especially the stuff more directly associated with FTX) is well-deserved. Some other loud critics seem motivated by personal vendettas, or seem to fundamentally object to the movement's core aims and values; but rather than tackling those head-on, they seem to be throwing everything at the wall to see what sticks, no matter how flimsy.

None of that excuses dismissal of the concerning patterns of abuse you've raised, but I think it explains some of the defensiveness around here right now.

It sounds like you want to engage constructively to reduce abuse in the community, and I appreciate that. The community will be stronger in the long run if it can be a safer and more welcoming space.

I know we're a bunch of weirdos with a very specific set of subcultural tics, but I hope everyone appreciates your efforts to help. I think people here really are unusually motivated to do good and there is a lot of goodwill as a result. On the other hand, I think a lot of that is ego driven. And it's a very nerdy culture, male-dominated and probably many people here have a predictable set of blind spots as a result.

Wish I had more to say, or could do more to help, but I'm not in the bay area, don't work in tech, and don't have very much context for the cultural problems you're encountering.

Thank you for your explanation. I appreciate you taking the time to explain your reasoning on that point and find it useful for being confident in the rest of what you have to say here.

I don't mean this as a comment on the particular case reported in the TIME article, though I'd reject using naive base rate calculations as the last word on anyone's probabilistic guilt. But the claim that "only 2-3% of allegations are false" stuck out to me, because I've read that a better estimate is probably more like 2-10%. There's a lot of ambiguity here; for instance, not every "report" is an "allegation," because some reports don't name a perpetrator. I have no idea what the correct figure is, but it seems to me the 2-3% figure gets bandied around with a sense of precision and finality that isn't warranted by the evidence. I'm happy to see new evidence or information to the contrary, and whether the rate is 2% or 10%, it can certainly be described as "low."
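To make the point about base rates concrete: a population-level false-report rate is only a prior, and case-specific evidence can move an individual estimate far from it. Here is a minimal sketch of that Bayesian update; the numbers and the likelihood ratio are purely hypothetical, chosen only to illustrate the arithmetic, not to describe any real case.

```python
def posterior_false(base_rate_false, likelihood_ratio):
    """P(false | evidence), given a prior base rate of false reports and a
    likelihood ratio LR = P(evidence | false) / P(evidence | true)."""
    prior_odds = base_rate_false / (1 - base_rate_false)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# With a hypothetical 5% base rate and no case-specific evidence (LR = 1),
# the posterior just equals the base rate:
print(round(posterior_false(0.05, 1.0), 3))   # 0.05

# Hypothetical evidence ten times likelier under a false report moves the
# posterior well above the base rate:
print(round(posterior_false(0.05, 10.0), 3))  # 0.345
```

Whether the prior is 2% or 10%, the posterior in any given case depends on the evidence in that case, which is why a base rate alone can't be the last word.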

The article is nearly a decade old, and for all I know there might be newer research. But I hope that when people think about best practices for the future, they do so on the basis of the best evidence available.

I agree with you that mixing romantic relationships with professional ones occurs among people who are monogamous or don't identify as polyamorous.

I personally wouldn't like to see EAs discouraged from being polyamorous. I'm not actively polyamorous myself, but I wouldn't want to see people restricted to more traditional romantic styles, like monogamous marriage, because I think many of those relationship styles developed in a very different social and technological context than the context we have now. Our culture at large probably benefits from people pioneering and exploring relationship styles that are more suited to our current sociotechnological context. In addition, I think it would be somewhat of a human rights issue for, say, employers in the movement to be telling people how to order their romantic lives.

That said, what I meant to say is that if your romantic life involves more people, as it can in polyamory (that is the aim for many, perhaps!), you'll have a larger and more complex web of romantic connections. If those people are also people you have professional connections with, there is the potential for your professional and romantic webs to overlap. This can also happen to people in monogamous relationships, but to the extent they are romantically involved with fewer people, and their romantic networks are smaller, there's less potential for their professional and romantic lives to overlap.

And while I think an interaction of professional and romantic lives isn't inherently wrong, it facilitates conflicts of interest and, even more importantly, power dynamics that can facilitate abuse or coercion.

I don't want to dismiss deep platonic friendships, but perhaps I'd ask that we agree to disagree on their relevance, because almost by definition they will not involve physical or sexual abuse, so those particular consequences of coercive power dynamics can't arise within them.

I think polyamory as it is described in the articles mixes complex webs of personal relationships with professional ones. Romantic connections within polyamorous communities can be complicated even without entanglement with professional concerns. When you bring in layers of professional connections on top of that, I can see why there might be an extra dimension of vulnerability to coercion and exploitation.

A simple back-casting or systems-mapping exercise (foresight/systems-theoretical techniques) would easily have revealed EA’s significant exposure and vulnerability (disaster risk concepts) to a potential FTX crash. The overall level of x-risk is presumably tied to how much research it gets, and the FTX crash clearly reduced the amount of research that will get done on x-risk any time soon. 

This is not the first time I've heard this sentiment and I don't really understand it. If SBF had planned more carefully, if he'd been less risk-neutral, things could have been better. But it sounds like you think other people in EA should have somehow reduced EA's exposure to FTX. In hindsight, that would have been good, for normative deontological reasons, but I don't see how it would have preserved the amount of x-risk research EA can do. If EA didn't get FTX money, it would simply have had no FTX money ever, instead of having FTX money for a very short time.

I was inspired by your post, and I wrote a post about one way I think grant-making could be less centralized and draw more on expertise. One commenter told me grant-making already makes use of more expert peer reviewers than I thought, but it sounds like there is much more room to move in that direction if grant-makers decide it is helpful. 

At a pinch, I would say review might be more worthwhile for topics where the work builds on a well-developed but pre-existing body of research. So, funding a graduate to take time to learn about AI Safety full-time as a bridge to developing a project probably wouldn't benefit from a review, but an application to develop a very specific project based on a specific idea probably would.

I don't have a sense of how often five-to-low-six-figure grants involve very specific ideas. If you told me they usually don't, I would definitely update against thinking peer review would be useful in those circumstances.

Thank you, David! From what you've said here it seems clear my post was missing critical information.

I'm not sure this post literally could have been much better researched, conditional on me writing it. I don't feel entitled to contact funders to ask them about their process (perhaps I should feel free to? I'm not sure). The EA Funds website briefly mentions that they "engage expert-led teams of subject matter experts" in their decision-making, and that's something I should have researched first and mentioned. But that gives away so little information that I learned more from your reply here than I would have from reading it.

Perhaps other funders describe their processes in more detail; I don't know, and if so, I concede that's something I could have identified before writing the post.

So the only other way I can see that this post could have had more information is that I could have asked around more widely among people more familiar with the process than I am. But I'm not personally acquainted with anyone I knew would know more about the process.

Or, finally, I could have left it to someone else to write, but then, if they didn't, I wouldn't have learned from you that grant-makers already engage in expert consultation.

It probably is obvious to you, considering your experience. But the way grant-making was described in the Doing Good Better post last week (something to the effect of "it helps to move to the Bay and make friends with grant-makers") suggests the process is pretty opaque to a lot of other people, not just to me. So I suppose I'm glad I opened a conversation, even if I don't have much insight to share on the process.

Edit: There is a point I'm trying to make beyond defending my own process, which is that the process in general is fairly opaque, and if the information you're talking about is publicly available, I'm not aware of it! That validates something of the transparency critiques from the DGB post last week.
