
Sarah Reed

134 karma · Joined May 2022

Comments (4)

Thanks for the lead! The post you linked seems perfectly suited to me. I'll also contact Ben Snodin to inquire about what he may be working on around this matter.

"There are numerous ideas, opportunities, methods, that are going un-noticed because of the barriers placed in front of thoughtful dialogue. It is a burden that should rest upon those EAs who are dismissive of deeper conversation, instead of being the "price I have to pay, to prove myself, before anyone will listen", as I was most recently told on this Forum."

Your last paragraph is exactly what I'm worried about when considering engaging with EA, and exactly why I bring up "signalling" and "posturing" in my own post. I worry about the maturity of the community, and about how serious EA is about actually getting things done as opposed to congratulating itself on its enlightened approach. I think most seasoned professionals don't have the patience for this kind of dynamic. However, I've yet to determine for myself the extent to which this dynamic actually exists in the community.

Hi, thank you for starting this conversation! I am an EA outsider, so I hope my anecdata is relevant to the topic. (This is my first post on the forums.) I found my way to this post during an EA rabbit hole after signing up for the "Intro to EA" Virtual Program.

To provide some context, I heard about EA a few years ago from my significant other. I was/am very receptive to EA principles and spent several weeks browsing through various EA resources/material after we first met. However, EA remained in my periphery for around three years until I committed to giving EA a fair shake several weeks ago. This is why I decided to sign up for the VP.

I'm mid-career instead of enrolled in university, so my perspective is not wholly within the scope of the original post. However, I like to think that I have many qualities the EA community would like to attract:

  • I (dramatically) changed careers to pursue a role with a more significant positive impact and continue to explore how I can apply myself to do the "most good".
  • I'm well-educated (1 bachelor's degree & 2 master's degrees)
  • As a scientist for many years, I value evidence-based decision-making and rationality, both professionally and personally.
  • I have professional experience managing multiple projects with large budgets and diverse stakeholders. This requires three of the six skills listed among the top talent gaps identified at your Leadership Forum (as mentioned in the original post).
  • As a data scientist, I have practical & technical expertise in machine learning (related to the last talent gap in the list mentioned above).
  • I'm open-minded. (No apparent objective evidence comes to mind. I suppose you'll have to talk to me and verify for yourselves. :-) )

If we agree that EA would prefer to attract rather than "turn off" people with these qualities, then the following introspections regarding my resistance to participating in the movement may be helpful:

  • The heavy, heavy focus on university recruiting feels ... off.
    • First, let me emphasise that I understand all the practical reasons for focusing on student outreach. @Chris Long does a great job listing why this is an actionable, sensible strategy in this thread. I understand and sympathise with EA's motivations. The following points are written from the perspective of an "EA outsider" and of others who may not care enough to consider the matter any more deeply than their initial impression.
    • Personally, 'cult' didn't immediately come to mind despite being a common criticism many of you encountered. Still, the aggressive focus on recruiting (primarily young) university students can seem a bit predatory. When there is a perceived imbalance in recruitment tactics, red flags can instinctively pop up in the back of people's minds.
  • The EA community seems homogeneous - and not just demographically.
    • The homogeneity is a natural consequence of the heavy focus on university outreach. Whenever I encounter EAs, I'm generally the oldest ... and I'm only in my 30s! (Is there a place in this community if you're not fresh out of uni?) The youthful skew of the community contributes to an impression that there is a small group of influential figures dictating the vision/strategy of the movement and a mass of idealistic, young recruits eager to execute on it. People who get things done want to find other people who get things done. It's not reassuring if it feels like the movement is filled with young (albeit talented & intelligent) people who can barely be trusted with leading student groups (requiring scripts, strict messaging, etc.).
    • Since the aggressive university outreach focuses on prestigious institutions, the group can seem elitist. Again, I understand the cold realities of this world mean that there are practical considerations for supporting this approach. As an outsider, it isn't easy to discern if the pervasive mentions of top institutions are for practicality or for signalling. I also understand the importance of epistemic alignment. However, when the EA Global application requirement (as an example) is juxtaposed alongside aggressive recruitment at top universities, it starts to seem like EA is looking for "the right kind of people" to join their club in a less benign sense. Admittedly, I have a giant chip on my shoulder from my upbringing on the wrong side of the socio-economic tracks. Even with that self-awareness (and a Berkeley degree), some of my hesitancy to engage is the concern that my value to the community would not be judged mainly on the merit of my contributions but rather on my academic pedigree. I value my time and energy too much to play those games.
  • Breaking into the hive mind
    • EAs seem uniformly well-informed and studied on a core body of seminal studies, books, websites, and influential figures. Objectively, it's a credit to your community that there is such high engagement and consistency in your messaging. To an outsider, it feels like a steep learning curve before being considered a "real EA". (Is there an admission exam or something? Do I need to recite Peter Singer from memory? :-) ) This is more of a compliment than anything. Maybe just be mindful of what you're trying to achieve when you name-drop, cite a study, or reference philosophy terminology in conversation. Is the motivation in doing so to communicate clearly or to posture? At best, EA newbies may feel intimidated. At worst, they/we may get defensive.
    • To a natural sceptic and critical thinker, the uniformity also feels a little like indoctrination. What are the areas of active constructive disagreement? Does the community accept (or even encourage) dissenting (but well-reasoned) opinions? What are the different positions? What are the caveats of the seminal studies? It's not apparent on the surface, and free-thinkers are generally repelled at the notion of conformity for the sake of belonging. (In the "Intro to EA" Virtual Program syllabus, I noticed that there is attention to EA critiques. I'm looking forward to experiencing how that conversation is facilitated.)
  • Does EA care about anything other than AI safety nowadays?
    • I read about all these significant EA initiatives tackling malaria, global poverty, factory farming, etc., during my first exploration of the movement a few years ago. But nowadays, it seems that all I hear about is AI safety. Considering how challenging it is to forecast existential risk, are you really so confident that this one cause is the most impactful, most neglected, and most tractable that it warrants overshadowing all the others? I agree that AI safety is an important cause that deserves attention. However, the fervour around it seems awfully reminiscent of the "Peak of Inflated Expectations" on the Gartner Hype Cycle. It's not that I have anything against AI safety in particular; the impression of "hype" itself is simply not a great look if someone is seeking a community of critical thinkers to engage with. Combined with the homogeneity of the community, it makes me suspicious of "group think".

I want to explicitly state that I know not all of these impressions are entirely true. I know that EAs aren't all out-of-touch, pretentious jerks. The 80,000 Hours job board has several postings across many cause areas aside from AI safety. The impressions described above are primarily from my perspective before actively trying to vet my concerns. However, I imagine that others who share these impressions don't bother to validate them before dismissing the movement.

So why did I go through the trouble of digging deeper? Well, probably because EA is the closest I've found to a community consistent with my own values, motivations, and interests. Despite my reservations, I really want my concerns to be wrong and for EA to work. More importantly, I've grown to trust the values, motivations, judgement, and competency of my significant other, who is committed to EA's mission. Through him, I've met other EAs who are also great people. Quality people tend to attract other quality people. For this reason, @Theo Hawking's imperative to pause and reflect on a) what EA considers a quality conversion and b) whether current EA practices are attracting or repelling quality conversions is a worthy exercise.

On a final note, I suspect the free books and the 10% tithe to charity that people cite to explain their "cult" label of EA are merely convenient justifications that don't address the core of their impression. After all, why would they bother investing effort to pinpoint and articulate the sources of their general negative feeling about the movement if they're already disengaged? I suspect that the "cult" feeling has more to do with the homogeneity and "group think" concerns I described above. To combat these negative impressions, I'd recommend:

  1. Diversify your recruitment tactics. I particularly liked the suggestion about recruiting around specific cause areas mentioned by @Jamie Bernardi. I suspect this will also help with your talent gaps. Representation at adjacent conferences/events would also be a channel to reach established professionals. As I was exploring how I might do the most good before I heard of EA, I attended many events like the Data for Good Exchange 2019 (bloomberg.com) and would have been very receptive to hearing about EA there.
  2. Emphasise the projects and the work. @Charles He hit the nail on the head. I would go even further than just aiming to have the best leaders in cause areas. Are EA orgs/work generally respected and well-regarded by other players in the cause area? In other words, does EA "play well with others", or are you primarily operating in your own bubble? Suppose EA is objectively and demonstrably doing great work. In that case, other major players should be open to adopting similar practices and further magnifying the impact. If that's not happening, does EA have the self-awareness to understand why and act upon it?
  3. In conversations with outsiders, favour tangible issues/outcomes and actionable ideas instead of thought experiments. (My perspective skews to the practical, so feel free to discount my emphasis on this point depending on your role in EA.) If the aim is to get more people excited about doing the most good, then describe the success of the Against Malaria Foundation or the scale of impact specific government policies may have rather than using the trolley problem to discuss utilitarianism. Yes, thought experiments are both fun and valuable, but there is a time and a place.
  4. Be accepting of varying styles of communication around ideas and issues. Not everyone interested in cause areas or in doing "the most good" will be fluent in philosophy or psychology. If we can communicate concepts, thoughts, or ideas reasonably and productively, it's often unnecessary to derail the conversation onto a pedantic tangent. And don't treat me like an unenlightened pleb if you have to explain why the researcher you name-dropped mid-conversation, whom I hadn't heard of, is relevant. (This is somewhat tongue-in-cheek. :-) )

I hope my diatribe will be received constructively, because I am invested in seeing EA succeed regardless of whether I consider myself an EA at the moment. Anecdata is not rigorous, so who knows how generalisable my data point is. However, upon reading this thread, I realised that my complicated disposition towards EA is not uncommon, so I decided to share my viewpoint. For whatever that's worth. :-)