sabrinac

Brown EA
427 · Providence, RI, USA · Joined Jul 2020

Bio

Brown EA President

Comments (24)

I agree that it would be better if EA had a more balanced age distribution. Anecdotally, I don't think we have enough mentorship opportunities and mid-career professionals to support the recent influx of young people into EA. I think it would also create a better epistemic culture if the median person in EA had deeper expertise in a field/skill-set, as opposed to having lots of young people narrowly following 80k's advice.

I don't buy the claim, though, that we shouldn't focus on student outreach. I think the debate over student outreach is much more about how it's done rather than whether it should be done.

I agree with you too that most students shouldn't necessarily be organizing their group. FWIW at Brown, pretty much everyone on our leadership team is prioritizing skilling up (through part-time research positions, classes, internships, etc.) alongside doing ~5 hours a week of group organizing. 

The age distribution problem could also self-correct over the next 10 years? I think we should mostly be concerned if the fraction of young people in EA continues to grow faster than the fraction of older age groups. I also think we've been grabbing a lot of the low-hanging fruit at top universities, and this will start to stabilize in the next few years (i.e. there's a finite number of people who would be really into EA at top universities). I could be wrong, though, if we continue to have exponential growth of new uni groups.

Thanks Evie :) The quote you linked also really resonates with me. 

Thanks for offering this! Unfortunately, I think it's hard to make blanket recommendations about which roles people should apply for. A lot of career advice has to be given on an individual basis, and I'm wary of recommending that people apply to another category of jobs without having strong reasons why they're applying to those jobs. 

Hi, thanks for this! Yeah, to clarify, when I say "ops bottlenecks" I was mostly referring to last-minute logistical problems that come up when people run events or programs. (Not other things that fall under the category of "operations" like creating efficient systems within an organization.) I agree that the latter ops bottlenecks are hard to solve! 

Hey! Thanks for writing this—similar to other comments, I wanted to highlight that there's a lot of work going on in this space and updates will be published soon. After EAG SF, several EA comms professionals are meeting to coordinate and figure out which comms projects/organizations should get off the ground. We hope to publish a long list of project ideas soon. 

Here's a list of existing efforts I know of to counter this problem: 

  • The EA Communications Fellowship is helping early-career EAs skill up and prepare for roles in comms. We're hoping to incubate a couple of projects after this summer.
  • A new EA marketing agency is getting off the ground
  • Upcoming publicity for WWOTF (in major publications, podcasts, etc.) should increase positive coverage of longtermism
  • Various orgs are conducting message-testing and polling on public perceptions of EA / which framings are most compelling
  • Members of the EA community are working with a PR firm on EA's comms strategy 
  • There are new publications (e.g. Asterisk) starting  
  • Comms directors at EA orgs are cultivating relationships with journalists/other public figures 
Overall, it's worth noting that key people are definitely aware of this problem and actively working on it. Most of these efforts started in the past 6 months and therefore aren't very visible yet in the broader community.

I really love these books. Most of them aren't hardcore EA-ish and tend to cover fringe ideas (like what would happen if AI became sentient), but there are a few stories/characters in each book that might be relevant. Best of luck with the talk! :)

  • The Paper Menagerie by Ken Liu
    • collection of sci-fi short stories 
  • The Hidden Girl and Other Stories by Ken Liu
    • collection of sci-fi short stories
    • There's one story woven throughout the book about people uploading their consciousness to a digital world (and how different this world would look).
  • Exhalation by Ted Chiang
    • collection of sci-fi short stories
    • "The Lifecycle of Software Objects" explores what it would be like if we created sentient, digital beings. I think it drives home the idea of expanding your moral circle. 
  • Stories of Your Life and Others by Ted Chiang
    • collection of sci-fi short stories
  • Station Eleven by Emily St. John Mandel 
    • The book takes place in a post-pandemic world where >99% of humanity died and describes how we could rediscover meaning/beauty in such a world. 
  • I'm also sure you've gotten this rec a bunch of times, but The Three-Body Problem + sequels.

Hi Luke!

Thanks so much for your thoughtful response. Socioeconomic inclusion doesn't get enough attention in EA, and I'd hate for attempts to mitigate rent-seeking behavior to turn into raising barriers for low-income community members.

I think the questions of how to mitigate rent-seeking behavior and how to make EA more socioeconomically inclusive can largely be decoupled, though. I agree with @levin that it's easy for group organizers to identify whether people are motivated in good faith or bad faith when seeking funding to attend a conference. I also don't think that attempts to identify people acting in bad faith necessarily lead to socioeconomically excluding people. Rather, there aren't enough clear resources and opportunities for low-income community members, and most of the community's resources are targeted at affluent, elite students and universities. To me, these seem like much larger barriers to inclusion than the dialogue on funding in EA.

While the perceived risk of rent-seeking behavior may be inflated relative to the actual risk, I think this is difficult to assess. First of all, I think that the most serious incidents of rent-seeking behavior won't be public knowledge in EA. Instead, the community health team and other individuals will likely deal with these incidents privately. Second of all, I've heard of a couple of incidents that really concerned me regarding how some individuals accessed large sums of money for self-motivated and manipulative reasons. EA orgs and individuals definitely have safeguards in place, but I think the high-trust nature of the community at times allows people to access a lot of money with minimal oversight. I'd personally defer to the community health team and grantmakers in assessing the scope of the risk.

I also think there are good second-order reasons to want to prevent rent-seeking behavior in EA (even if the first-order effect of funding a rent-seeking person isn't that bad). Maintaining a high-trust community makes it a lot easier to get stuff done and fund people. There are a lot of PR risks from individuals accessing money for personal gain. And I want to avoid spreading the meme of "EA will fund students' vacations" on college campuses, giving EA groups a bad reputation.

Hi, thanks for writing this! I share a number of your concerns, especially around the use of "HEAs," copy-paste community-building efforts leading EA to become more intellectually homogeneous, and EA group organizers viewing people instrumentally. However, I think you underestimate the variance in community-building efforts and the degree to which many community-builders are aware of these problems and actively trying to mitigate them. I'm not sure how many university-group organizers you've engaged with, but I'd be curious whether you'd find these problems less severe than you claim after talking to more of them. Here are a few of my thoughts on specific things you wrote:

Community-building is coercive:

  • Most community-building doesn't entail reading prepared scripts or tracking people in CRMs. While I have copied intro-talk templates before, we stopped running those this semester after attendance was low, and we found talking to people casually at an intro event to be a lot more engaging. We do share fellowship materials with other groups, but most of the fellowship is oriented around fellows engaging with the material, not facilitators coercing anyone.
  • I think very few groups in practice are actually trying to explicitly identify/create HEAs. First of all, the definition of HEA is too vague to be useful when running a group. Second of all, I think a lot of community-builders (myself included) have an aversion to binarily labeling people as an HEA or non-HEA.  (To be frank, I wouldn't mind getting rid of the term altogether.)
  • I view a lot of my community-building efforts as genuinely trying to build a community. I think effective community-building often starts with meeting the person as an individual and trying to help them along their path to having an impact, rather than pushing them into certain predefined paths.

Community-builders are too new to EA: 

  • I agree this is a big issue and I'm wary of tasking people who've just heard about EA to run their student group. I think it's a lot easier for young community-builders to defer to other people's judgments instead of forming their own inside views, and it's difficult to build epistemically healthy communities without understanding the arguments yourself.
  • I'm hopeful that the increase in workshops/retreats for community-builders will help mitigate this, especially if they prioritize improving their reasoning. However, I also expect this problem to become more salient as the number of EA groups rapidly scales, coordination becomes more difficult, and support systems for groups don't scale quickly enough. 

People think EA is a cult: 

  • I honestly haven't heard this said frequently at Brown. Maybe it's due to selection effects and me not engaging with the people who think EA is culty, but most of the criticisms I've heard second-hand are that EA is too demanding, doesn't care enough about local communities, or has an overly strong world-view.  The only people I've heard call it culty are close friends of mine who aren't EAs but are confused about what community-building is / why EA sends students to free conferences.   
  • I think the degree to which people see EA as a cult likely comes from their interactions with their EA group's leadership. Some groups have a stronger "EA is everything" vibe, while other groups have more diverse interests and ethical commitments.  

EA groups have poor epistemics: 

  • This is my biggest concern. Pretty much everyone in a university EA group is between the ages of 18 and 22 with limited real-world experience, and I think the median college EA is too quick to defer to the advice of "experts" and high-status people in EA.
  • Despite my concern, I'm also pretty optimistic about mitigating this problem. A successful EA group should create a culture that encourages disagreement and debate and attracts philosophically-minded people who evaluate arguments based on their merit. There are also more proactive ways to do this, such as running red-teaming events, cause prio workshops, etc. 

One caveat: a lot of these thoughts come from my personal experience, so maybe I'm anchoring too much on my model of community-building and how I choose to run Brown EA. However, I've talked to dozens of community-builders over the past year, and the majority of them seem socially astute and cognizant of these pitfalls. That being said, I do think we can do significantly more to improve the epistemics of university groups/community-builders and avoid EA coming across as culty. FWIW, I'm planning on redesigning Brown EA's intro program over the summer to focus more on introducing people to EA reasoning/principles rather than conclusions/cause areas.

big +1!  Thanks for writing this. 

I think announcing summer internship opportunities in the fall/winter, even if applications don't open for a few months, would also be helpful so students know which EA orgs are planning to offer internships.

Thanks for flagging that—should be fixed now. 
