[This was partially inspired by some ideas of Claire Zabel's. Thanks to Jessica McCurdy, Neel Nanda, Kuhan Jeyapragasan, Rebecca Baron, Joshua Monrad, Claire Zabel, and the people who came on my Slate Star Codex roadtrip for helpful comments.]
A few months ago, some EAs and I went on a trip to the East Coast to go to a bunch of Slate Star Codex meetups. I'm going to quote that entire post here (with a couple edits):
Our goals are:
- to meet promising people at the SSC meetups and move them into the EA recruiting pipeline
- to spend some time with promising new EAs, eg those at student groups, in the hope that spending a few hours of focused one-on-one time with one of us will help get them more into EA. Like, I think 80K finds people who are excited about AI safety stuff but aren't very knowledgeable about it yet; I think that those people can maybe get a lot out of a few hours' conversation with a few people who have worked professionally on this stuff.
- to visit EAs who are "in holding" doing things like PhD or EtG tech jobs, with possible good outcomes being that they'll be fired up wrt EA and more likely to do really impactful EA stuff on a timescale of like a year, or that their improved (Bay Area/professional EA) connections make it easier for them to spot good opportunities or move into doing more impactful work.
- (less primary) to talk to hardcore EAs and swap arguments and get to know each other better
Here's why I think it's worth us talking to various promising new EAs and enthusiastic EAs who haven't worked in the EA scene full time:
- There are a lot of accumulated arguments about EA topics which I think it’s really helpful to think about but which are hard to access when you only know EAs on the internet, because those arguments haven't been written up clearly or at all, or because their writeups are hard to find and rely on background knowledge that you don't know how to acquire.
- A lot of the time, EAs present versions of arguments that are strong enough to convince you that it's worth engaging seriously with the possibility that the conclusion is true, but which have a bunch of holes in them that require substantial thinking to fill. Sometimes EAs (eg me) make the mistake of conflating these two levels of strength of argument, and act as if people should be persuaded by the initial sketch. One way I notice I'm making this mistake is by getting into arguments with people who've thought about the topic more than me. I hope that talking to more knowledgeable EAs might help some of the EAs we hang out with spot holes in their understanding and improve their epistemics.
Here is a reason that I think that having SF Bay Area EAs talk to rationalists in these cities at SSC meetups is plausibly worthwhile:
When smart people are skeptical of some of my weird beliefs, eg that AI x-risk is really important, or that they should consider working on EA stuff, or that long term we should consider radically restructuring the world to make it better for animals, a lot of the time their disagreement stems from something true about the world that the arguments they've seen didn't address. This is hard to avoid because if you try to write an argument that addresses all the potential concerns, it will be incredibly long. But this makes me think that it's often really high impact for people who have thought a lot about these arguments to talk to people who have heard of them but felt very unpersuaded.
My predictions mostly matched my impressions of what happened.
But I think you might be able to get many of these benefits more efficiently by doing something more like a residency, where you spend a relatively long time in each city, compared to doing a tour, for a few reasons:
- If you want to spend 20 hours talking to people in a city, you can do this more efficiently if you choose the best 20 hours in a ten-day period, rather than having to do 20 contiguous hours. We missed out on a bunch of good opportunities because we were in each city for less than 24 hours: if people weren't free right when we were there, we couldn't talk to them.
- Spending more time in a city means that your travel costs are amortized over more hours.
- If you only do a few hours of talking to people each day, you might be able to do it with your social time budget, and not sacrifice many hours of normal work.
Here's an example plan:
- For a month, Alice does her usual EA work from the US East Coast.
- She stays mostly with EAs who she wants to spend more time getting to know.
- She tries to get quite a lot of normal work done while she’s there.
- She tries to meet a bunch of EAs/EA-relevant-people in her evenings. She goes to meetups and parties. Maybe she tries to get people to host a few extra meetups and parties. (If she has some kind of reputation that allows her to draw a crowd, this can also benefit the local EA groups.)
- She stays mostly in Boston, but she also spends a weekend in New York and a few days at Yale or at other places with a bunch of promising EAs. (It can be nice to visit a place twice: the first time you meet people, and the second time you meet the people those people wanted you to meet, and you can see some people a second time, which I think is often pretty helpful. So she does.)
Here are the main costs:
- Logistical inconveniences
- travel time
- hopefully this isn't too bad if you can take trains (on which you can work), which you can do if you’re not in a hurry. One lesson from the SSC roadtrip is that travel is substantially more inconvenient when you have time pressure.
- organizing events and conversations with people
- Work efficiency penalties.
- You might sleep worse. You could get around this by sleeping in hotels, or being vigilant about ensuring a good sleep environment when sleeping at the houses of EAs.
- Some types of work are a lot more efficient with in-person collaboration. Maybe your job doesn’t involve that kind of work, or maybe you could arrange to not do that kind of work for a month.
- You might have a worse work environment. You could get around this if you made sure you had eg a large monitor everywhere you were going to be trying to work. Planning to work out of a WeWork or something could also be helpful for this.
- Most people don’t like being far away from home for long
Here are the properties that I think make someone a good candidate for this, other than being an EA with interesting experience or perspectives or similar:
- It seems good if you’re generally likeable, and interesting in conversation, and have a good sense of social appropriateness in unfamiliar situations. Neel Nanda, a Cambridge student who IMO has good judgement about EA outreach, said: “I have a slight apprehension that an outsider from a more established community trying to do outreach in smaller ones could come across as patronising/arrogant. I'd also be very concerned about the person coming across as weird (essentially the unfamiliar situations point)”. I think this is a crucial point.
- One obvious subtlety here is that different people vibe differently well with each other. Neel says, “I'd also expect cultural differences to be quite important and a potential source of failure modes, especially if someone has well calibrated social skills in their social context but doesn't properly account for the change in context, eg a Bay Area EA coming to the UK would probably come across as very outgoing, potentially offputtingly weird/arrogant/confident, and I think the baseline level of social confidence eg requesting 1 on 1s probably differs a lot”.
- You could try to address this concern by going with someone whose social skills complement yours.
- I think it’s also much easier to pull this off if you understand things about the local EA scene like their interests and culture, and if you know some local members reasonably well already, so that you can get rapid feedback from them about how this is going.
- I think this probably works much better if the EA in residency isn't trying to represent all of EA: they're just representing themselves, as an EA who has opinions about things, and they make it clear that they are not a spokesperson for the whole movement. If you do this, you're making less of a claim about your own legitimacy, you're freer to share your nonstandard EA opinions, and people might be less likely to conclude that all EAs share your beliefs.
- It’s good if talking to strangers is more often fun and fulfilling than stressful or tiring. You could also try to set things up so that you’re mostly talking to people in less tiring ways, for example by mostly talking to people you feel you can relax around.
- Having work that you can do fairly effectively remotely.
- Having a lot of familiarity with the EA community and EA cause areas, and a lot of enthusiasm for talking and thinking about this stuff.
- I think being interested in EA outreach is somewhat helpful, because many of the people you'll talk to most do EA outreach work themselves (eg running EA groups) and are interested in talking about it.
- Being good at quickly getting a sense of people, so that you can eg spot whether this person should be introduced to a particular person you know (being well-connected also helps for this).
I think that it might be good for people to do residencies like this. If you're interested in doing this, I'm interested in talking to you about it. I'm also interested in talking to people who live in places which they think would benefit from this type of thing.
Great idea! EA Berlin (Germany) is happy to welcome traveling EAs and organize meetups and/or residencies.
I coordinate the EA Berlin chapter. We have a fairly active EA community with ~30 regulars, ~200 people who have joined at least one event, and ~500 followers (both students and professionals), plus an active Rationality community. We have regular events (weekly meetups, monthly speaker events, socials, and retreats), and some 10 people are also studying or working on AI-related subjects and interested in AI safety.
However, one thing we're somewhat lacking is personal contact with EA orgs. Most of the most engaged EAs moved to the UK/US to work at an EA org, and while we have a few people in Berlin working at EA orgs remotely, we could definitely benefit from more (personal) contact with EA orgs, and they might find interesting people here as well. I'd be happy to welcome traveling EAs and (if they wish) organize meetups, connect them to others for 1-1's, give travel advice and organize a residency (arrange couchsurfing or give hostel/hotel recommendations).
Buck, not sure how relevant Berlin is as it's not the US, but if you want to chat nevertheless, feel free to email me! manuel.allgaier@ea-berlin.org
Related idea: Make it a habit to reach out to local EA community when traveling?
Lots of EAs already travel quite a bit for work (e.g. visiting conferences, retreats, etc.) or on vacation. As the EA Berlin group organizer, I'm happy to organize a meetup with the EA Berlin community whenever someone interesting visits, but I often don't hear about visits unless the person is already in my network or proactively reaches out via the EA Berlin website or Facebook, which some do but some don't.
If every EA who's generally open to meeting people makes it a habit to reach out to the local community when traveling (e.g. by looking them up on the map on the EA Hub (www.eahub.org)), then local group organizers could organize a meetup or connect them to specific people for 1-1s, giving them a low-effort way to meet interesting people. This would also mitigate the risk of coming across as patronizing, arrogant, or weird.
This could work even if there's no organized group in a city, as long as at least one of the local EAs filled out their profile on the EA Hub.
General Positive Notes:
I think building relationships between EA professionals and groups is highly valuable and think that programs such as residencies could be really beneficial.
As someone who had not met many EA professionals (outside of community builders) until fairly recently, I can at least attest to how beneficial it was for me. I was able to have deep discussions on EA issues with people who knew more about EA than anyone I had met before. This led to me changing some of my views and generally getting a better understanding of where EA stands on things. (My answer: there's a lot of disagreement.)
I learned a lot and now have a much better network of EA connections, whose importance I've been increasingly coming to appreciate.
There is a possibility that my experience was especially beneficial because I met many professionals at once (through a MIRI workshop, a Community Building Grantee Retreat, and EAG London, all within a span of 3 weeks). I think this was particularly beneficial, and I encourage programs that allow this type of interaction between EA professionals and group leaders in the future.
However, I still see the prospect of a residency program as very promising. Many of my group members (Yale EA) will not be able to attend EAG (especially with EAG accepting fewer undergraduates) and have very few opportunities to meet EA professionals. (EAGx is a possible venue for this, but there are far fewer EA professionals at EAGx than at EAG.)
In my one-on-ones and our community surveys, we have heard one piece of feedback again and again: the desire to hear from people who have careers in EA and more experience than us.
As for things like the SSC trip: the group was only able to visit for a very short time (most had to leave after about an hour, I believe). There were a number of members who I think could have really benefited from meeting them but couldn't, because they were unavailable at that time. So I think the point about having a larger window to meet people is a very good one.
General Worries:
I really agree with Neel's concerns in this write-up. This is particularly salient for my group, as many members did not come from a rationalist background and some have had poor experiences when meeting Bay Area rationalist EAs. I think the right person could avoid these problems by communicating with group leaders beforehand, as mentioned in this post. I still worry, though, that it is just generally difficult to change aspects of one's personality in different contexts.
I also strongly second the point that this person should not act as a representative of the EA community as a whole. I think this would need to be very clearly communicated. I can imagine situations in which group members who had never met an EA professional before immediately take what this particular one says as a testament to what EA as a whole thinks. I also worry that if people had a particularly negative interaction, they might generalize that to EA as a whole as well. I think this person would have to maintain awareness that this might be happening throughout the program and actively seek ways to reduce the possibility.
I also think there are other ways to increase interaction between EA professionals and group members but am not sure what exactly those would look like. We have had some good interactions with guest speakers in our dinners following talks but these are often larger groups than optimal. It would be great to have more EA professionals at EAGxs and have career fairs similar to those at EAG. (This might happen in some - Fair warning that I am basing all of my knowledge of EAGxs on EAGx Boston)
Overall, I am generally in favor of something like this residency program happening and think it would be great if Yale EA got to be a stop. I think that if the right person is picked and is aware of these concerns, this could definitely be very positive.
I have low confidence in this, but I'm pretty excited about this idea! I've had many more conversations with people super into EA over the last few months, and this has definitely had a major impact on me, especially with regards to getting a better understanding of the ideas and making things concrete: going from "this is some weird abstract stuff" to "these are ideas that some super awesome and smart people believe, and that I could realistically apply in my life or build my career around".
I'm somewhat biased, because I personally much prefer talking to people over eg reading things. I think a large part is just really liking the people and finding them interesting. I also got a lot of this value from going to parties and being in an EA social environment, which this wouldn't directly generalise to, but I conjecture that someone explicitly trying to create a good environment for this could do much better?
I'm wondering how much of the value of this could be captured by just having calls with people interested in EA but not at EA Hubs? This seems like it cuts out a lot of the logistical hassle of a residency, though at the cost of not being able to go to meetups, and losing out on the in-person interaction. I think it could capture much of the value of talking to someone highly into EA though.
This sounds good, but really hard to pull off well. I personally found that "highly dedicated EAs who have spent a lot of time thinking about this sometimes disagree on important points" only really felt visceral to me after having several IRL conversations with smart people who held different viewpoints. And after only talking to one person, it's easy for their view and justifications to dominate, especially if they've thought about it a lot more than I have. Even if they give frequent caveats of "this is just my opinion", I don't think that feels visceral in the same way as talking to somebody really smart.
Suggested patches:
Some further thoughts from previous discussions with Buck:
For 1 on 1 chats with people super into EA (I've had a reasonable amount of experience on both sides of this), I think one big failure mode is not being sure what to talk about. Eg, if I'm talking to somebody who actively researches an area that interests me, there are obviously lots of things they know a lot about that I'd find interesting, but I struggle to come up with good questions to access them. I also expect this to be exacerbated if you're having many conversations with people already somewhat engaged with EA, since you first need to figure out their prior level of context and knowledge. This seems a difficult problem to solve; a few ideas:
(Being on either side of these conversations and not knowing what to talk about is a problem I frequently run into, so I'd love to hear anyone's suggestions for helping with this generally!)
Another potential failure mode: I'd guess there are a lot of people who might really benefit from a 1 on 1 but who might feel socially awkward expressing interest or trying to arrange one, eg out of concern about taking up the person's time, feeling they're not impressive enough, or general social anxiety/aversion to meeting a stranger 1 on 1. An immediate thought on how to partially resolve this is asking local group organisers for introductions, as a friendlier point of contact. I think it'd also help to put a lot of thought into how to market this, for example whether people need to consider themselves high-achievers/high-potential. I think younger EAs systematically underestimate how much more experienced ones want to talk to them (at least in contexts like this, reaching out to people at EAG, etc).
The situation of "a conversation with somebody you'll probably never see again" is weird, and the way to maximise impact probably differs from how I'd normally approach a conversation, since much of the value will come from things they do on their own afterwards without (much) further prompting. The best levers to pull are probably: suggesting options they wouldn't have considered, eg career paths, or more generally challenging the narrative they're framing their life with (though this seems high-variance); connecting them with useful people to speak to; Buck's point about understanding their view of core EA arguments and addressing objections; and pointing them towards good resources they wouldn't otherwise have found.
Here are some of my thoughts on EA residencies/moving people into the full-time EA recruiting pipeline that I shared with Buck:
Bottlenecks
From what I've seen (based on 2 years running Stanford EA and a few conversations with non-student EAs and community group leaders), the primary bottlenecks preventing people who are already interested in EA from doing high-impact EA work full-time are, in no particular order:
1. Full-time EA work, and the transition it requires, feels too costly (in terms of time, money, moving, social costs, preserving optionality, sunk cost fallacy, mental/physical energy, etc.) compared to the path of least resistance
2. Not having (or thinking they don’t have) the right skillset for high-impact EA work (specifically the paths that 80K recommends)
3. Lack of belief that they can be (really) impactful (not trying because "why bother"), or high levels of uncertainty about whether investing the time to pursue EA work full-time will pan out. (This was really huge for me: once I updated towards thinking I could be impactful if I just tried really hard, a lot of the other bottlenecks somewhat solved themselves.)
4. Different models of the world (e.g. different credences for person affecting views, different cause prioritization, differing views on the importance of earning to give, relative impact of working on different cause areas, etc.)
5. Lack of clarity on how to progress/next steps
6. System 1 misalignment (Wanting to want to do impactful things in theory but in practice preferring other things for various reasons, some of which are listed above).
Prioritizing the Bottlenecks to address:
The time needed to successfully address each of these can be drastically different (at least this was the case for me, and I still grapple with a few of the above bottlenecks). Ability to move the needle on each of these probably varies a lot by person and the relationship between the EA resident and whomever they’re speaking to.
For example, it might be hard for a stranger (the person doing the EA residency) to convince an interested EA that their specific skillset is actually valuable for EA, but it might be easier to clear up misunderstandings that lead to different worldviews (though even here, people's willingness to meaningfully update might depend a lot on their relationship with the person). On the skillset front, maybe listing which skills are needed for several kinds of important jobs, and which of these can be learned/developed with practice, could give people a better sense of what might suit them, since it's probably hard to learn enough about someone's skillset from limited interactions to give good personal advice.
This being said, I think that if you’re able to change someone’s mind about certain important things, other bottlenecks will resolve themselves (for example once my self-belief increased, I felt motivated to tackle my uncertainty about next steps and work on my career plan).
I think EAs with authority/clout/jobs at an EA org can particularly help with #2, #3, and #4, depending on how knowledgeable and good at communicating ideas the resident EA is. #5 is also probably doable if you're good at career coaching/problem solving. It seems like being good at EA career coaching would be a really useful skill for someone doing an EA residency (good knowledge of the EA landscape, what the gaps are, what skills are needed to fill them, being good at figuring out someone's skillset, communicating, motivating people, etc).
Helping move the needle on #1 and #6 seems really important, but pretty time-intensive on average, and probably hard to do predictably/reliably. It's also unclear how much an EA residency can help with these. I don't think it's impossible, but explicitly thinking about how to do it seems good. For example, reading Strangers Drowning and On Caring by Nate Soares really helped me with #6, and the same is true for a few other serious EAs I know (at least 3, though I haven't had this conversation with many people). Maybe certain types of conversations can also be reliably/reproducibly high-impact.
Logistics:
It might be hard for an outsider to integrate into various social settings, and I'm not sure how many high-quality social interactions you'd realistically be able to have during a residency, especially with students, who are busy and not always great at managing time. I'd imagine that planning sufficiently far in advance can help a lot with this. If you know or reach out to an organizer in the area, you can coordinate with them to set up a bunch of 1:1s and group discussions with promising members.
I think these residencies could be really useful for mobilizing people, especially in places where people don't have regular access to full-time EAs (basically anywhere other than Oxford/London/the Bay Area), and for helping them see that doing EA work full-time (not necessarily at an EA org, to be clear) is a real, viable option. The above points are just things to take into account to make the residencies go well.
If anyone is ever travelling to Manila (Philippines), we in EA Philippines would be happy to meet more EAs!
[Content note: contains fundraising-y content, I'll let the mods decide what they want to do. Doing this in a personal capacity.]
I find myself with an awkward 9 days between a team retreat in FL and a family Xmas gathering in Atlanta. I would guess I would be a fairly good fit to trial this as I have two years of experience at a community building EA org (CEA, where I currently work) and did local group organizing before that.[1]
I'm considering doing something other than flying back to California for that time, potentially doing something like an EA residency in an east coast city. If someone wanted to see this happen and offered to cover my expenses, I'd be more likely to do it. If someone wanted to see this happen in their city and would lend a couch, that would also be good. If someone merely has a suggestion of a city / event to visit, that’s still helpful.
[1] I can list more attributes including caveats and potential reservations if people express interest.
This seems like a good idea, and it's definitely the kind of thing I'd consider once I learn enough about AI that it would be valuable for others.