Some quick thoughts following EAG London 2023

TL;DR

I'd recommend having a strategy for planning your conference
Different types of 1:1s are valuable in different ways, and some are more worth preparing for than others
It's natural to feel imposter syndrome and a sense of inadequacy when surrounded by so many highly competent, accomplished people, but arguably the primary purpose of the conference is for those people to help us mere mortals become highly competent and accomplished too (assuming accomplishment = impact)
I feel very conflicted about the insider/outsider nature of the EA community, and I'm going to keep thinking and talking about it more
Last year I attended EAG SF (but not EAG London 2022), and was newer to the EA community, as well as working for a less prestigious organisation. This context is probably important for many of these reflections.
Conference strategy
I made some improvements to my conference planning strategy this year that I think made the experience significantly better:
I looked through the attendee list and sent out meeting invitations as soon as Swapcard was available. This way people had more slots free, which both increased the likelihood of them accepting my meeting and increased the chances they'd accept the specific time I'd requested. This gave me more control over my schedule.
I left half an hour's break between my 1:1s. This allowed meetings to run longer if both participants wanted that, and gave me time to process information, write down notes, and recharge.
I initially didn't schedule many meetings on Sunday. This meant that if anybody I talked to on Friday or Saturday suggested I talk to someone else at the conference, I'd have the best chance at being able to arrange a meeting with them on Sunday. This strategy worked really well, as I had a couple of particularly valuable 1:1s on Sunday with people who hadn't originally been on my radar to talk to.
I had a clearer sense of what I was trying to get out of the conference this year compared to last. This made it easier to decide who would be valuable to speak to. Everyone who I requested a meeting with accepted, including someone who I regarded as particularly impressive who had specified they weren't going to take many 1:1s - so have a low bar for requesting meetings! With this person in particular I felt a bit starstruck, and regretted not having spent more time preparing specific questions to ask.
Some other meetings were totally fine without preparation though, so I think it's worth considering which ones would be more valuable to prepare for - in my case these were ones with more accomplished folks, or with people who I'm interested in working/collaborating with in the near future.
I broadly had two kinds of 1:1s, both of which were valuable: meetings with people who had a track record in areas that I am considering pursuing, and meetings with people who are in a similar position to me currently. With the former, I had specific questions that I was trying to answer, and was potentially trying to impress them/gauge if they might hire me. With the latter, meetings were more exploratory, more casual, and more about trying to find out if I was missing anything important from my model. I think it's easy to feel like the former is far more valuable than the latter, but I think this is false, and I think scheduling some of these more relaxed meetings can help ease the stress of the conference (as well as provide valuable insight into your current situation and plans).
I've infiltrated the ingroup
Last year, I felt like I had something to prove in all of my meetings. I knew maybe 4-5 people who were at the conference, which is not a lot out of almost 1500, and I worked for a company that nobody in SF had heard of. I was still a prototypical EA (straight white male) in many ways, but I don't have an undergraduate degree, and I felt like I lacked any track record of achievements to show that I was competent (and by extension, worthy of other attendees' time).
This year, I had several close friends attending and had interacted with a much larger number of people in some capacity, either on social media or through EA tech events in London. I now work for a FAANG company. I have read the seminal Slate Star Codex posts, I broadly understand what a deceptively aligned mesa optimiser is, and I've speculated as to the true identity of Qualy the Lightbulb on Twitter. I have a legible track record of my competence, and I am fluent in EA jargon. I have infiltrated the ingroup.
I feel very conflicted about this - being part of the ingroup feels great. I feel like I have the respect of people who I view as being extremely talented and successful, and naturally this does wonders for my ego. Having so many common touchpoints has meant that I find it extremely easy to make meaningful connections within the EA community compared with outside it. Using jargon to signal that you're familiar with the relevant scriptures and ideas lets you skip straight to discussing cruxes, on the understanding that both parties already agree on some number of issues. Other group norms, such as a preference for openness, directness, and epistemic humility, also seem better than their alternatives to me - I think these tendencies facilitate more constructive discussion, ultimately (hopefully) leading to greater impact.
But there are many obvious drawbacks to this. The ingroup is primarily made up of exceptionally privileged people (and in some ways I'm glad of this - I want privileged people like myself to be doing more to think about how they can do good with their privilege), and often the people that feel less comfortable around the EA community are less privileged. Other people have articulated these problems with the community better than me, and I don't want to speak for them so won't go into too much detail, but if you're reading this then you probably already know exactly what I mean. The fact that the EA community makes some people feel this way makes me feel really sad, which is hard to square with the happiness I get from the sense of belonging that I personally feel from the community.
I'm not entirely sure what to do about this. Trying to use less jargon seems like a good start, but that would be a real cost for me, particularly when I feel like I am starting to gain status within the community. I would love to write that I am happy to make that sacrifice, but I am human and flawed, and I don't know if I am. Making a conscious effort to treat people the same, regardless of whether I view them as highly accomplished or not, also seems like something I ought to write here, but seems hard (or even impossible) to do in practice.
80k has the idea of spending the initial part of your career acquiring career capital, which can later be traded in for impact. Perhaps EA ought to have the analogous idea of community capital: placing greater focus on nurturing and growing a healthy community focused on doing good. Even if this focus detracts from doing the absolute most good possible in the short term, in the long term it could allow for greater impact (and, after all, we do love talking about the long term around here).
I feel like my thinking on these topics is generally pretty immature and lacks nuance, so I'd welcome any thoughts.
Outcomes
To end on a positive note, I found the conference incredibly valuable. I came away feeling like I had three promising paths to explore, and am now planning on trying to move to direct work immediately, rather than gaining more career capital as originally planned. I feel excited about trying to use my career to do good, and I feel excited about EA as a whole. There are jobs I will apply for that I counterfactually wouldn't have, seemingly promising opportunities for collaboration, and I made several new contacts that I think will be mutually beneficial in the future.
The conference itself seemed excellently run, the venue was amazing, as was the food, and I'm very grateful to both the organisers and event staff for all their time and effort!
Earlier I had a conversation with Yonatan Cale about, among other things, ideas for EA projects that could be looking for founders. My prior is that "ideas are easy, execution is hard", and therefore that there are plenty of good ideas; he pushed back on this and cited this thread.
Then I went for a run and tried to think of some ideas. I haven't checked to see if anyone has proposed any of them before, and given that I thought of them off the top of my head, they have likely already been discussed. We had talked about software projects specifically, but not all of these are software-centric. I haven't spent longer than five minutes thinking about any of them, and I think it's unlikely any of them are particularly good; this is an exercise in generating ideas, on the grounds that if there's enough of them, one of them might be good.
Prediction market aggregators that include markets from non-anglophone countries, eg Russia/China/India. Basically Metaforecast but with more markets (maybe I should reach out to Nuño to see if this would be possible?). I don't know if those markets already exist - if not, maybe they could be created.
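The core aggregation step for that idea could be sketched roughly as below. This is a toy illustration only: the platform names and probabilities are made up, and the genuinely hard part a real aggregator like Metaforecast solves - matching the same question across platforms and languages - is waved away here.

```python
from statistics import mean

def aggregate_forecasts(forecasts: dict[str, float]) -> float:
    """Naively average probabilities for one question across markets.

    `forecasts` maps market name -> probability in [0, 1]. A real
    aggregator would need to match equivalent questions across
    platforms (and languages) before this step even applies.
    """
    if not forecasts:
        raise ValueError("no forecasts to aggregate")
    return mean(forecasts.values())

# Hypothetical data for illustration; these numbers are invented.
combined = aggregate_forecasts({"market_a": 0.62, "market_b": 0.55, "market_c": 0.40})
print(round(combined, 3))
```

A weighted average (by market liquidity or past calibration) would likely beat the plain mean, but the plumbing is the same.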
A system to automate or crowd-source FOI requests, open source the data and provide tools to access and use it.
Basically any of the digital democracy tools Audrey Tang talked about on her episode of the 80000 Hours podcast. Build an MVP, pitch it to small local government, show some success, scale it up. Given the open-source nature of the tools currently used in Taiwan, "build an MVP" could be as easy as just running git clone polis. Low chance of success, potential for huge impact if successful regardless.
A system for polling representative samples of a population and/or having demographic information available about those sampled. Something like Mechanical Turk, but in app form and not associated with Amazon.
Web scraping/sentiment & quantitative analysis for information on Western/Russian/Chinese/Indian sites, similar to prediction market thing above. If lots of Chinese netizens suddenly start talking about shortening their AGI timelines, what information do they have that folks in the West don't? Similarly, the analysis should be published in the same languages to try to foster a more global community.
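The trend-detection piece of that idea might look something like the sketch below: flag time windows where mentions of a keyword spike relative to the previous window. Everything here is a stand-in - the data is invented, and a real system would need actual scraping, language handling, and proper NLP rather than substring matching.

```python
def mention_spike(posts_by_week: dict[str, list[str]], keyword: str,
                  ratio: float = 2.0) -> list[str]:
    """Flag weeks where mentions of `keyword` jump versus the prior week."""
    weeks = sorted(posts_by_week)
    # Count posts per week that mention the keyword (case-insensitive).
    counts = {w: sum(keyword in p.lower() for p in posts_by_week[w]) for w in weeks}
    flagged = []
    for prev, cur in zip(weeks, weeks[1:]):
        if counts[cur] >= ratio * max(counts[prev], 1):
            flagged.append(cur)
    return flagged

# Toy data standing in for scraped posts (entirely made up).
posts = {
    "2023-w1": ["nice weather this week"],
    "2023-w2": ["AGI timelines look short", "short AGI timelines?", "AGI soon"],
}
print(mention_spike(posts, "agi"))
```

The interesting question the original idea raises - *why* the spike happened - is of course exactly what this kind of counting can't answer on its own.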
Org specifically for AI info security. Could either be for pen testing or building new defensive tools.
I originally thought "something like Guesstimate but more specific to Bayes calculations" but having looked at Guesstimate again, that already looks great.
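For context, the kind of calculation I had in mind is a plain Bayes update for a binary hypothesis - something like this minimal sketch (the example numbers are arbitrary):

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior P(H|E) via Bayes' rule for a binary hypothesis H."""
    numerator = p_e_given_h * prior
    denominator = numerator + p_e_given_not_h * (1 - prior)
    return numerator / denominator

# Example: 1% prior, evidence with 90% true-positive and 9% false-positive rate.
posterior = bayes_update(0.01, 0.90, 0.09)
print(round(posterior, 3))  # ~0.092 - the classic "still probably false" result
```

Guesstimate goes well beyond this by letting inputs be distributions rather than point estimates, which is why I concluded the niche is already well served.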
A scientific journal which better incentivizes high quality research, eg by mandating preregistration, or by rewarding attempts to replicate studies. I presume this has been debated at length already though.
I know the last two don't have legs, and #1 seems like it might just amount to submitting a pull request after a weekend or two of work, but still. I'm confident that there is a non-zero amount of value across all of these ideas, that if I thought about them more there's a possibility they could yield an appreciable amount of value, and that I could think of a large number of similar ideas given more time, with at least some of them being better than the best one here.