All of Sunny1's Comments + Replies

In my view one of the most defining features of the EA community is that it makes most people who come into contact with it feel excluded and "less than," on several dimensions. So it's not just you!

alene · 2y
Yes.

I think the centrality of the EA Forum to the overall "EA project" has likely caused a lot of unintended consequences like this. Participating in the Forum is seen as a pretty important "badge" of belonging in EA, but participating in an internet forum is generally not the type of activity that appeals to everyone, much less an internet forum where posts are expected to be lengthy and footnoted.

Imma · 2y

Participating in the Forum is seen as a pretty important "badge" of belonging in EA,

Why do you believe this is true? I've met - online and offline - many highly involved people who never post or comment on the forum. Maybe that's even the majority of the EA people I know. Some of them seldom or never read anything here (I guess).

Great post. I appreciate the framing around the real gaps in human capital. One additional concern I have is that aesthetics might play a counterproductive role in community building. For example, if EA aesthetics are most welcoming to people who are argumentative, perhaps even disagreeable, then the skill set of "one-on-one social skills and emotional intelligence" could be selected out (relatively speaking). 

Just as a casual observation, I would much rather hire someone who had done a couple of years at McKinsey than someone coming straight out of undergrad with no work experience. So I'm not sure that diverting talented EAs from McKinsey (or similar) is necessarily best in the long run for expected impact. No EA organization can compete with the ability of McK to train up a new hire with a wide array of generally useful skills in a short amount of time. 

Nathan_Barnard · 2y
I think the key point here is that it's unusually easy to recruit EAs at uni compared to when they're at McKinsey. I think it's unclear a) whether going to McKinsey is among the best things for a student to do and b) how much less likely it is that an EA student goes to McKinsey. I think it's pretty unlikely that going to McKinsey is the best thing to do, but I also think that EA student groups have a relatively small effect on how often students go into elite corporate jobs (a bad thing from my perspective), at least in software engineering.

Well, no one has the "real" answers to any of these questions, even the most EA of all EAs. The important thing is to be asking good questions in the first place. I think it's both most truthful and most interpersonally effective to say something like "gee, I've never thought about that before. But here's a question I would ask to get started. What do you think?"

I really liked CEA's "tour of duty" framing for the recent hiring rounds! I thought that was a great signal to potential candidates of what they could expect in the job. I think employers should be more explicit with candidates about what they're expecting from new hires in terms of tenure.

Conversely, I would encourage job applicants to be candid with employers about their tenure expectations for a role. For some roles, only staying in the role for 1-2 years is perfectly fine. For other roles, especially ones that require a lot of role-specific knowl... (read more)

Strong +1 for squad goals :)

I just wanted to reinforce the point Benjamin made above about getting involved in the EA community. For example, if you apply for a job at an EA organization, they may request references from the EA community in addition to the standard references from your last job. Do you already have strong references from credible people in the EA community? If not, it would be worthwhile to do more networking. You may also want to build up an EA track record of your own through volunteer work, posting on the EA Forum, and so on.

Here's one way to... (read more)

tamgent · 3y
[This comment isn't a reply to your main point, just about the 'glamour factor' that your film analogy is predicated on, sorry]

I think that the majority of people who believe working at an EA org is the highest impact thing they could do are probably wrong. Consider:

1) If you work at an EA org, you probably have skills that are very useful in a variety of other fields/industries. The ceiling on these impact opportunities is higher, since they use more of your own creativity/initiative at a macro level (e.g. the level of deciding where to work).

2) If 1) is not true, it's probably because you specialise in meta/EA/movement-related matters that don't transfer well outside. In that case you might be able to make more impact in EA orgs. But this is not the case for most people.

I think it's different for people starting new EA orgs, or joining very early-stage ones - that does seem to have a high ceiling on potential impact and is worth a shot for anyone doing it.
Kirsten · 3y
This is very accurate but a little sad to me.

Congratulations!  I'm very excited about this project and I'm looking forward to following along. A couple of questions:

  1. Do you plan to do any thinking / writing about why EAs might choose to prioritize climate giving? Climate seems to occupy a weird space in EA. It's more "middle termist" than short-termist or long-termist, so it doesn't neatly fit into either of those camps. And climate affects animals, humans living today, and future humans, so it also doesn't align neatly with those camps either. It's a bit of an "all of the above" cause area. Any
... (read more)

Thanks for this post. In my view, one of the most important elements of the EA approach is the expanding moral circle. The persistence of systemic racism in the US (where I live) is compelling evidence that we have a long way to go in expanding the moral circle. Writ large, the US moral circle hasn't even expanded enough to include people of different races within our own country. I think this inability to address systemic racism is an important problem that could have a negative effect on the trajectory of humanity. In the US, it's obvious that systemic r... (read more)

Agreed. And, it would be great to have a similar top-level post for the "new" GWWC once it launches describing what is in and out of scope. In particular, it would be helpful to know if GWWC is intended to be 1) an EA recruitment pipeline; 2) an end in itself, i.e., driving impact through donations; or both? It seems that charitable giving has fallen out of favor relative to changing careers as an impact lever since I pledged in 2015. I'm curious to know if the leaders of CEA / GWWC see its mission primarily as driving charitable giving or as recruiting new EAs.

Thanks for the question as it caused me to reflect. I think it is bad to intentionally misrepresent your views in order to appeal to a broader audience, with the express intention of changing their views once you have them listening to you and/or involved in your group. I don't think this tactic necessarily becomes less bad based on the degree of misrepresentation involved. I would call this deceptive recruiting. It's manipulative and violates trust. To be clear, I am not accusing anyone of actually doing this, but the idea seems to come up often when "outsiders" (for lack of a better term) are discussed.

I also think, at least in the past, the attitude towards climate work has been vaguely dismissive.

As somewhat of an outsider, this has always been my impression. For example, I expect that if I choose to work in climate, some EAs will infer that I have inferior critical thinking ability.

There's something about the "gateway to EA" argument that is a bit off-putting. It sounds like "those folks don't yet understand that only x-risks are important, but eventually we can show them the error of their ways." I understand that thi... (read more)

Stephen Clare · 4y
Thanks for sharing that. It's good to know that that's how the message comes across. I agree we should avoid that kind of bait-and-switch which engages people under false pretences. Sam discusses this in a different context in the top comment on this post, so it's an ongoing concern.

I'll just speak on my own experience. I was focused on climate change throughout my undergrad and early career because I wanted to work on a really important problem and it seemed obvious that this meant I should work on climate change. Learning about EA was eye-opening because I realized (1) there are other important problems on the same scale as climate change, (2) there are frameworks to help me think about how to prioritize work among them, and (3) it might be even more useful for me to work on some of these other problems.

I personally don't see climate change as some separate thing that people engage with before they switch to "EA stuff." Climate change is EA stuff. It's a major global problem that concerns future generations and threatens civilization. However, it is unique among plausible x-risks in that it's also a widely-known problem that gets lots of attention from funders, voters, politicians, activists, and smart people who want to do altruistic work. Climate change might be the only thing that's both an x-risk and a Popular Social Cause.

It would be nice for our climate change message to do at least two things. First, help people like me, who are searching for the best thing to do with their life and have landed on climate because it's a Popular Social Cause, discover the range of other important things to work on. Second, help people like you, who, I assume, care about future generations and want to help solve climate change, work in the most effective way possible. I think we can do both in the future, even if we haven't in the past.
Will Bradshaw · 4y
Yeah, I think many groups struggle with the exact boundary between "marketing" and "deception". Though EAs are in general very truthful, different EAs will still differ both in where they put that boundary and in their actual evaluation of climate change, so their final evaluations of the morality of devoting more attention to climate change for marketing purposes will differ quite a lot. I was arguing elsewhere in this post for more of a strict "say what you believe" policy, but out of curiosity, would you still have that reaction (to the gateway/PR argument) if the EA in question thought that climate change was, like, pretty good - not the top cause, but decent? To me that seems a lot more ethical and a lot less patronising.

Thanks for posting! I took an En-ROADS workshop with a trained facilitator in my local community and I thought it was extremely well done. The organization that built En-ROADS trains facilitators to then teach others about the tool (and about climate).

En-ROADS itself is an example of an intervention whose impact would be difficult to quantify. The goal is to educate as many people as possible about the fundamental dynamics of the climate problem, using well-designed interactive workshops/tools that are based on robust evidence. It seems like a good approac... (read more)