
I am organizing an EA university group. I am wondering whether, in general, putting this on my resume would be good or bad for getting a non-EA job.

2 Answers

Absolutely do so! In the eyes of the vast majority of employers, organizing a university group centered around charity shows character and energy, highly positive qualities in an employee.

I agree completely! However, I feel obliged to point out that some EAs I know intentionally play down their EA associations because they think it will harm their careers. Often, these people are thinking of working in government.  

I weakly think this is a mistake for two reasons. First, as Mathias said, EA appears to be generally seen as a positive thing (similar to climate change action, according to this study). Second, I think Ord is right when he says we could do with more earnestness and sincerity in EA.

Alix, ex-co-director at EA Switzerland, wrote up some interesting thoughts on this general subject here.  

I'd also add that basically no one knows what EA is, and currently, when you do a quick Google, you get a good impression (criticism tab aside):

[Screenshot of Google search results for effective altruism, including an AI-generated summary]

(Interested to know if others get the same AI summary; I'm not sure if it regenerates for each user or just for each search term.)

I'll further add that most people aren't going to bother doing the quick Google; they're going to see "organised university society" and whatever two-sentence summary you've got about it being charity-related, and see it as a positive, although not necessarily any more positive than organising the RAG week charity or a sports team.

The bigger question is whether and how you raise it as an answer to a question about your life experiences at interview.

(FWIW, my ad-blocked Google results for Effective Altruism are this website, the Wikipedia link, and a BBC article about SBF.)

If you are trying to get a US policy job, then probably no, but it also depends on the section of US policy.

I don't find comments like these helpful without explanations or evidence, especially from throwaway accounts.

Throwaway81:
The reader can take it or leave it given these facts, but imo it serves as a data point that someone from US policy is pointing to this as a real thing.
JWS 🔸:
Right, but I don't know who you are, or what your position in the US policy sphere is, if you have one at all. I have no way to verify your background or the veracity of the information you share, which is one of the major problems with anonymous accounts. You may be correct (though again, the lack of explanation doesn't give any detail or a mechanism for why, and doesn't help sammy much, since as you said it depends on the section), but that isn't really the point. The only data point you provide is "intentionally anonymous person on the EA Forum states opinion without supporting explanations", which is honestly pretty weak sauce.
Throwaway81:
*shrug* I think it would be helpful to me, and like I said, the reader can take it or leave it. Them's the breaks. I think commenting from a throwaway account, providing the data, and letting the reader decide is better than not commenting and not providing data.
JWS 🔸:
But you haven't provided any data 🤷 You could explain why you think so without de-anonymising yourself, e.g. sammy shouldn't put EA on his CV in US policy because:

* Republicans are in control of most positions, and they see EA as heavily Democrat-coded and aren't willing to consider hiring people associated with it
* The intelligentsia who hire for most US policy positions see EA as cult-like and/or disgraced after FTX
* People won't understand what EA is on a CV and will discount sammy's chances compared to putting down "ran a discussion group at university" or something like that
* You think EA is doomed/likely to collapse and sammy should pre-emptively disassociate their career from it

I feel that would be interesting and useful to hear your perspective on, to the extent you can share information about it. Otherwise, just jumping in with strong (and controversial?) opinions from anonymous accounts on the forum only serves to pollute the epistemic commons, in my opinion.