I am currently the founder and chair of the board of Effective Altruism Israel. I lead Google's Flood Forecasting Initiative alongside several other humanitarian and climate-related efforts, serve as a strategic advisor for Firstime on investing in startups that advance the UN's Sustainable Development Goals, and teach Applied Ethics (and Information Security) at Tel Aviv University.

Feel free to reach out if there's anything EA-related you believe I can help you with.


Introducing Probably Good: A New Career Guidance Organization

We’re definitely taking into account the different comments and upvotes on this post. We appreciate people upvoting the views they’d like to support - this is indeed a quick and efficient way for us to aggregate feedback.

We’ve received recommendations against opening public polls about the name of the organization from founders of existing EA organizations, and we trust those recommendations so we’ll probably avoid that route. But we will likely look into ways we can test the hypothesis of whether a “less controversial” name has positive or negative effects on the reaction of someone hearing this name for the first time.

Introducing Probably Good: A New Career Guidance Organization

Hi Manuel, thanks for this comment. I think I agree with all your considerations listed here. I want to share some thoughts about this, but as you’ve mentioned - this is one of our open questions and so I don’t feel confident about either direction here.

First, we have indeed been giving general career coaching for people in Israel for several years now, so in a sense we are implementing your recommended path and are now moving onto the next phase of that plan. That being said, there still remain reasons to continue to narrow our scope even at this stage.

Second, you mention partnering with experts in the various cause areas to ensure accurate content - I completely agree with this, and wouldn’t dream of providing concrete career advice independently in fields I don’t have experience in. For the content we are writing right now, we require interviewing at least 7 experts in the field to provide high-confidence advice, and at least 3 experts in the field even for articles we mark as low confidence (which we warn readers to treat with caution). So it’s really important to me to clarify that none of the concrete career-specific advice we provide will be based exclusively on our own opinions or knowledge - even within the fields we do have experience in.

Finally, I think at least some of the issues you’ve (justifiably) raised are mitigated by the way we aim to provide this advice. As opposed to existing materials, which more confidently aim to provide answers to career-related questions, we place a larger emphasis on providing the tools for making that decision depending on your context. As community organizers, one of the things that pushed us to start this effort is the feeling that many people who don’t happen to be from the (very few) countries that EA orgs focus on have very little guidance and resources, while more and more is invested in optimizing the careers of those within those countries. We believe that doing highly focused work on Israel would not serve the community as well as providing guidance on what needs to be explored and figured out to apply EA career advice to your own context. As such, we want to provide recommendations on how to check for opportunities within the scope that’s relevant to you (e.g. country or skillset), rather than aiming to provide all the answers as final conclusions on our website. This applies most to our career guide, but also to specific career path profiles - where we want to first provide the main considerations one should look into, so that we provide valuable preliminary guidance for a wide range of people, rather than end-to-end analysis for fewer people.

The mitigations described above can be much better evaluated once we have some materials online, which will allow others to judge their implementation (and not only our aspirations). We plan on soliciting feedback from the community before we begin advocating for them in any meaningful way - hopefully that will help make these responses less abstract and still leave us time to collect feedback, consider it and try to optimize our scope and messaging.

Introducing Probably Good: A New Career Guidance Organization

Hi Jack, thanks for the great question. 

In general, I don’t think there’s one best approach. Where we want to be on the education/acceptance trade-off depends on the circumstances. It might be easiest to go over examples (including ones you gave) and give my thoughts on how they’re different.

First, I think the simplest case is the one you ended with. If someone doesn’t know what cause area they’re interested in and wants our help with cause prioritization, I think there aren’t many tradeoffs here - we’d strongly recommend relevant materials to allow them to make intelligent decisions on how to maximize their impact. 

Second, I want to refer to cases where someone is interested in cause areas that don’t seem plausibly compatible with EA, broadly defined. In this case we believe in tending towards the “educate” side of the spectrum (as you call it), though in our writing we still aim not to make it a prerequisite for engaging with our recommendations and advice. That being said, these nuances may be irrelevant in the short-term future (at least months, possibly more), as due to prioritization of content, we probably won’t have any content for cause areas that are not firmly within EA.

In the case where the deliberation is between EA cause areas (as is the case in your example), there are some nuances that will probably be evident in our content even from day one (though they may change over time). Our recommended process for choosing a career will involve engaging with important cause prioritization questions, including who deserves moral concern (e.g. those far from us geographically, non-human animals, and those in the long-term future). Within more specific content, e.g. specific career path profiles, we intend to refer to these considerations but not try to force people to engage with them. To take your global health example: in a career path profile about development economics, we would highlight that one of the disadvantages of this path is that it is mainly promising from a near-term perspective and unclear from a long-term perspective, with links to relevant materials. That being said, someone who has decided they’re interested in global health, doesn’t follow our recommended process for choosing a career, and navigates directly to global health-related careers will primarily be reading content related to this cause area (and not material on whether this is the top cause area). Our approach to 1:1 consultation is similar - our top recommendation is for people to engage with relevant materials, but we are willing to assist people with narrower questions if this is what they’re interested in (though, much like the non-EA case, we expect demand to exceed our capacity for the foreseeable future, and may in practice prioritize those who are pursuing all avenues to increasing their impact).

Hope this provides at least some clarity, and let me know if you have other questions.

Introducing Probably Good: A New Career Guidance Organization

I agree this is an important question that would be of value to other organizations as well. We’ve already consulted with 80K, CE and AAC about it, but still feel this is an area we have a lot more work to do on. It isn’t explicitly pointed out in our open questions doc, but when we talk about measuring and evaluating our counterfactual benefits and harms, this question has been top of mind for us.

The short version of our current thinking is separated into short-term measurement and long-term measurement. We expect that longer term this kind of evaluation will be easier - since we’ll at least have career trajectories to evaluate. Counterfactual impact estimation is always challenging without an experimental setup, which is hard to do at scale, but I think 80K and OpenPhil have put out multiple surveys that try to extract estimates of counterfactual impact and do so reasonably well given the challenges, so we’ll probably do something similar. Also, at that point, we could compare our results to theirs, which could be a useful barometer. In the specific context of our effect on people taking existing priority paths, I think it’ll be interesting to compare the chosen career paths of people who discovered 80K through our website relative to those who discovered 80K from other sources.

Our larger area of focus at the moment is how to evaluate the effect of our work in the short term, when we can’t yet see our long-term effect on people’s careers. We plan on measuring proxies, such as changes to their values, beliefs and plans. We expect whatever proxy we use in the short term to be very noisy and based on a small sample size, so we plan on relying heavily on qualitative methods. This is one of the reasons we reached out to a lot of people who are experienced in this space (and we’re incredibly grateful they agreed to help) - we think their intuition is an invaluable proxy for figuring out whether we’re heading in the right direction.

This is an area that we believe is important and we still have a lot of uncertainty about, so additional advice from people with significant experience in this domain would be highly appreciated.

Introducing Probably Good: A New Career Guidance Organization

Hi dglid, I agree with your comment. I think there is a lot of value in making career guidance more available to the masses, even without 80K personally being involved.

I see local groups as being the primary type of organization responsible for this type of work - making EA information accessible and personalized for new people and communities. We don’t see ourselves taking over that role. That being said, we are interested in being involved in the process. We know there’s a lot of interest in creating content / tools / support in the career guidance space, both because we’ve seen it in EA Globals and group organizers’ groups, and also because we are group organizers ourselves, and it’s this need that has set us on this path (originally in our own local group).

All of this is to say - I think working with and empowering local EA groups to provide these services is a great way to improve careers at scale, and would especially love any feedback, requests and comments from local group organizers or anyone else on what you believe would be most helpful to you in this area.

Introducing Probably Good: A New Career Guidance Organization

Hi Michael, as you mention - the issue of accurately defining our scope is still an important open question to us. I’m happy to share our current thinking about this, but we expect this thinking to evolve as we collect feedback and gain some more hands-on experience.

I think it’s worth making a distinction between two versions of this question. The first is the longer-term question of which cause areas should be within scope for this work. That’s a difficult question. At the moment, we’re happy to use the diversity of views meaningfully held in the EA community as a reasonable proxy - i.e. if a non-negligible portion of EAs believe a certain cause area is promising, we think it’s worth investigating. As such, all three of the examples you mention would be potentially in scope in my view. This is not, in and of itself, a cohesive and well-defined scope, and as I mentioned, it is likely to change. But I hope this gives at least an idea of the type of scope we’re thinking of.

The second version of this question is what we actually intend to work on in the upcoming months, given that we are just getting started and are still constrained in time and resources. This question will dominate our actual decisions for the foreseeable future. Within the large scope mentioned above, we want to initially focus on areas based on two criteria: first, unmet needs within the EA community, and second, cause areas that are easier to evaluate. Both of these are very weak signals for where we want to focus long-term, but they drastically influence how quickly we can experiment, evaluate whether we can provide significant value, and start answering some of our open questions. As a concrete example, we believe Global Health & Development fits this bill quite well, and so at least some of our first career path profiles will be in this space.

I hope this helps clarify some of these questions. I apologize if there are more open questions here than answers - it’s just really important to us to experiment first and make long-term decisions about priorities and scope afterwards rather than the other way around.

Introducing Probably Good: A New Career Guidance Organization

That’s actually a great idea. I’ve now added a link from each clean doc to a commentable version. Feel free to either comment here, email us, or comment on the commentable version of the doc. Thanks!

Introducing Probably Good: A New Career Guidance Organization

Great point, Pablo.

I think the analogy to ImpactMatters is insightful and relevant, and indeed reaching a broader audience/scope (even at the cost of including less impactful career paths) is part of the justification for this work. I think the difference between inter-cause elasticity and intra-cause elasticity may be even larger when discussing careers, because in addition to people's priorities and values, many people have education, experience and skills which make it less likely (or less desirable) for them to move to a completely different cause area.

I do, however, also want to highlight that I think there are justifications for this view beyond just a numbers game. As we discuss in our overview and in our core principles, we think there are disagreements within EA that warrant some agnosticism and uncertainty. One example is the more empiricist view, which focuses on measurable interventions and is skeptical of speculative work that cannot be easily evaluated or validated, vs. the more hits-based approach, which focuses on interventions that are less certain but are estimated to have orders of magnitude more impact in expectation. These views are (arguably) at the crux of comparisons between top cause areas that are a core part of the EA community (e.g. global poverty & health vs. existential risk mitigation). For many people working in both of these cause areas, we genuinely believe careers within their field are the most promising thing they could do.

Additionally, we believe not only that broader career advice is useful in optimizing the impact of those who would not choose top priority paths, but that it may actually lead to more people joining top priority paths in the focus areas of existing career orgs in the long run. As we mention in our overview and in our paths to impact, and based on our experience in career guidance so far, we believe that providing people answers to the questions they already care about, while discussing crucial considerations they might not often think about, is a great way to expose people to impact maximization principles. Our hope is that even if we cared exclusively about top priority paths already researched by 80K and others, this organization would end up having a net positive effect on the number of people who pursue these paths. Whether this will be the case, of course, remains to be seen - but we intend to measure and evaluate this question as one of our core questions moving forward.

We're Lincoln Quirk & Ben Kuhn from Wave, AMA!

Thank you both for your thoughtful answers.

To clarify, I don't have a strong opinion on this comparison myself, and would love to hear more points of view on it. Sadly I'm not aware of any reading materials on this topic, but I have heard the following arguments made in one-on-one conversations:

  1. For-profit entrepreneurship has built-in incentives that already cause many entrepreneurs to try to implement any promising opportunities. As a result, we'd expect it to be drastically less neglected, or at least drastically less neglected relative to nonprofit opportunities that are similarly promising. This can affect both our estimate of how many good opportunities we'd expect to find still lying around, and also how we'd estimate our counterfactual impact (since if we hadn't implemented a profitable intervention, there's a higher likelihood someone else would have).
  2. The specific cause areas that the EA movement currently sees as the most promising - including global poverty and health, animal welfare, and the long-term future - all serve recipients who (to different degrees) are incapable of significantly funding such work. This could be seen as directly related to the first point, but even if the first point is false, one could still argue that it just happens to be the case that the most promising cause areas are not a good fit for for-profit entrepreneurship. I think the case here applies more strongly to animals and future people (who clearly can't pay for services), but to a lesser extent can also apply to the extremely poor, who can pay only very little.
  3. For-profit organizations may produce incentives that make it unlikely to make the decisions that will end up producing enormous impact (in the EA sense of that term). One variation of this argument is that the revenue/growth needs tend to always come first (I can't do any good if I don't exist), which means there ends up being little freedom to optimize for impact. Another variation argues that even if one could optimize for impact, these incentives alongside the environment can cause significant value drift, and many people following this path will end up not doing so.
  4. Finally, I've also heard from several people the claim that today EA has an immense amount of funding, and if you're a competent person founding a charity that works according to EA principles, it is incredibly easy to get non-trivial amounts of funding. This is not necessarily an argument for nonprofits, but it potentially mitigates what is perhaps the strongest argument against them - access to capital. Somewhat like point 2, this is a more circumstantial argument than an inherent one.

Finally, the fact that I listed arguments in favor of nonprofit entrepreneurship over for-profit entrepreneurship may give the impression that this is my opinion, so I want to clarify again that it is not and I am highly uncertain about this topic.

We're Lincoln Quirk & Ben Kuhn from Wave, AMA!

Hi Lincoln and Ben, thanks for doing this! I would love to hear your perspective on the following topic:

Nonprofit entrepreneurship is a dominant career path within EA, with many people excited about the impact that it can achieve. Impact-focused for-profit entrepreneurship is rarely discussed or recommended by EA organizations, with a 2016 article about your startup being one of the only materials on this topic. I have also heard multiple people argue that for-profit entrepreneurship is an inherently less promising path than nonprofit entrepreneurship for various reasons.

What is your view on the value of for-profit entrepreneurship from an EA perspective? Do you believe this career path is undervalued by the EA community and its organizations today? If so, what do you believe people interested in for-profit entrepreneurship should do to found highly impactful organizations? Are there any specific opportunities you think are particularly interesting or exciting in this space?

Thanks in advance!
