I read Philosophy, Politics, and Economics at the University of Warwick, lead the local EA group, co-moderated our first fellowship, and participated in the community building residency at CEA in summer 2019. When I was in high school, I read Pogge and learned about absolute poverty. That's how I became involved in EA and a Giving What We Can member.

I learned about outer space governance during a research project with the German foreign service. I supported the Future of Life Institute with its response to the EU AI White Paper consultation. I spend my time thinking about global priorities research (portfolio theory), AI governance, economics, ML, and building effective altruism. In the coming years, I would like to work on longtermist policy problems or skill up as an economist to contribute to GPR and policy.

I am currently working on GPR and a project at a longtermist institution in Europe.




How high impact are UK policy career paths?

Thanks for writing this. Here are two of my messy thoughts: if you believe that X is the biggest and most important problem (e.g. clean meat, poverty alleviation, or AI governance), then I would expect that being Head of the relevant department is an excellent position from which to work on that problem.

I was also wondering why you are not considering the career capital these paths provide for later work on projects such as Alpenglow, or for jobs in applied research, lobbying, or policy think tanks.

What areas are the most promising to start new EA meta charities - A survey of 40 EAs

Thanks for sharing. Would you be able to share more information on the top-ranked option, "exploration"? My own thinking on this is limited (as it is in general regarding a cause X). Could you share concrete ideas people talked about, or concrete proposed plans for such an organisation (a cause X organisation, or an organisation focused on one particular underexplored cause area)?


And on a related note, will the report about meta charities you describe here be published before the incubation programme application deadline (as it might be decision-relevant for some people)?

Careers Questions Open Thread



I am German, lead an EA group in the UK, and do EA career coaching there. I am personally interested in the policy side, but I am happy to talk you through your cause prioritisation and think about good jobs in Germany. If you are interested, PM me :)

Andreas Mogensen's "Maximal Cluelessness"

Sorry, I don't have the time to comment in depth. However, I think that if one agrees with cluelessness, then you don't offer an objection; you might even extend their worries by saying that almost everything has "asymmetric uncertainty". I would also be interested in an elaboration of your last sentence, "They are extremely unlikely and thus not worth bearing in mind." Why is this true?

Andreas Mogensen's "Maximal Cluelessness"

Re: your lady example: as far as I know, the recent papers (e.g. here) provide the following example: (1) you either help the old lady on a Monday or on a Tuesday (you must and can do exactly one of the two options). In this case, your examples for CC1 and CC2 don't hold. One might argue that the previous example was simply a mistake, and I find it very hard to come up with CC1 and CC2 for (1) if (supposedly) you don't know anything about Mondays or Tuesdays.

Has anyone gone into the 'High-Impact PA' path?

Sorry for the late reply. I just wanted to say that I also upvoted your comment, because I would be very interested in a longer piece on being an RA.

AMA: Tobias Baumann, Center for Reducing Suffering

What is the most likely reason that s-risks are not worth working on?

AMA: Tobias Baumann, Center for Reducing Suffering

How did you figure out that you prioritise the reduction of suffering?

I am interested in your personal life story and in the most convincing arguments or intuition pumps.

The case of the missing cause prioritisation research

Thank you very much for writing this up. However, I am not sure I understand the point you are referring to in:

3. Policy and beyond – not happening – 2/10. Are you referring to your explanation within the subsection on The Parliament? If so, that would make sense to me.
