Manuel Allgaier

850 karma · Wedding, Berlin, Germany

Bio

Former Director of EA Germany and EAGxBerlin 2022 event lead. Currently on a career break to explore longtermist and AI safety ideas and to work on personal (non-public) projects.

I worked full-time in EA movement building (funded by CEA) as Director of EA Berlin (2019-21), Director of EA Germany (2021-22) and EAGxBerlin Event Lead (2022). Before that, I worked in sustainability consulting and charity management, studied environmental science, economics and IT, and lived, worked and volunteered in Phnom Penh (Cambodia), Amsterdam and Berlin.

https://www.linkedin.com/in/manuelallgaier/

How others can help me

If you have any ideas for EA Berlin or would like to get involved, I'd be happy to hear from you! Just message me here on the forum or on Linkedin.

Feedback on me and my work is always welcome: bit.ly/ea_anonymous_feedback

Comments (117)

One additional reason:

If you get your (initial) training at a roughly impact-neutral organisation, such as a management consultancy or tech company, and then move on to a high-impact job, you can add value right away, with lower 'training costs' for the high-impact org and therefore more impact.

All else equal, an EA org whose staff have 1-3 years of (non-EA) job experience can achieve impact faster than one with partly inexperienced staff.

That said, some things such as good epistemics or high moral integrity may be easier to learn at EA orgs (though they can definitely also be learned elsewhere).

I've supported >100 people in their career plans, and this seems pretty solid but underappreciated advice. Thanks for writing it up!

I think I made that mistake too. I went for EA jobs early in my career (running EA Berlin and then EA Germany 2019-22, funded by CEA grants). There were some good reasons: this work seemed particularly neglected in 2019-21, it seemed a good fit, and all three senior people I had in-depth career 1-1s with recommended it. I learned a lot, met many inspiring people, and I think I had some significant positive impact as well, both on the community overall (it grew and professionalized) and on some individual members' careers.

However, I made a lot of mistakes too and had slow feedback loops (no manager, little mentorship), and I'm pretty sure I would have learned many (soft) skills faster and built better career capital overall (both inside and outside EA) if I had first spent 1-2 years in management consulting or at a fast-growing (non-EA) tech company with good management, and then gone on to direct EA work.

I agree that it would be good to have citations. In case neither Ozzie nor anyone else here finds it a good use of their time: I've been following OpenAI's and Sam Altman's messaging specifically for a while, and Ozzie's summary of their (conflicting) messaging seems roughly accurate to me. It's easy to notice the inconsistencies in Sam Altman's messaging, especially when it comes to safety.

Another commenter (whose name I forget; I think he was from CLTR) put it nicely: it feels like Altman does not have one consistent set of beliefs (as an ethics/safety researcher would) but says different things that are useful for achieving his goals (as many CEOs do), and he seems to do this more than executives at other AI labs such as Anthropic or DeepMind.

This could be a community effort. If you're reading this and have a spare minute, can you recall sources for any of Ozzie's claims and share links to them here? (Or go the extra mile: copy his post into a Google Doc and add sources there.)

It's easy to vote for something you don't have to pay for. If we do anything like this, an additional fundraiser to pay for it might be appropriate.

Earning to Give still seems the best way to contribute for many people (e.g. people with exceptionally high earning potential, or people with decently paying jobs who aren't a good fit for direct work or don't want to switch jobs). I don't think we should distance ourselves from it. 

While I'm also interested in the finances, I fully understand if they prefer not to share all this info publicly. As far as I know, it's not common to share such detailed financial statements publicly, even for non-profits.

+1, I'd find this very useful too! 

For context: after working full-time in EA meta for >3 years, I've been thinking about renting or buying property in or near Berlin, or somewhere cheaper in Europe, to facilitate EA/longtermist events, co-working and maybe also co-living. I know many others are thinking about this too, some of whom are already making plans, and such retrospectives would be really helpful to inform our decisions. If you prefer not to share it publicly, you can also email me.

From the limited info I have, Wytham Abbey seemed like a good idea at the time, and I appreciate you going for it! The decision to sell was probably pretty hard to make; I hope all involved feel good about it now.

I like the idea but I also doubt a significant number of Palestinians from Gaza would accept that.

A more realistic option might be to make it easier for Palestinians to emigrate to any country that accepts them. Maybe Israel could offer everyone who successfully emigrates a $5-10k incentive as a "starting budget for a new life abroad", and then other countries might be more interested in taking them if they have more money to spend. It might also help that most Gazans are young, speak Arabic and often English too, and could help fill labor shortages in other countries.

Which countries might be most open to accepting Palestinian migrants (as refugees, student migrants or work migrants)? Any effective pro-migration interventions there?

This is helpful, thanks!

I notice you didn't mention fundraising for AI safety.

Recently, many have noted that the funding bar for AI safety projects has risen quite a bit (especially for projects not based in the Bay Area and not already well connected to funders), and that response times from funders such as the EA Funds LTFF can be very long (median ~2 months, afaik). This suggests we should look for additional funding sources, such as new high-net-worth donors, governments, non-EA foundations, etc.

Do you have any thoughts on that? How valuable does this seem to you compared to your ideas?
