Former director of EA Germany and EAGxBerlin 2022 event lead, currently on a career break to explore some longtermist & AI Safety ideas as well as work on personal (non-public) projects
Bio: I worked full-time in EA movement building (funded by CEA) as Director of EA Berlin (2019-21), Director of EA Germany (2021-22) and EAGxBerlin Event Lead (2022). Previously, I worked in sustainability consulting and charity management, studied environmental science, economics & IT, and volunteered, lived and worked in Phnom Penh (Cambodia), Amsterdam and Berlin.
If you have any ideas for EA Berlin or would like to get involved, I'd be happy to hear from you! Just message me here on the forum or on Linkedin.
Feedback on me and my work is always welcome: bit.ly/ea_anonymous_feedback
Agree. Inviting at least one person from a major neartermist organisation in EA such as Charity Entrepreneurship would have been helpful, to represent all "non-longtermists" in EA.
(disclaimer: I'm friends with some CE staff but not affiliated with the org in any way, and I lean towards longtermism myself)
Also appreciate the transparency, thanks Bastian!
TLDR: The recent funding decline seems unfortunate (for EA Berlin specifically). I've slightly updated towards thinking that relying on one major donor is riskier than I thought, and that groups should maybe diversify their funding sources more.
Thanks for sharing!
Some anecdotal data: From EA Berlin's perspective, the sudden decline in funding earlier this year seemed particularly unfortunate.
I understand why funders were hesitant to fund city groups after the sexual harassment cases that came out in February, but it still seemed unfortunate that CEA withdrew the previously committed funding for EA Berlin, especially as (to my knowledge) none of the sexual harassment cases happened in Berlin or Germany, and both German and international movement builders (CEA staff etc) have said they consider the German EA community particularly friendly and healthy (I think there's still room to improve but overall community health seems good, based on what I've seen).*
I've updated that fully relying on CEA for funding is riskier than I thought before, and diversifying funding should probably be a higher priority for city and national groups.
(I think it's fair to fund groups less when the total funding volume has decreased and there are strong cases for funding cause/career groups more and broad EA groups less. Still, having the 7th largest city EA group, with hundreds of engaged members, run by volunteers only, without funding even a part-time position, seems suboptimal)
*(I was told that CEA withdrew EA Berlin funding mainly because they downprioritised city groups after the sexual harassment cases. There might be more reasons I'm not aware of)
Disclaimer: I ran EA Berlin 2019-21, funded by a CEA grant. The 2023 funding withdrawal did not affect me personally, as I stopped doing paid EA meta work last year and have since only supported and advised the group. The top candidates of the "EA Berlin Community Manager" hiring round were unexpectedly left without funding, though, and the position was canceled (for now).
Thanks for doing that survey and sharing it. This seems potentially quite helpful for EA meta workers & funders, much appreciated!
Minor question (feel free to ignore if busy): Results of EA surveys (both this one and others) seem to often be shared >6 months after the data was collected, and to me it feels at least somewhat outdated by then. Curious to hear if you have any thoughts on that and benefits/costs of sharing sooner.
TLDR: Agree with the risks, unsure if it's better to restrict flirting for "x weeks" or for "however long there's a power dynamic and/or vulnerability etc"
Thanks for sharing! Good to be reminded that this is a risk in any community (especially communities with both personal and professional relationships) and that others may have thought more about this and found better solutions.
This seems most risky in situations in which:
- Alice is new to EA (still in the "orientation phase") and looking for jobs.
- Bob is an experienced (paid or volunteer) EA group organiser, could help Alice professionally (knows people who might have jobs for Alice etc) and also finds Alice attractive.
The risk seems to come from a perceived[1] or actual power dynamic, and this is exacerbated if one person is somehow "vulnerable", like when they're new to the community and don't yet know the norms, or do not yet feel comfortable expressing boundaries or reaching out to other community members for advice or support. @Severin Would you agree?
If that's the case, would it make sense to discourage/ban flirting not for x weeks, but for however long there's a (perceived or actual) power dynamic and/or "vulnerability"? This might then vary between one week (?) and forever.
Note that a perceived power difference is sufficient: if Alice thinks that Bob has the power to help or hinder her career (maybe because he mentioned he knows influential person x personally), she will be hesitant to express boundaries when she receives unwanted attention from Bob, even if the power difference isn't real.
I feel like you're being unfair here.
The EA community engages quite a lot with its critics, more than the average movement. Of course reactions will differ in a community of ~10k active members, but the most upvoted posts & reactions I read are usually very empathetic and caring.
Also this post is about the Rationality community, and I think it'd be better to keep the discussion about that and not mix in EA community issues here (there are enough other posts about the EA community).
Fwiw: This might locally differ. The EA coworking office in Berlin is going strong and a second related space (Aurea) just opened. I think there's probably demand for more co-living spaces and maybe also more coworking spaces in the coming year, and they probably could be (mostly) funded by members, so not depending on donors.
PS: I wonder if you could do both raising awareness of "investing to give" and advertising your products in different posts, without one affecting the other? I think this might be worth a try. You could include disclaimers wherever relevant.
TLDR: I appreciate Marco, Sana & SageWealth, but I do think this is probably not a good place to advertise such products (based, among other things, on feedback from their talk at EAGxBerlin)
Marco's co-founder Sana gave a talk at EAGxBerlin 2022, and as event lead, I read through all participant feedback. This was one of the more controversial sessions at EAGxBerlin: some (especially those new to investing) found it quite helpful, while others (including some with more investing expertise) were more critical for various reasons.
I don't know enough about SageWealth's products to have an informed opinion. Just from reading through the EAGx feedback and comments here, I do share the general sentiment that EA spaces like EAGx or this forum are probably not good places to advertise such products (unless you want to be totally truth-seeking and also include all the reasons against buying the product, alternative products from other companies etc).
I do agree that many EAs probably underestimate the benefits of investing their savings (rather than just leaving them sitting idle in a bank account), and I appreciate Marco & SageWealth for bringing attention to this! I'm also glad to have Marco & Sana in the EA community - I've met them often at various events around Berlin, had many interesting conversations and always enjoyed their company :)
This is helpful, thanks!
I notice you didn't mention fundraising for AI safety.
Recently, many have noted that the funding bar for AI safety projects has risen quite a bit (especially for projects not based in the Bay and not already well connected to funders), and that response times from funders such as EA Funds' LTFF can be very long (median 2 months afaik). This suggests we should look for additional funding sources such as new high-net-worth donors, governments, non-EA foundations etc.
Do you have any thoughts on that? How valuable does this seem to you compared to your ideas?