Xylix

Executive Director @ Effective Altruism Finland
46 karma · Joined · Helsinki, Finland

Bio


Aspiring rationalist. Trying to do my best to pay the world back for all the good it gives me.


EA Finland Executive Director starting in September 2025.

I'm curious about problems of consciousness, moral status, and technical AI safety. But currently focused on community building.

How others can help me

I'm looking for a mentor.

Also looking for effective ideas for local community growth.

Posts (1)

Comments (17)

The role of guilt and perfectionism in EA, and how EA as an environment focused on efficiency and on doing the very best we can may lead to difficult mental hang-ups and be more demanding than more traditional ways of doing good. (Traditional ways of doing good are often focused more on feeling good and feeling altruistic, which is useful for the good-doer's wellbeing but suboptimal for the actual amount of good done.)

Positivity-focused ethics! The imbalance between negatively biased and positively biased versions of utilitarianism, and the implications of this for evaluating policy ideas and the medium-term future.

Any hints / info on what to look for in a mentor / how to find one? (Specifically for community building.)

I'm starting as a national group director in September, and among my focus topics for EAG London are group-focused things like "figuring out pointers / out-of-the-box ideas / well-working ideas we haven't tried yet for our future strategy", but also trying to find a mentor.

These were some thoughts I came up with when thinking about this yesterday:
 - I'm not looking for accountability or day-to-day support; I get that from inside our local group.
 - I am looking for someone who can take a description of the higher-level situation and see different things than I can, either due to differences in perspective or because they are more experienced and skilled.
 - Also someone who can give me useful input on which skills to focus on building in the medium term.
 - Someone whose skills and experience I trust, so that when they say "the plan looks good" it gives me confidence; this matters most when I'm trying to do something that feels to me like a long-shot / weird / difficult plan and I specifically need validation that it makes sense.

On a concrete level, I'm looking for someone to have roughly monthly 1-on-1 calls with, plus some asynchronous communication: not about common day-to-day stuff, but about larger decisions.
 

Same here; I only had ~800 mana free, but I wouldn't have thought to donate it otherwise, and it only took a minute.

Regarding missing gears and old books: I have recently been thinking that many EAs (myself included) have a lot of philosophical / cultural blind spots (one example might be postmodernist philosophy). It's really easy to develop a kind of confidence, with narratives like "I have already thought about philosophy a lot" (when it has mostly been engagement with other EAs, in discussions facilitated on EA terms) or "I read a lot of philosophy" (when it's mostly EA books and EA-aligned / utilitarian / longtermist papers and books).

I don't really know what the solutions to this are. On a personal level, I think perhaps I need to read more old books, or participate in reading circles where non-EA books are read.

I don't really have enough understanding of liberalism to agree or disagree with the claim that EA is engaged with mainstream liberalism, but I would agree that EA as a movement has a pretty hefty pro-status-quo bias in its thinking, and quite often especially in its actions. (There is an interesting contradiction here, though, in EA views often being pretty anti-mainstream, such as its thought on AI x-risks, longtermism, and wild animal welfare.)

FWIW, I don't know why you're being disagree-voted; I broadly agree. I think the amounts of money at play here are enough to warrant an investigation, even with a low probability of uncovering something significant.

I disagree that paying the money back is obviously the right thing to do. The implications of "pulling back" money whenever something large and shady surfaces would be difficult to handle, and it would be costly. (If you are arguing that the current case is special, and that in future cases of alleged / proven financial crime we should evaluate case by case, then I am very interested in what the specific argument is.)

I would, however, consider looking into options for vetting the integrity of big donors in the future to be the right thing to do.

Another approach could be to be more proactive about taking funding assets in advance, liquidating them, and holding them in fiat (or another stable currency): e.g. asking big, highly EA-sympathetic donors to fund very long funding periods at once, if at all possible.

Although your argument may make a more convincing case for funders to fund, since the money will actually be spent quickly.

Polymarket question on whether Binance will cancel the FTX bailout deal: https://polymarket.com/market/will-binance-pull-out-of-their-ftx-deal (The question is phrased in reverse relative to some other markets.)
