Markus Amalthea Magnuson

Founder @ Altruistic Agency
Working (15+ years of experience)


How others can help me

I'm looking for ambitious funding for several EA organisations in the tech and governance spaces.

How I can help others

I'm an experienced software engineer with a focus on full-stack web development and have worked with websites for 15+ years. Anyone who needs anything in that area is welcome to chat; it's also what my organisation Altruistic Agency offers for free.


For the same reason that e.g. net electricity generation from fusion power is not the "number one single factor debated in every single argument on any economic/political topic with medium-length scope": Until it exists, it is fictional – why should everyone focus so much on fictional technology? It remains a narrow, academic field. The difference is that there is actual progress towards fusion.

I don’t have a view on that, but it would be cool if it were available as a forum setting (“Weight votes by account age”), and some people might like it better that way.

I plan to post my reports on LessWrong and the Effective Altruism forum.

Why would posting mainly in these tiny communities be the best approach? First, I think these communities are already far more familiar with the topics you plan to publish on than the average reader. Second, they are – as I said – tiny. If you want to be a public intellectual, I think you should publish where public intellectuals generally publish. This is usually a combination of books, magazines, journals, and your own platforms (e.g. personal website/blog, social media, etc.).

You could probably improve on your plan by making a much more in-depth analysis of what your exact goals are and what your exact audiences are. It seems to me a few steps are missing in this statement:

I believe such people provide considerable value to the world (and specifically to the project of improving the world).

What would probably be useful is, in a sense, a theory of change for how doing the things you want to do leads to the outcomes you want.

If you do decide to go ahead with this plan, I would also focus a lot on this part:

In contrast, I am quite below average on conscientiousness and related traits like diligence, perseverance, willpower, "work ethic", etc.

You are going to need those in the massively competitive landscape you aim for.

If you speak to a stranger about your worries of unaligned AI, they'll think you're insane (and watch too many sci-fi films).

I'm not so sure this is true. In my own experience, a correct explanation of the problem with unaligned AI makes sense to almost everyone with some minimum of reasoning skill. Although this is anecdotal, I would not be surprised if an actual survey among "strangers" would show this too.

Commenting on your general point, I think the reason is that most people's sense of when AGI could plausibly happen is "in the far future", which makes it psychologically unremarkable at this point.

Something like extinction (even if literal extinction is unlikely) from climate change, although possibly further off in time, might feel closer to a lot of people because climate change is already killing people.

You would essentially be a freelancer. With that framing, there are plenty of resources out there on how to build a life as a freelancer. For an EA-specific perspective, here’s a good starting point:

This is very exciting and has huge potential. Please get in touch with the Altruistic Agency for tech needs (e.g. a website) when you are at that point; I'd love to help.

Since sociology is probably an underrepresented degree in effective altruism, maybe you can consider it a comparative advantage rather than "the wrong degree". The way I see it, EA could use a lot more sociological inquiry.

I'm aware of this, and frankly, it raises more questions than it answers. For example, I wonder what the terms were when what was originally a grant to a non-profit turned into (?) an investment in a for-profit.
