
Here are some bullet points of reflection topics around lifestyle and priorities for EAs that I shared with some fellow EAs a few months ago. I'm sharing them here in case they interest anyone. I'll elaborate and expand on them later if I have the opportunity.

""" Support Systems: Seriously. I didn't even know this term until after all this happened, and it would have changed everything. There's something about how people are instructed in STEM institutions (and as a consequence, many EA institutions) that makes it all about careers, how one's impact is understood by their public professional life. And then it turns out that in reality a lot of the most publicly impactful people have these incredibly beautiful family and fraternity systems that were at the core of everything they've done, that never get talked about. Too many yang, public, external, wikipedia-worthy archetypes of impact. It would be really awesome if every youngling EA-in-training knew that having strong and abundant support systems, investing in true family and friends, investing in intimacy, figuring out relationships, being connected to non-EAs... that this sort of thing might be not a distraction from impact but a foundation for impact.

Something something about impact theory: I don't know, there's something about EA theory that wants so badly to make it convincing that being an EA is the most important thing to do that, somewhere in all the moral arguments, it takes way too many shortcuts. By taking shortcuts to force the conclusion that being an EA is the right moral thing to do, you are forced to ignore and push under the rug all forms of impact that don't currently fit well into EA career stories and don't have a legible trace connecting them to an EA. I don't really know how to solve this. If I were to give any pointers, here's what first comes to mind:

-- Legibility: there's a serious expectation that impact has to be legible. This is baked into the EA foundation. Unfortunately, in the real world, there are probably more illegible impactful actions than legible ones. Sure, we've been adding footnotes to EA material about this, but this is not a small thing that can be addressed separately from the rest of the decision-making. It truly affects the foundation on which the majority of EA arguments rest. One has to be able to make decisions in the world while incorporating and accepting the fact that the majority of impact is fundamentally illegible, made by people you won't get to know personally, and that sometimes public information and public consensus about events can be pretty irrelevant when it comes to understanding and planning on the ground.

-- Argumentation: there's an expectation that truth is found by finding the best arguments. This is true in all the cases where it is true, except in all the cases where it isn't. This stems from the point above: arguments rely on legible, shared-knowledge facts, and so much of what decides what happens in daily life is far removed from that. Simplifications are incredibly robust in some cases and incredibly illusory in others. Obviously we don't want to abandon arguments, but rather to grow beyond them.

-- Curiosity and connection: The majority of good human beings are not EAs! What are they all truly up to?"""

(My Facebook and Instagram accounts have been suspended without explanation. Hopefully they will be restored soon. If anyone reading this wants to reach me in the meantime, please use other means.)
