June 2022



Quick takes

No, longtermism is not redundant

I'm not keen on the recent trend of arguments that persuading people of longtermism is unnecessary, or even counterproductive, for encouraging them to work on certain cause areas (e.g., here, here). This is for a few reasons:

* It's not enough to believe that extinction risks within our lifetimes are high, and that extinction would constitute a significant moral problem purely on the grounds of harms to existing beings. Arguments for the tractability of reducing those risks, sufficient to outweigh the near-term good done by focusing on global human health or animal welfare, seem lacking in the arguments I've seen for prioritizing extinction risk reduction on non-longtermist grounds.
* Take the AI alignment problem as one example (among the possible extinction risks, I'm most familiar with this one). I think it's plausible that the collective efforts of alignment researchers and people working on governance will prevent extinction, though I'm not prepared to put a number on this. But as far as I've seen, there haven't been compelling cost-effectiveness estimates suggesting that the marginal dollar or work-hour invested in alignment is competitive with GiveWell charities or interventions against factory farming, from a purely neartermist perspective. (Shulman discusses this in this interview, but without specifics about tractability that I would find persuasive.)
* More importantly, not all longtermist cause areas concern risks that would befall currently existing beings. MacAskill discusses this a bit here, including the importance of shaping the values of the future rather than (I would say "complacently") supposing things will converge towards a utopia by default. Near-term extinction risks do seem likely to be the most time-sensitive thing that non-downside-focused longtermists would want to prioritize.

But again, tractability makes a difference, and for those who are downside-focused, there simply isn't this convenient convergence between near- and long-term interventions. As far as I can tell, s-risks affecting beings in the near future fortunately seem highly unlikely.
I think it's possible there's too much promotion on the EA Forum these days. There are lots of posts announcing new organizations, hiring rounds, events, or opportunities. These are useful but not that informative, and they take up space on the frontpage. I'd rather see more posts about research, cause prioritization, critiques and red teams, and analysis. Perhaps promotional posts should be collected into a megathread, the way we do with hiring.

In general it feels like the signal-to-noise ratio on the frontpage is lower now than it was a year ago, though I could be wrong. One metric might be number of comments: right now, 5/12 posts I see on the frontpage have 0 comments, and 11/12 have 10 comments or fewer.
PSA: Only about 4% of the world's population uses the date format month/day/year. (Pablo, 2y ago)
About going to a hub

A response to: https://forum.effectivealtruism.org/posts/M5GoKkWtBKEGMCFHn/what-s-the-theory-of-change-of-come-to-the-bay-over-the

For people who consider taking or end up taking this advice, some things I'd say if we were having a 1:1 coffee about it:

* Being away from home is by its nature intense, this community and its philosophy are intense, and some social dynamics here are unusual. I want you to go in with some sense of the landscape so you can make informed decisions about how to engage.
* The culture here is full of energy, ambition, and truth-telling. That's really awesome, but it can be a tricky adjustment. In some spaces you'll hear a lot of frank discussion of talent and fit (e.g., people might dissuade you from starting a project not because the project is a bad idea but because they don't think you're a good fit for it). Grounding in your own self-worth (and your own inside views) will probably be really important.
* People both are and seem really smart. It's easy to just believe them when they say things. Remember to flag for yourself things you've just heard versus things you've discussed at length versus things you've really thought about yourself. Try to ask questions about the gears of people's models; ask for credences and cruxes. Remember that people disagree, including about very big questions. Notice the difference between people's offhand hot takes and their areas of expertise. We want you to be someone who can disagree with high-status people, who can think for themselves, and who is in touch with reality.
* I'd recommend staying grounded with friends, connections, and family outside the EA space. Making friends over the summer is great, and some of them may be deep connections you can rely on, but as with all new friends and people, you don't have as much evidence about how those connections will develop over time or with any shifts in your relationships or situations. It's easy to get really attached and connected to people in the new space, and that might be great, but I'd keep track of your level of emotional dependency on them.
* We use the word "community," but I wouldn't go in assuming that if you come on your own you'll find a waiting, welcoming, pre-made social scene, or that people will have the capacity to proactively take you under their wing and look out for you and your well-being, especially if there are lots of people in a similar boat. I don't want you to feel like you've been promised anything in particular here. That might be up to you to make for yourself.
* One thing that's intense is the way the personal and professional networks overlap, so keep that in mind as you think about how you might keep your head on straight, and what support you might need if your job situation changes, you have a bad roommate experience, or you date and break up with someone (maybe get a friend's take on the EV of casual hookups or dating during this intense time, given that the emotional effects might last a while and play out in your professional life; you know yourself best and how that might play out for you).
* This might be a good place to flag that just because people are EAs doesn't mean they're automatically nice or trustworthy. Pay attention to your own sense of how to interact with strangers.
* I'd recommend reading this post on power dynamics in EA.
* Read C.S. Lewis's "The Inner Ring."
* Feeling lonely, ungrounded, or uncertain is normal. There is lots of discussion on the forum about people feeling this way and what they've done about it. There is an EA peer support Facebook group where you can post anonymously if you want. If you're in more need than that, you can contact Julia Wise or Catherine Low on the community health team.
* As per my other comment, some of this networking is constrained by capacity. Similarly, I wouldn't go in assuming you'll find a mentor or office space or all the networking you want. By all means ask, but also give people the affordance to say no, and respect their time and professional spaces and norms. Given the capacity constraints, I wouldn't be surprised if weird status or competitive dynamics formed, even among people in a similar cohort. That can be hard.
* Status stuff in general is likely to come up; there's just a ton of the ingredients for feeling like you need to be in the room with the shiniest people and impress them. That seems really hard; be gentle with yourself if it comes up. Avoiding it would be great, though, and I think that happens via emotional grounding, cultivating the ability to figure out what you believe even when high-status people disagree, and keeping your eye on the ball.
* This comment, this post, and even the many other things you can read are not all the possible information. This is a community with illegibility like any other; people all theoretically interacting with the same space might have really different experiences. See what ways of navigating it work for you, and if you're unsure, treat it as an experiment.
* Keep your eye on the ball. Remember that the goal is to make incredible things happen and help save the world. Keep in touch with your actual goals, maybe by making a plan in advance of what a great time in the Bay would look like, and what would count as a success and what wouldn't. Maybe ask friends to check in with you about how that's going.
* My guess is that having or finding projects and working hard on them, or on developing skills, will be a better bet for happiness and impact than a more "just hang around and network" approach (unless you approach that as a project: trying to create and develop models of community building, testing hypotheses empirically, etc.). If you find that you're not skilling up as much as you'd like, or not getting out of the Bay what you'd hoped, figure out where your impact lies and do that. If you find that the Bay has social dynamics and norms that are making you unhappy and limiting your ability to work, take care of yourself and safeguard the impact you'll have over the course of your life.

We all want (I claim) EA to be a high-trust, truth-seeking, impact-oriented professional community and social space. Help it be those things. Blurt truth (but be mostly nice), have integrity, try to avoid status and social games, make shit happen.