Unfortunately I'm sick and bowing out, but the meetup is still on!

To help you find the group, at least one person should be wearing a shirt with the heart-in-lightbulb logo of effective altruism, and there should be a decent turnout (~8-10 people?) based on RSVPs from the various platforms we advertise the event. The group may be in the upstairs portion of the venue.

I agree that if you're already bought into moral consideration for 10^umpteen future people, that's longtermism.

Yes. I think the commonsense priorities you list are even more beneficial in the longtermist view. Factors like "would this have happened anyway, just a bit later?" may still apply and reduce the impact of any given intervention. Then again, notions like "the sooner we start expanding, the more of the universe we can reach" could be an argument that sooner is better for economic growth.

The topics of working for an EA org and altruist careers are discussed occasionally in our local group. 

I wanted to share my rough thoughts and some relevant forum posts that I've compiled in this Google Doc. The main thesis is that, as far as I know, it's really difficult to get a job at an EA org, and most people will have messier career paths.

Some of the posts I link in the doc, specifically around alternate career paths:

The career and the community

Consider a wider range of jobs, paths and problems if you want to improve the long-term future

My current impressions on career choice for longtermists

Answer by KevinO

One takeaway, I think, is that things that already seem good under common sense are much more important in the longtermist view. For example, I think a longtermist would want extinction risk to be much lower than what you'd want from a commonsense view.

I believe that was discussed in the episode with Spencer. Search for 'threatened' in the transcript linked here.

00:22:30 Spencer Greenberg

And then the other thing that some people have claimed is that when Alameda had that original split-up early on, where some people in the effective altruism community fled, that you had somehow threatened one of the people that had left. What? What was that all about?

00:22:47 Will MacAskill

Yeah. I mean, so yeah, it felt pretty [unclear] when I read that because, yeah, I certainly didn't have a memory of threatening anyone. And so yeah, I reached out to the person who it was about, because it wasn't the person saying that they'd been threatened; it was someone else saying that that person had been threatened. So yeah, I reached out to them, and there was a conversation between me and that person that was kind of heated. But yeah, they don't think I was intending to intimidate them or anything like that. And it was also, in my memory, not about the Alameda blow-up; it was about a different issue.

Answer by KevinO

Keeping Absolutes in Mind - I think donating money is still somewhat underrated in discussions like this, though I was happy to see it brought up in several comments.

Consider taking the GWWC pledge, the TLYCS pledge (easier / more flexible), or some other pledge if you feel like that would help with keeping motivation up.

You could also organize or contribute to a local group. Regular local group attendance could also keep motivation up (and would be a lot less costly for your budget).


Even a small donor can make a real impact for individuals directly, or help get small or new projects off the ground.


Answer by KevinO

The Nonlinear Library podcast reads upvoted posts from the EA Forum, LessWrong, and the Alignment Forum with an AI voice (which isn't bad): Listen to more EA content with The Nonlinear Library
