Jeremy

643 · Vermont, USA · Joined Dec 2018


Credit this post for bringing the LW post to my attention.

To answer my own question, in case someone ends up here in the future wondering the same thing: there are some options for doing this.

I dug a bit deeper and tried a few out. Updating the original post with details.

This is super cool, thanks! Unless I'm mistaken, I don't see anything in there about excluding tags. That would probably make the query string too long anyway.
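For anyone landing here later: if the feed URL's query string can't exclude tags, one workaround is to filter the feed client-side before handing it to your reader. This is a minimal sketch using only the standard library; the tag names in `EXCLUDED_TAGS` and the assumption that tags appear as RSS `<category>` elements are illustrative guesses, not confirmed details of the forum's feed.

```python
# Hypothetical sketch: drop RSS <item> elements whose <category> matches an
# excluded tag, for feeds whose query string can't exclude tags itself.
import xml.etree.ElementTree as ET

# Illustrative tag names -- check the actual feed for the real ones.
EXCLUDED_TAGS = {"red teaming contest", "cause exploration prizes"}

def filter_feed(rss_xml: str, excluded: set) -> str:
    """Return the RSS document with excluded-tag items removed."""
    root = ET.fromstring(rss_xml)
    channel = root.find("channel")
    for item in list(channel.findall("item")):
        # Collect this item's tags (RSS <category> elements), normalized.
        tags = {c.text.strip().lower() for c in item.findall("category") if c.text}
        if tags & excluded:
            channel.remove(item)
    return ET.tostring(root, encoding="unicode")
```

You could run this in a small proxy script or cron job that fetches the feed, filters it, and serves the result to your RSS reader.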

I realized my "posts below X karma" idea wasn't actually coherent: every post starts with low karma, so, depending on how often my reader checks (and the cache length), it might just end up showing all posts.

LOL. I wonder how much of this is the red-teaming contest. While I see the value in it, the forum will be a lot more readable once that and the cause exploration contest are over. 

Sorry for the delay. Yes this seems like the crux.

It would be very surprising if there weren't others in a similar boat, except that, being somewhat more averse to longtermism and somewhat less appreciative of the rest of EA, the balance swings the other way and they avoid the movement altogether.

As you pointed out, there's not much evidence either way. Your intuitions tell you that there must be a lot of these people, but mine say the opposite. If someone likes the Givewell recommendations, for example, but is averse to longtermism and less appreciative of the other aspects of EA, I don't see why they wouldn't just use Givewell for their charity recommendations and ignore the rest, rather than avoiding the movement altogether. If these people are indeed "less appreciative of the rest of EA", they don't seem likely to contribute much to a hypothetical EA sans longtermism either.

Further, it seems to me that renaming/dividing up the community is a huge endeavor, with lots of costs. Not the kind of thing one should undertake without pretty good evidence that it is going to be worth it.

One last point: for those of us who have bought into the longtermist/x-risk stuff, there is the added benefit that many people who come to EA for effective giving, etc. (including many of the movement's founders) eventually do come around on those ideas. If you aren't convinced, you probably see that as somewhere on the scale of negative to neutral.

All that said, I don't see why your chapter at Microsoft has to have Effective Altruism in the name. It could just as easily be called Effective Giving if that's what you'd like it to focus on. It could emphasize that many of the arguments/evidence for it come from EA, but EA is something broader. 

There's not a way to filter stuff from the RSS feed, is there? Doesn't seem like it, but maybe I missed something.

I guess we can swap anecdotes. I came to EA for the Givewell top charities, a bit after that Vox article was written. It took me several years to come around on the longtermism/x-risk stuff, but I never felt duped or bait-and-switched. Cause neutrality is a super important part of EA to me and I think that naturally leads to exploring the weirder/more unconventional ideas. 

Using terms like "dupe" and "bait-and-switch" also implies that something has been taken away, which is clearly not the case. There is a lot of longtermist/x-risk content these days, but there is still plenty going on with donations and global poverty. More money than ever is being moved to Givewell top charities (I don't have time to look it up, but I would be surprised if the same weren't also true of EA animal welfare), and (from memory) the last EA survey showed a majority of EAs consider global health and wellbeing their top cause area.

I hadn't heard the "rounding error" comment before (and don't agree with it), but before I read the article, I was expecting that the author would have made that claim, and was a bit surprised he was just reporting having heard it from "multiple attendees" at EAG - no more context than that. The article gets more mileage out of that anonymous quote than really seems warranted - the whole thing left me with a bit of a clickbait-y/icky feeling. FWIW, the author also now says about it, "I was wrong, and I was wrong for a silly reason..."

In any case, I am glad your partner is happy with their charity contributions. If that's what they get out of EA, I wouldn't at all consider that being filtered out. Their donations are doing a lot of good! I think many people come to EA and stop there, and that's fine. Some, like me, may eventually come around on ideas they didn't initially find convincing. To me that seems like exactly how it should work.

That didn't come off as clearly as I had hoped. What I meant was that leading with x-risk may resonate for some people and longtermism for others. It seems worth having separate groups focused on each, to appeal to both types of people.

This sounds like a great idea. Maybe the answer to the pitching Longtermism or pitching x-risk question is both? 
