
The EA Forum and LessWrong obviously share a lot of features, members, and topics of interest. I've recently started posting to both, and find myself often wondering what sorts of posts should be posted here, vs posted there, vs cross-posted to both. What are people's thoughts on that?

One specific subquestion: how often, and under what conditions, is cross-posting to both a good idea? Cross-posting seems relatively rare, even for great posts that seem like they'd be of interest to both forums' communities. This makes me worry that regularly cross-posting would be spammy. But anyone who already saw the post on the other forum can simply not click, so maybe that worry is silly?

Another specific subquestion is how much overlap there actually is between the two sites' memberships. Does anyone know, or have a vague sense of what the proportion might be?

I briefly searched for relevant prior discussion, but only really found this from 5 years ago:

I expect that at the new [EA] forum, as on the effective altruist Facebook and Reddit pages, people will want to discuss which intellectual procedures to use to pick effective actions. I also expect some proposals of effective altruist projects, and offers of resources. So users of the new forum will share LessWrong's interest in instrumental and epistemic rationality. On the other hand, I expect that few of its users will want to discuss the technical aspects of artificial intelligence, anthropics or decision theory, and to the extent that they do so, they will want to do it at LessWrong.

2 Answers

I'm not totally familiar with LW's content rules, but as for the Forum: You can post anything that follows our rules (don't be mean, don't hurt people, promote good discourse). 

At present, we don't categorize posts as "Frontpage" or "Community" unless they have some clear relevance to EA, but that can be as easy as taking your post about "how to think good" and adding a few sentences at the beginning to explain its relevance to one or more issues/areas/open questions within EA.

Maybe it's not the best answer, but what I've been doing is mostly posting to LW/AF and mostly only posting to EAF for things that are very strongly EA relevant, as in so relevant to EA I would have posted them to EAF if LW didn't exist. I don't have a consistent policy for cross-posting myself, other than that I only cross-post when it feels particularly likely that the content is strongly relevant to both communities independent of the shared aspects of the two sites' cultures.

Comments (4)

Not an answer to your question, but I think it would be nice to have a consolidated comments thread for posts that are cross-posted to both forums. At the very least, it would be an informative experiment. I'm not sure how technically challenging this would be, but since the EA Forum is based on the LW codebase, I'd imagine it shouldn't be that difficult.

On the other hand, I expect that few of its users will want to discuss the technical aspects of artificial intelligence, anthropics or decision theory

It's fun to see how different the EA Forum (and maybe the community as a whole?) is from 6 years ago, since these days all three topics seem like fair game.

(In my specific case, I'm currently mostly writing about stuff to do with good decision-making practices - which seems more LessWrong-y - but in the context of moral uncertainty, or with lots of altruism-related examples - which seems more EA Forum-y. But I'm also curious about these where-to-post questions more generally.)

Moral uncertainty material definitely fits the EA Forum, and so do posts about applying general decision-making practices to altruism (we have lots of those on the Forum already). We even have a good number of posts written by people in the EA-sphere that would be equally at home on the Forum or LW (one example).
