
The EA Forum Team is working on building subforums. Our vision is to build more specific spaces for discussion of topics related to EA, while still allowing content to bubble up to the Frontpage.

The user problem we are thinking of targeting is: As someone who cares about X, I find it hard to find posts I’m interested in, and I don’t know if the EA Forum is a useful place for me to post ideas or keep up with X.

Would solving this be useful for you? If so, what spaces, topics, or causes would you be interested in seeing such discussion spaces for?

 

Note: if you've seen our current bioethics and software engineering subforums, these are not the final state of the feature; we are still designing and iterating.

2 Answers

Government, policy, foreign affairs. 

I need to be able to ask questions and introduce topics related to AI policy without worrying about risks from uninformed people misinterpreting, or even misusing, the information. This is a very serious problem on the EA Forum and LessWrong.

Yes, and I would love a meta-research/rationality/socioepistemology subforum!

It would be useful because it might get easier to find and talk to the people who are especially interested in the topic. And because posts on the frontpage get eaten by the time window in a swish and a swosh, they rarely lead to lasting discussion and people feel like they have to hurry (which is partly positive, partly negative).

The tag system would work just the same if people used it (I would actually prefer it), but since tag usage is inconsistent, people can't rely on it.[1]

  1. ^

    Solutions to inconsistent tag usage could include preventing people from clicking the "post" button before they've added at least one tag, paying someone to manually tag posts, or having an AI suggest tags, etc.
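
As a rough sketch of the first option only (hypothetical code; `Draft` and `canPublish` are made-up names, not the Forum's actual implementation), a submit-time check could simply refuse to publish until at least one tag is attached:

```typescript
// Hypothetical sketch: block publishing until the draft has at least one tag.
// "Draft" and "canPublish" are illustrative names, not the Forum's real code.

interface Draft {
  title: string;
  body: string;
  tagIds: string[];
}

// Returns whether the draft may be published, plus a message to show next to
// a disabled "post" button when it may not.
function canPublish(draft: Draft): { ok: boolean; reason?: string } {
  if (draft.tagIds.length === 0) {
    return { ok: false, reason: "Add at least one tag before posting." };
  }
  return { ok: true };
}

// Example: an untagged draft is blocked. The same check could also run
// server-side so the rule can't be bypassed from the client.
const draft: Draft = { title: "My post", body: "…", tagIds: [] };
const check = canPublish(draft);
if (!check.ok) {
  console.log(check.reason); // "Add at least one tag before posting."
}
```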

(There are more arguments for why subforums would be cool, but I shouldn't be spending time elucidating them right now unless anyone's curious >.<)

3 Comments

The obvious idea here is to have different subforums grouped by cause area -- AI risk, global health/development, animal welfare, etc.  I agree that this is probably the best way to dice things up.  But here are some other options that might be worth considering:

  • Separate subforums for "news events, announcements, and org updates" (posts about things that are actually happening in the world, like an EA newspaper), versus "academic-paper-style research" (a fancy, dignified arXiv-like subforum for academics and advanced discussion, like the AI Alignment Forum versus LessWrong), versus more casual & exploratory online discussion and community stuff (like LessWrong versus the Alignment Forum).  Although you'd want to be careful about this -- perhaps right now, the online discussion and academic research are "cross-subsidizing" the announcements and org updates, such that nobody would read the announcements and org updates if they were split off into their own thing.  (Or vice versa!  My point is that for all subforum ideas, it is important to understand the dynamics here and what is "cross-subsidizing" what, in an attention-economy sense.)
  • Taking another page out of the Alignment Forum's playbook, consider just going fully elitist and creating a subforum for 500+ karma users or something, plus maybe including the option to create private posts that aren't publicly viewable if you're not logged in.  There are several potential goals here:
    • To separate "EA 101" discussion, where we want to have a welcoming attitude toward newcomers, from a space for more advanced discussion and debate.  (See, e.g., me making an overly harsh comment recently because I didn't realize that a user was new to the Forum!)
    • To sequester community drama in a place where it won't waste the attention of people who aren't heavily involved in the community (similar to how the "community" tag is currently used to down-weight some posts).
    • Most importantly, to make a play for capturing some of the energy that goes into the vast ecosystem of privately shared Google Docs... creating a semi-public/semi-private forum for highly engaged EAs (or highly engaged EAs within a certain cause area, etc.) could offer a middle ground between posting on the normal Forum and sharing a doc through a bunch of personal social connections.
  • Create a Spanish- or German-language subforum, to help support the growth of EA communities in other regions of the world.  Naturally this would start out small, with infrequent posting.  But that is okay -- the EA Forum itself was once small!
  • Finally, re-upping an idea of mine from the Dank EA Memes Facebook group.

Thank you for your take; I very much appreciate that there are tradeoffs to any direction in which we split things.

If you go with subforums, I would endorse ensuring that posts can be cross-posted and their comments properly shared across the relevant subforums, or at least allowing such functionality as an option for the author.

Maybe you can rely on tags for this, though you'd want to ensure that author-added tags can only be removed by mods? I guess there's also a question of whether you want authors to be able to easily prevent their posts from being tagged and added to other subforums they don't want them in. And maybe they'll want to keep discussions separate in some cases, even if a post appears in multiple subforums.
