TL;DR: I created an example discussion space on Kialo for claims/ideas about EA community building, the idea being that community builders could share ideas and arguments via such structured discussion platforms. This could include monthly "epistemic jam sessions." Does this seem like something that could be valuable? Are there better alternatives?

 

Intro

While reading this post early last month I wondered: would it be helpful if there were some kind of virtual space for sharing claims and ideas about community organizing in a structured way—as well as arguments for/against those claims and ideas?

This could include claims or recommendations such as:

  • “University organizers should be prepared to directly appeal to existential/catastrophic risk reduction before trying to push longtermism or even EA (as opposed to following the more-traditional pitch/pathway of ‘EA => longtermism => existential risk reduction’)”
  • “University organizers should prioritize running fellowships that require applications over open-to-anyone reading groups and other ad-hoc events.”
  • “Intro groups/fellowships at universities should not have ‘required readings.’”


 

Toy example on Kialo

I went ahead and created a toy example of what such a structured discussion could look like on Kialo, with some example claims and a few arguments for/against those claims (though nothing in as much depth as I would expect from an actual discussion); I’d be very interested in people’s initial impressions.


 

Briefly: why I like structured discussion (e.g., Kialo)

In short, my support for platforms/methods such as Kialo is largely driven by my belief that they do a better job of aggregating, summarizing, and organizing claims and arguments from a group of people than, e.g., comment sections, with their messier linear nesting and lack of emphasis on “one block/thread = one claim/argument” (or similar forms of concept itemization).

Thus, I suspect that structured discussions would make it (1) easier for readers to discover relevant ideas/considerations and (2) easier for observers to contribute (in part because of the benefits from (1) but also because it is easier or more acceptable to make one-off contributions to a specific portion of the debate).

It is also possible to provide a sense of community/crowd evaluation and argument importance through voting (similar to karma systems, but with more granularity, since you can see the distribution of 0–4 ratings rather than just a summary statistic). Voting on Kialo can also be turned off or restricted to certain participants if people deem it unhelpful.


 

Optionally: “Epistemic jam sessions”?

Building on this idea, I also wondered if it might be good to designate some 1–3 day period each month as a focal/Schelling point for community organizer participation. This could also come with some non-binding/optional goals laid out in advance (e.g., “multiple organizers have sought to get a better sense of how to improve outreach/success at lower-prestige universities,” “the head organizers at X and Y universities want to hear about others’ experiences/advice regarding outreach to STEM groups”), although this would not be necessary. Perhaps you could call these “epistemic jam sessions”—although I’m totally open to better name ideas. Regardless, these discussions would presumably be open to contributions at any time; it’s just that certain days in a month and/or topics could be emphasized to improve coordination.


 

“Wouldn’t this require moderators?”

The short answer is “I would recommend having moderators, but this shouldn’t be a serious constraint”: beyond obvious spam/trolling moderation, moderators on Kialo play an important role in organizing arguments (e.g., grouping similar arguments), removing or merging duplicative arguments, cross-linking arguments where applicable, flagging claims that need clarification or support/sources, etc. 

Once people become familiar with Kialo, the amount of moderation and structuring required per participant should decrease, but there will still tend to be new people and opportunities for optimization. In fact, one of the additional strengths (in my view) of Kialo is the partial division of labor between “people who know a lot about the subject” and “moderators who know how to organize/structure contributions (and remove duplicates).” (Consider for comparison the funded contest for AI safety argument “distillation.”)

Ultimately, I would have to think more deeply to give an estimate I’m more comfortable with, but in broad terms, I definitely don’t expect that moderation constraints would be a major issue. I would happily spend at least 5–10 hours a week[1] moderating if I thought such work was valuable, and depending on the quantity and quality[2] of the crowd contributions, 3–15 hours of moderation per week should be sufficient.[3]


 

Conclusion/questions

Ultimately, platforms/formats such as Kialo tend to strike me as a potential way of meaningfully improving discussions between groups of people relative to traditional formats such as forum posts and their comment sections. Additionally, there seems to be some decent agreement that community building is really important (especially now while EA faces talent constraints in areas such as AI safety). Thus, I’d love to hear your thoughts:

  1. Is there actually a problem that needs solving when it comes to improving idea sharing and discussion among community organizers?
    • What tools/approaches are already being used to engage in such discussions? (Is it primarily just informal media like the EA Forum, Slack channels, and direct conversations?)
  2. Does something like Kialo seem like it would be effective for improving idea sharing and discussion among community organizers?
    • Do you have any recommendations for alternative platforms or methods (i.e., ones not currently in use, other than Kialo)?
  3. Would something like this be valuable for discussions outside of community organizing (e.g., AI safety/governance debates)?
  1. ^

    Caveat: this would be subject to change depending on my combined employment and academic enrollment status, which currently is well below full-time equivalent. I would probably still happily contribute at least 5 hours if my status changed to full-time equivalent, provided that I viewed my contributions as helpful.

  2. ^

    In hindsight, I think that a substantial portion of my time and frustration in moderating on Kialo in college was from a few disruptive trolls/conspiratorialists, perhaps even following something like a power law or 80-20 distribution (% of time spent on % of participants). Crucially, I think that the EA community will probably have far fewer of these people.

  3. ^

    I do strongly recommend, however, that the moderation is done via multiple people, so that there is some diversity of thought and the ability to get second or third opinions on some decisions. For example, there could be 2 head moderators who each spend ~3 hours per week moderating, and ~3 assistant moderators that each spend 1–2 hours per week moderating.

Answers

I just finished this synthesis discussion matrix for 6x6x6 participants. There should be questions and resources one 'cannot go wrong' with, in combination with each other and with EA-related background (specified for each resource/question). It can help with developing an elevator pitch that is broadly likeable (everyone contributes to a discussion in a group of 6, which selects a representative for another group of 6, which selects one of 6 representatives who answers a broad question; the last discussion can be attended by all participants). Maybe you can be inspired.

What do you mean by curated? Preventing poor messaging through practice? How would you make this practice relevant to the person's environment, though? An EA pitch can emphasize various aspects to be best received by different people (while maintaining high fidelity to the fundamentals).

To offer a perspective on your questions:

  1. No, there is no need to improve the epistemics of organizers or organize knowledge sharing directly, because community organizers have to carry the persona of really caring about their group, 'making sure it is the best one,' and not really just 'employing members for overall efficiency purposes.' Plus, there are not so many crucial considerations to discuss regarding community building; it is more 'what types of jokes do your members like - you should use similar jokes' to write expressively. - I think there is regional organization and there are channels where maybe organizers could be seen as trying to impress Community Building Grant (and similar grant) funders. This implies limited international cooperation, relatively low involvement of organizers in thought development, and a lack of structure and incentives for specialization and coordination. So, actually, yes, there are problems that need solving (but I may be biased).
  2. Something like Notion? Airtable? Something that allows you to group ideas and see more detail if you want. Unless you think the most important thing is that people meet and discuss - that when you type you do not reach agreements or decide on further cooperation nearly as much - then Gather (gather.town) or other virtual-place discussion platforms can be used. Notion is used for the in-depth program, Airtable for some forms by some groups, and Gather by SERI (and possibly others) - the LessWrong garden was cool.
  3. This is so cool, probably much better than Notion etc. - the art there would be to keep sound epistemics while allowing for independent ideation. Maybe clusters of users discuss, but from time to time they reconvene for a reflection on 'are we doing the most good with our spare resources,' 'are we using reason, not emotion,' 'do we have the Scout mindset,' 'do we have evidence,' 'how can we mitigate our biases,' etc. - but I feel that I am not coming up with anything too innovative. Rather than users, I would be networking memes (which may network with users), because it allows for better thought development?

I’m not sure how I never saw this response (perhaps I saw the notification but forgot to read), but thank you for the response!

I’m not familiar with the 6x6x6 synthesis; would it not require 216 participants, though? (That seems quite demanding.) Or am I misunderstanding? (Also, the whole 666 thing might not make for the best optics in light of, e.g., cult accusations, lol)

I’m not sure what you’re referring to regarding “curated,” but if you’re referring to the collection of ideas/claims on something like Kialo, I think my point was just that you can have moderators filter out the ideas that seem clearly uninteresting/bad, duplicative, etc.

brb243 (9d):
ok - yes, it is 5^3 (if you exclude a 'facilitator')... yes, although some events are for even more people. Hm... but filtering can bias/limit innovation and motivate by fear rather than support (further limiting critical thinking)? ... this is why overall brainstorming while keeping EA-related ideas in mind can be better (even initial ideas (e.g., even those that are not cost-effective!) can be valuable, because they support the development of more optimal ideas) - 'curation' should be exercised as a form of internal complaint (e.g., if someone's responsiveness to feedback is limited - 'others are offering more cost-effective solutions and they are not engaging in a dialogue') - this could be prevented by great built-in feedback-mechanism infrastructure (and addressed by some expert evaluation of ideas, such as via EA Funds, which already exist). Duplicative ideas should be identified - even complementary ideas. Then, people can (1) stop developing ideas that others have already developed and do something else, (2) work with others to develop these ideas further, or (3) work with others with similar ideas on projects.
Harrison Durland (9d):
re: "filtering", I really was only talking about "clearly uninteresting/bad" claims—i.e., things that almost no reasonable person would take seriously even before reading counterarguments. I can't think of many great examples off the top of my head—and in fact it might rarely ever require such moderation among most EAs—but perhaps one example may be conspiracy-theory claims like "Our lizard overlords will forever prevent AGI..." or non-sequiturs like "The color of the sky reflects a human passion for knowledge and discovery, and this love of knowledge can never be instilled in a machine that does not already understand the color blue." In contrast, I do think it would probably be a good idea to allow heterodox claims like "AGI/human-level artificial intelligence will never be possible"—especially since such claims would likely be well-rebutted and thus downvoted. Yes, de-duplication is a major reason why I support using these kinds of platforms: it just seems so wasteful to me that there are people out there who have probably done research on questions of interest to other people but their findings are either not public or not easy to find for someone doing research.
brb243 (8d):
yes, that is the thing - the culture in EA is key - overall great intentions, cooperation, responsiveness to feedback, etc. (alongside EA principles) can go a long way - well, ok, it can also be training in developing good ideas by building on the ongoing discourse: 'you mean, like, if animals with relatively limited (apparent) cognitive capacity are in power, then AGI can never develop?' or 'well, machines do not need to love knowledge; they can feel indifferent or dislike it. plus, machines do not need to recognize blue to achieve their objectives' - this advances some thinking. the quality of arguments, including those about crucial considerations, should be assessed on their merit in contributing to good idea development (impartially welfarist, unless something better is developed?). yes, but the de-duplication is a real issue. with the current system, it seems to me that there are people thinking in very similar ways about doing the most good, so it is very inefficient