All of Severin's Comments + Replies

I don't understand how this is relevant to what I'm writing, as I don't intend to do mediation only for people who know AR or circling. But the number of upvotes indicates that others do understand, so I'd like to understand it, too. Jeroen, would you mind elaborating?

2
Jeroen Willems
8mo
Some people might not be a fan of AR or circling, so other methods of mediation should be considered too.

That's an excellent question!

For organization-internal mediations, I guess that's not a problem, because everyone within the org has an interest in the process going well?

One version I could think of for grievances between orgs/community members: having an EA fund or an E2Ger pay for all my gigs, so that I can offer them pro bono and have no financial incentive to botch the outcome.

Plus, I'll definitely want to build a non-EA source of income so that I'm not entirely financially dependent on EA.

Where do you see gaps in these ideas?

6
Jason
8mo
I'm assuming a somewhat looser standard than the norms for mediators generally, in light of the parties' presumed interest in an EA-associated mediator. However, in my view, the conflict standards for third-party neutrals are significantly higher than for just about any other role type, and rightfully so.

I think having an E2Ger as benefactor is probably the best practicable answer to conflicts, although you would inherit all the conflicts of any major benefactor. I would probably not try to mediate any matter in which a reasonable person might question the impartiality of any major (over 10-20%??) funder of your work. Hopefully, you could find an E2Ger without many conflicts?

If you're dependent on a fund for more than 10-20%, I think that conflict would extend to all the fund managers in a position to vote on your grants, and the organizations that employ them. So taking money from a fund would probably preclude you from working on matters involving many of the major organizations. In my view, a reasonable person could question whether a mediator could be impartial toward Org X when someone from Org X had a vote on whether to renew one of your major grants [or a vote on a major grant you intended to submit]. Some of that is potentially waivable where both parties to the dispute have approximately equal power, but I do not think it would be appropriate to waive the potential appearance of influence where a significant power imbalance existed in favor of the funder.

One challenge you'll want to think about is how to demonstrate your effectiveness to your funder(s) while maintaining confidentiality of the parties (unless you obtain a waiver from them to disclose information to the funder(s)).

Perhaps another consideration against this is that it seems potentially bad to me for any one person to be the primary mediator for the EA community. There are some worlds where this position is subtly very influential. I don't think I would want a single person/worldview to have that, in order to avoid systematic mistakes/biases.

Well, good that my values are totally in line with the correct trajectory for EA then!

No, but seriously: I have no idea how to fix this. The best response I can give is: I'd suspect that having one mediator is probably still better than... (read more)

Huh, sounds plausible. At the same time, it makes me wonder whether EA should imitate the corporate world less here. Wouldn't "Would it be high EV to have an EA insider with competence in this?" be a more relevant question than "Is this something that's already common and generally useful in the non-EA world?"

I guess the heuristic you point at is for avoiding vultures?

What would be cheap tests to determine if this would be valuable? 

Good prompt, thanks!

Mediation is a high-risk/high-reward activity, and I'd only want to work with EA orgs when I'm already sure that I can consistently deliver very high quality. So I've now started advertising mediation to private individuals on a pay-what-you-want basis to build the necessary skill and confidence. If this works out, I'll progress to NGOs in a couple of weeks.

The AuthRev and Relating Languages links look like nonsense to me.

I wince every time when I look at their homepages, way too... (read more)

4
Jeroen Willems
8mo
There are still a lot of young EAs that aren't into AuthRev and circling, so I think as a mediator it's important to take this into account.

Agreed!

Agree with everything.

Your friend sounds delightful! I think what I'm actually trying to point towards here is closer to "lifestyle anarchism" than to classic virtue ethics. Coincidentally, I found myself defaulting back to explaining my values in anarchist terms when I announced my career transition from active EA community builder to baby influencer in my first blog post.

I guess it's no coincidence that Rocky's "On living without idols" is my all-time favorite on the EA Forum.

4
quinn
10mo
Someone in Discord asked about local volunteering opportunities and I said "idk, for meatspace stuff impact isn't really the point", reiterated my comment to this post and mentioned fuzzies budgeting, then wrote the following:

I generally endorse any way of sampling from the population that disproportionately puts me in the room with sincere nonnihilistic noncynical nondefeatist people who have their heart in the right place, because it's at least plausible that the "hey, I have an action space here!" mental circuit is more important than how much they'd like to measure/multiply/maximize. The people you meet (at say food not bombs) will vary in their sympathies to the three Ms, some of them just haven't gotten the right invitation or haven't dedicated enough scrutiny to it yet and others will never in any circumstance get into it. But almost all of them will have observed something broken and decided not to seethe and cope about it because they were too busy rolling up their sleeves. That makes them precious to me.

Thanks! I'm still grappling with putting the intuitions behind this post into words, so this is valuable feedback.

Personally, my heuristic in the example you describe is rolling with what I feel like. Considerations that go into that are:

1. Will it kill me? (I'm allergic to red meat)
2. Would I be actively disgusted eating it? (The case for most if not all non-vegetarian stuff.)
3. Do I lack the spoons to have a debate about this, given the amount of pushback/disappointment I expect from the host?

...and when all of them get a "no":
4. Do I feel like my nutr... (read more)

awesome, looks good!

Oh dear. Well, there goes that bit of evidence out the window.

Strongly agree!

Actually, the seeds for a bunch of my current knowledge about and approach to community building were sown during various unconferences over the years.

The 2020 Unconference was my first in-person encounter with EA. My first contact with EA had been reading a bunch of 80k articles that didn't quite seem to have me as part of their target audience, so I was very positively surprised by how warm and caring and non-elitist the community was.

I learned to get these things out of EAG(x)s as well. But, had the fancy professional events been my ... (read more)

Yep - it reflects how many things in EA already work implicitly. That's one of the things I love about EA. And I think it would be good if we used this as an explicit model more often, too.

If you want to dive a little bit deeper into these kinds of management practices, you may want to have a look into the Reinventing Organizations-wiki: https://reinventingorganizationswiki.com/en/theory/decision-making/

If you want to dive very, very deep, Frederic Laloux's "Reinventing Organizations" might be a worthwhile read. I'm halfway through, and it helped me build a whole bunch of intuitions for how to do community building better.

Love it! That bit slipped my mind and seems like a super relevant addition. Thanks a lot.

My personal gold standard of good organizing is the Advice Process. Description by Burning Nest:

"The general principle is that anyone should be able to make any decision regarding Burning Nest.

Before a decision is made, you must ask advice from those who will be impacted by that decision, and those who are experts on that subject.

Assuming that you follow this process, and honestly try to listen to the advice of others, that advice is yours to evaluate and the decision yours to make."[1]

One of the problems the Advice Process tackles is what anarchist vision... (read more)

0
Amber Dawn
1y
Cool! I've never heard of this, and it does indeed sound like a good process.

Yep, expectation-setting like that is super valuable.

I also wrote a short facilitation handbook a couple of months ago. It's useful for meetups, workshops, and basically any other kind of work with groups. Optimizing for psychological safety is implicit in a bunch of things there.

Thanks! Yep, the "socials is all people want." is a bit of a hyperbole. In addition to the TEAMWORK talks, we also have the Fake Meat - Real Talk reading/discussion group dinners, and will have a talk at the next monthly social, too.

The one-day career workshops sound great, added to the to-do list.

Thanks! Yep, retreats like that are high-ish on the to-do list.

Helps in some situations, yea.

At the same time, in EA, having access to spare cash and potential for impact are not necessarily highly correlated. So, if this becomes the only solution, it might make a bunch of extremely high EV conversations just not happen.

Thanks! Yep, that is totally in line with the fact that the Karma score of the post here is much more mixed than on LessWrong, which definitely is an Askier sphere than EA.

Strong upvote!

I'm constantly putting some effort into automating information flows.

E.g., I asked an EA Berlin community member to write a how-to on finding housing in Berlin, because I get that question at least once a month.

If you have more ideas for how to automate such things, I'd be excited to read about them.

No hero worship at all intended, sorry if it came off like that. I agree with you that way too much of that happens in EA. Rockwell's "On living without idols" is by quite some distance my favorite piece on the EA Forum, and one of my favorite texts on all of the internet.

I'm one of the ~1% of EAs who have a natural tendency to ask for favors too leniently rather than too cautiously, so I would have appreciated knowing these things earlier. The core target audience of this post is people like me.

However, I do think the things I write here might be useful... (read more)

4
Stuart Buck
1y
Just a note: this post could have opposite advice for people from guess culture rather than ask culture. See https://ask.metafilter.com/55153/Whats-the-middle-ground-between-FU-and-Welcome#830421 I.e., someone from ask culture might need to be warned not to bother people so much. Someone from guess culture might need to be told that it is ok to reach out to people once in a while.

Thanks! Yep, I'm definitely an outlier in EA regarding how much I don't care about authority.

I added section 7 a couple of hours after publication to account for feedback on the LessWrong side of this post. I've now also added a disclaimer at the start:

"Note: The intended message of this post is not "Don't reach out to busy people!", but "Do reach out, and have these things in mind to make it more likely to get a response/if you don't get one." "

Since writing this, I've done a bunch more debating and thinking about how to handle romantic attraction in communities I'm actively involved in responsibly. So, here's the rule I want to commit to from now on:

In any community I'm involved in, I won't be the one driving romantic escalation (or hinting at it) with anyone lower in the institutional hierarchy than me. This applies for 1 month after low-intensity interactions like a 90-minute workshop, and for 3 months after high-intensity interactions like a retreat where I was in a lead facilitator role.

Some speci... (read more)

I agree with that statement, and I didn't intend to make either of those claims.

I think a more steelmanned version of my initial claim would be that there's a particular type of struggling that corresponds to low-integrity behavior, and that some aspects of current EA culture make it more likely for people to struggle in that particular way. Even (and maybe especially) if they are generally caring and well-meaning and honestly dedicated to the cause.

I think "scarcity mindset" is an okay handle.

A postrationalist friend also pointed out that what I'm talking about corresponds to Buddhism's realm of hungry ghosts. In modern psychological... (read more)

Yup, I definitely overgeneralized here and may be completely off. I think there's something I'm pointing at, and this helps me clarify my thinking. So thanks.

Generally: I by no means want to demonize anyone for struggling. To a significant extent, I buy into a social model of mental health, and mostly see one person's struggling as a symptom of their whole surroundings (social and otherwise) being diseased.

My intention behind this post was to point out some ways in which I think EA is suboptimally organized. The rough claim I was aiming for is this: "It's easier to be a saint in paradise, so let's make EA a bit more paradisiacal by fixing some of our norms."

1
Matt Goodman
1y
I agree with that rough claim. And I liked the rest of the blog. I guess I do see people who are struggling behaving badly sometimes. I just don't think it's any more frequent than in the general population. Or I sometimes see them using the fact that they're struggling to justify their bad behaviour, and I don't buy that.

Yep, I agree with that point - being untrustworthy and underresourced are definitely not the same thing.

I partially agree.

I love that definition of elites, and can definitely see how it corresponds to how money, power, and intellectual leadership in EA revolve around the ancient core orgs like CEA, OpenPhil, and 80k.

However, the sections of Doing EA Better that called for more accountability structures in EA left me a bit frightened. The current ways don't seem ideal, but I think there are innumerable ways in which formalization of power can make institutions more rather than less molochian, and only a few that actually significantly improve the way things ar... (read more)

"How does your last point fit in there though?"

On second thought, I covered everything that's immediately relevant to this topic in section 2.2, which I quickly expanded from the Facebook post this is based on. So yea, 3. should probably be a different EA Forum post entirely. Sorry for my messy reasoning here.

I'll add more object-level discussion of 3. under Kaj Sotala's comment.

Thanks for writing this up. I agree with most of these points. However, not with the last one:

I think we should see “EA community building” as less valuable than before, if only because one of the biggest seeming success stories now seems to be a harm story. I think this concern applies to community building for specific issues as well.

If anything, I think the dangers and pitfalls of optimization you mention warrant different community building, not less. Specifically, I see two potential dangers to pulling resources out of community building:

  1. Funded commun
... (read more)
7
Ulrik Horn
1y
I have not thought much about this and do not know how far this applies to others (might be worth running a survey), but I very much appreciate the EA community. This is because I am somewhat cause agnostic but have a skillset that might be applied to different causes. Hence, it is very valuable for me to have a community that ties together all these different causes, as it makes it easier for me to find work that I might have a good fit for helping out with. In a scenario where EA did not exist and there were only separate causes (although I think Holden Karnofsky only meant to invest less in EA, not to abandon the project altogether), I would need to keep updated on perhaps 10 or more separate communities in order to come by relevant opportunities to help.

A hedging I'd add: "...unless these people know each other from outside the boardgame club".

"We established a policy that established members, especially members of the executive, were to refrain from hitting on or sleeping with people in their first year at the society."

This sounds super reasonable for EA, too. How would you enforce/communicate this?

In my club this was done informally, by just telling people the rule, and telling people to knock it off if we saw them violate it, which was sufficient for us. 

EA is larger, so you'd have to think harder about enforcement/communication, and the various edge cases. It would certainly depend on the different contexts of different places. The goal of such a policy would be to:

  1. Reduce the number of extreme power imbalance relationships.
  2. Avoid turning off new members. 
  3. Reduce the number of people treating EA primarily as a dating service. 

You have ... (read more)

Full disclosure, because without it, this post would be a bit phony: I haven't always followed this policy within EA or outside, and took just one or two weeks from first thinking it might be good to implement it in EA to writing this post.

In general, if I write about community dynamics, assume that I think about them this thoroughly not because I'm extraordinarily virtuous and clear-sighted with regard to people stuff, but because I'm sometimes socially a bit clumsy and all these models and methods help me function at a level that just comes naturally to oth... (read more)

Yep, I'm with Xavier here. The rule gives community builders a bit of an incentive not to make EA their only social bubble (which I think is inherently good). And it is not without workarounds, all of which cushion the addressed problem.

For example, it encourages local community builders to hand over event facilitation to others more often. And if the rule is publicly known, participants can get around it by taking a break from events led by a particular organizer. If participants don't know the rule, they'd learn about it when they hit on an organizer. In either case, the consequence of even intentionally working around the rule would be taking it slow.

Yup, "don't hit on people who don't hit on me first." is a weaker rule I already decided to adhere to in EA before I started thinking about the one outlined in this post. Independent of power, it just seems utterly necessary to manage the gender imbalance.

Yep, the problem this particular rule tries to fix is that of perceived power imbalance and all the troubles that come with it.

It is an imperfect proxy for sure, but non-proxy rules like "No dating if there is a perceived power imbalance" are very, very prone to tempt people into motivated reasoning. It can get very hard for humans to evaluate their power imbalance with Alice when oh damn are these freckles cute. False beliefs, from the inside, feel not like beliefs, but like the truth. Because of that, I wouldn't trust anyone with power who would trust t... (read more)

I know that the rule is non-negotiable for people who facilitate retreats under the AuthRev brand.

AuthRev is rather influential in the (especially North American) AR scene, so I wouldn't be surprised if the rule seeped out further from there. I'm not well-networked enough there to know the details. And even if I did, I don't think I'd want to share the saucy stories that led to people adjusting the timelines upward and downward until they found their current form.

Thanks a lot! Yep, a question I always ask myself in EA's diversity discussions is "Which kind of diversity are we talking about?"

A LessWrong post on the topic you might like if you didn't read it yet is Kaj Sotala's "You can never be universally inclusive".

Don't ask what EA can do for you, ask what you can do for EA.

An obvious-in-hindsight statement I recently heard from a friend:

"If I'd believe that me being around was net negative for EA, I'd leave the community."

While this makes complete sense in theory, it is emotionally difficult to commit to it if most of your friends are in EA. This makes it hard for us to evaluate our impact on the community properly. Motivated reasoning is a thing.

So, it may be worthwhile for us to occasionally reflect on the following questions:

  • If I were to look back in ten years an
... (read more)
3
Nathan Young
1y
Yeah, I've definitely stopped doing things that I think will harm the community (I've reduced flirting a lot). But that said, I think the kinds of people likely to reduce behaviours are (unlike me) the people who least need to. I think most people need not worry. And for those that do, there are ways they can avoid harmful patterns - avoid events where those patterns occur, go on courses, talk to friends and develop strategies to avoid them. I don't think we need to be martyrs here, and for 99.9% of people there is a way for their social needs to be met in the community. But like 1% of people will have to change a bit.

Whoopsie, I'm insufficiently aware of the English language conventions there. Thanks, changed.

A fancy version might be some form of integration between the EA Forum and e.g. Kialo, where forum accounts can be used to partake in the discussion trees, and forum posts can be used as discussion contributions.

This shifted my opinion towards being agnostic/mildly positive about this public statement.

I'm still concerned that some potential versions of EA getting more explicitly political might be detrimental to our discourse norms, for the reasons Duncan, Chris, Liv, and I outlined in our comments. But yea, this amount of public support may well nudge grantmakers/donors to invest more into community health. If so, I'm definitely in favor of that.

Ok these strong down- and disagreement-votes are genuinely mysterious to me now.

The only interpretation that comes to mind is that somebody expects that something bad could come from this offer. I can't imagine anything bad coming from this offer, so I'd appreciate feedback. Either here, where I can react, or in my admonymous is fine.

Thanks, that's encouraging feedback!

Anyplace else you think I should advertise this? I already got the first booking. But given the mixed voting score, I don't expect this post to still be read by anyone 2-3 days from now.

6
Holly Morgan
1y
EA Peer Support group on Facebook? (I don't understand the downvotes by the way 🤷 Maybe there's some broader context I'm missing.)

Thanks, I removed my downvote after reading this comment.

Edit: I no longer agree with the content of this comment. Jason convinced me that this pledge is worth more than just applause lights. In addition, I no longer think that this is a very appropriate place for a slippery-slope argument.

_____________
I'd like to explain why I won't sign this document, because a voice like mine seems to still be missing from the debate: Someone who is worried about this pledge while at the same time having been thoroughly involved in leftist discourse for several years pre-EA.

So here you go for my TED talk.

I'm not a Sam in ... (read more)

I've attended an online LessWrong Community Weekend co-organized by Linda and can vouch for her capability to organize unconferences at a level way beyond what I thought possible.

0
Linda Linsefors
1y
Thanks :)
2
Chris Leong
1y
What did you think worked so well about these unconferences?

Agreed, "knowledge capital" fits well.

And though I sometimes sound a whole lot Slytherin, I absolutely don't want to normalize using the Dark Arts in EA community building. I'll change the term in the initial post and link to this comment thread.

Do you have a link to a smooth definition of "ideational capital"? I googled your citation and found a book, but apparently my skill in deciphering political science essays has massively declined since university.

A meta-level remark: I notice I'm a bit emotionally attached to "memetic capital", because I've thought about these things under the term "memetics" a bunch during the last year. In addition, a person whose understanding of cultural evolution I admire tends to speak about it in terms of memetics, so there are some matters of tribal belonging at play ... (read more)

2
MaxRa
1y
Thanks for the elaboration, no, that all felt very comprehensible to me! Hopefully not gibbering too much as well:

Knowledge capital?

I agree that "idea" is more strongly connected with propositional knowledge, which is suboptimal, and that "knowledge" seems preferable as it covers the other types you bring up. I just googled "knowledge capital" and that might fit nicely; it seems like it's actually an existing term from economics that overlaps a lot with what you have in mind (though of course missing the side-benefit of alluding to potentially neglected forms of analysis that you mention). From Investopedia:

Memetics' lack of emphasis on argument still worries me

One potential point of disagreement that is also related to my discomfort with memetics is this sentence:

I generally feel icky about strategies that try to affect the cultural development of a community when this is not done transparently and deliberately. And memes kind of feel unilateral and uncooperative this way. For example, say you'd like the EA community to engage more with authentic relating training to prevent unproductive and unnecessary conflicts. One way to do this would be to make a case for it on the EA Forum, giving arguments and letting people see and get excited about your vision. Another approach would be to think about ways to put authentic relating into easily digestible chunks that include positive vibes about authentic relating and an association of authentic relating with effectiveness or whatever. :D

You bring up that memetics helps with understanding failure modes in cultural evolution, and that seems benign and useful to me. But I'd wish that the interventions based on that understanding are also done transparently and cooperatively, and while you probably agree with that, I still worry that a memetic approach tends to emphasize the cooperative improvement and truth-seeking less than e.g. "cultivating EA's shared knowledge (capital)".