
Epistemic status: Highly speculative quick Facebook post. Thanks to Anna Riedl for nudging me to share it here anyway.

Something I've noticed recently is that some people who are in a bad place in their lives tend to have a certain sticky, sleazy, black-holey feel to them. Something around untrustworthiness, low integrity, optimizing for themselves regardless of the cost to the people around them. I've met people like that, and I think that when others felt my energy was subtly and indescribably off, it was because I was being sticky in that way, too.

Game-theoretically, it makes total sense for people to be a bit untrustworthy while they are in a bad place in their life. If you're in a place of scarcity, it is entirely reasonable to be strategic about where you put your limited resources. Then, it's just reasonable to only be loyal to others as long as you can get something out of it yourself and to defect as soon as they don't offer obvious short-term gains. And similarly, it makes sense for the people around you to be a bit wary of you when you are in that place.
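
To make that game-theoretic intuition concrete, here's a toy sketch (my own illustration with standard textbook payoff numbers, not anything from the post): in a repeated prisoner's dilemma against a reciprocating partner, whether cooperating or defecting has the higher expected value depends on how likely you think the relationship is to continue. "Scarcity" shows up as a short expected horizon.

```python
# Toy model: repeated prisoner's dilemma against a reciprocator
# (grim trigger: they cooperate until you defect, then defect forever).
# Payoff numbers are standard textbook values, chosen for illustration only.

R, T, P = 3, 5, 1  # mutual cooperation, temptation to defect, mutual defection

def value_of_cooperating(p_continue: float) -> float:
    """Cooperate every round: payoff R per round, weighted by the
    probability the relationship survives to each round (geometric series)."""
    return R / (1 - p_continue)

def value_of_defecting(p_continue: float) -> float:
    """Defect now: grab T once, then get mutual defection (P) forever after."""
    return T + p_continue * P / (1 - p_continue)

for p in (0.95, 0.6, 0.2):  # long, medium, and short expected horizons
    coop, defect = value_of_cooperating(p), value_of_defecting(p)
    print(f"p_continue={p}: cooperate={coop:.2f}, defect={defect:.2f} "
          f"-> {'cooperate' if coop > defect else 'defect'} wins")
```

With these payoffs, defection simply has the higher expected value once the continuation probability drops below 0.5: no malice required, just a short horizon.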

And now, a bit of a hot take: I think most if not all of Effective Altruism's recent scandals have been due to low-integrity sticky behavior. And, I think some properties of EA systematically make people sticky.

We might want to invest some thought and effort into fixing them. So, here are some of EA's sticky-people-producing properties I can spontaneously think of, plus first thoughts on how to fix them that aren't meant as final solutions:

1. Utilitarianism

Yudkowsky wrote a thing that I think is true: 

"Go three-quarters of the way from deontology to utilitarianism and then stop. You are now in the right place. Stay there at least until you have become a god." 

Meanwhile, SBF and probably a bunch of other people in EA (including me at times) have gone all four quarters of the way. If there is no point at which the numbers have gone up enough, you'll be in a place of scarcity no matter what, and will be incentivized to defect indefinitely.

I think an explicit belief that "defecting is not a good utilitarian strategy" doesn't help here: becoming sticky is not a decision, but a subtle shift in your cognition that happens when your animal instincts pick up that your prefrontal cortex thinks you are in a place of scarcity.

Basically, I think Buddhism is what utilitarianism would be if it made sense and were human-brain-shaped: optimizing for global optima, but from a place of compassion and felt oneness with all sentient beings, not from the standpoint of a technocratic puppet master.

2. Ever-precarious salaries

EA funders like to base their allocation of funds on evidence, and they like to be able to adjust course quickly as soon as there are higher expected-value opportunities. From the perspective of naive utilitarianism, this makes complete sense.

From the perspective of grantees, however, it feels like permanently having to justify your existence. And that is a situation that makes you go funny in the head in a way that is not conducive to getting just about any job done, unless it's a job like fraud that inherently involves short-term thinking and defecting on society. Whether you treat people as trustworthy and competent or not, you'll tend to find that you were right.

I don't know how to fix this, especially now that both the FTX collapse and funders' increased cautiousness have made the precarity of EA funding even worse. Currently, I'm seeing two dimensions to at least partially solving this issue:

  1. Building healthier, more sustainable relationships between community members. That's why I'm building Authentic Relating Berlin in parallel to EA Berlin, and thinking about ways to safely(!) encourage memetic exchange between these communities. This doesn't help with the precarious funding itself, but with the "I feel like I have to justify my existence!" aspect of writing a grant application.
  2. We might want to fundamentally redesign our institutions so that people feel trusted and we elicit trustworthy behavior in them.[1] For example, we might want to offer community members longer-term financial security that doesn't just cut off when they want to switch projects within the EA ecosystem. The idea is to give people more leeway, and to trust them more to do the best they can with the money they receive. I've found some organizations that had awesome success with similar practices in Frederic Laloux's "Reinventing Organizations", including a French manufacturing company named FAVI and the Dutch healthcare organization Buurtzorg. Some examples of EA meta work that I think are good progress towards finding forms of organizing that produce trustworthy people are Charity Entrepreneurship, the things Nonlinear builds (e.g. the Nonlinear Network), AI Safety Support, alignment.wiki, the various unconferences I've seen happening over the last years, as well as the Future Matters Project, a Berlin-based, EA-adjacent climate movement building org.

3. A not-quite-well-managed personal/professional overlap

EA sort of wants to be a professional network. At the same time, the kinds of people who tend to grow interested in EA have a lot of things in common they find few allies for in the rest of the world. So, it's just obvious that they also want to be friends with each other. Thus grow informal friend circles with opaque entry barriers everywhere around the official professional infrastructure. Thus grow house parties you'll want to get invited to so you can actually feel part of the tribe, and so you can tap into the informal high-trust networks which actually carry the weight of the professional infrastructure.

Some of the attempts within EA to solve this seem to be to push even more towards just being a professional network. I think that's dangerously wrong, because it doesn't remove the informal networks and their power. It just makes access to them harder, and people more desperate to get in.

Plus, humans are social animals, and if you stop them from socializing, they'll stop showing up.

I think the solution lies in exactly the opposite direction: creating informal networks with low entry barriers and obvious ways in, so that feeling like you belong to the tribe is not something you have to earn, but something you get for free right at the start of your EA journey. That's what I've been working on with EA Berlin's communication infrastructure over the last few months. Now, I'm trying to figure out how to interface it more gracefully with impact-focused outreach and meetups.

[1] This is the aspect of this post I'm most unsure about.

Comments

(Upvoted.)

Some of the attempts within EA to solve this seem to be to push even more towards just being a professional network. I think that's dangerously wrong, because it doesn't remove the informal networks and their power. It just makes access to them harder, and people more desperate to get in.

Somewhat relevant counterpoint:

For everyone to have the opportunity to be involved in a given group and to participate in its activities the structure must be explicit, not implicit. The rules of decision-making must be open and available to everyone, and this can happen only if they are formalized. This is not to say that formalization of a structure of a group will destroy the informal structure. It usually doesn't. But it does hinder the informal structure from having predominant control and make available some means of attacking it if the people involved are not at least responsible to the needs of the group at large. [...]

... an elite refers to a small group of people who have power over a larger group of which they are part, usually without direct responsibility to that larger group, and often without their knowledge or consent. [...] Elites are nothing more, and nothing less, than groups of friends who also happen to participate in the same political activities. They would probably maintain their friendship whether or not they were involved in political activities; they would probably be involved in political activities whether or not they maintained their friendships. It is the coincidence of these two phenomena which creates elites in any group and makes them so difficult to break.

These friendship groups function as networks of communication outside any regular channels for such communication that may have been set up by a group. If no channels are set up, they function as the only networks of communication. [...] 

Some groups, depending on their size, may have more than one such informal communications network. [...] In a Structured group, two or more such friendship networks usually compete with each other for formal power. This is often the healthiest situation, as the other members are in a position to arbitrate between the two competitors for power and thus to make demands on those to whom they give their temporary allegiance.

I partially agree.

I love that definition of elites, and can definitely see how it corresponds to how money, power, and intellectual leadership in EA revolves around the ancient core orgs like CEA, OpenPhil, and 80k.

However, the sections of Doing EA Better that called for more accountability structures in EA left me a bit frightened. The current ways don't seem ideal, but I think there are innumerable ways in which formalization of power can make institutions more rather than less Molochian, and only a few that actually significantly improve the way things are done. Specifically, I see two types of avenues for formalizing power in EA that would essentially make things worse:

  1. Professional(TM) EA might turn into an outer facade, while things are actually still run by a traditional elite that is now harder to reach and harder to get into. That's the concern I already pointed towards in the post above.
  2. The other way things could go wrong would be if we built something akin to modern-day democratic nation states: giant sluggish egregores of paperwork that reliably produce bad compromises nobody would ever have agreed to from first principles, via a process that is so time-consuming and ensnaring to our tribal instincts that nobody has energy left to have the important truth-seeking debates that could actually solve the problems at hand.

Personally, the types of solutions I'm most excited about are ones that enable thousands of people to coordinate in a decentralized way around the same shared goal without having to vote or debate everything out. I think there are organizations out there that have solved information flow and resource allocation more efficiently not only than hierarchical technocratic organizations like traditional corporations, socialist economies, or the central parts of present-day EA, but also than modern democracies.

For example, with regard to collective decision-making, I'm pretty excited about some things that happen in new social movements, the organizations that Frederic Laloux described (see above, or directly at https://reinventingorganizationswiki.com/en/cases/), or the Burning Man community.

A decision-making process that seems to work in these types of decentralized organizations is the Advice Process. It is akin to how many things are already done in EA, and might deserve to be the explicit ideal we aspire to.

Here's a short description written by Burning Nest, a UK-based Burning Man-style event:

"The general principle is that anyone should be able to make any decision regarding Burning Nest.

Before a decision is made, you must ask advice from those who will be impacted by that decision, and those who are experts on that subject.

Assuming that you follow this process, and honestly try to listen to the advice of others, that advice is yours to evaluate and the decision yours to make."
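
To make the shape of that protocol explicit, here's a deliberately minimal sketch (my own toy illustration; all names are hypothetical, and nothing here is Burning Nest's or anyone else's actual implementation):

```python
from dataclasses import dataclass, field

@dataclass
class AdviceProcessDecision:
    """Toy model of the Advice Process: anyone may own a decision,
    but they must gather advice from those impacted and those with
    relevant expertise before deciding. Advice is input, not a vote."""
    description: str
    owner: str  # anyone may own a decision
    advice: dict[str, str] = field(default_factory=dict)

    def seek_advice(self, person: str, what_they_said: str) -> None:
        # Record advice from impacted people and subject-matter experts.
        self.advice[person] = what_they_said

    def decide(self, choice: str) -> str:
        # The owner must have honestly consulted others, but the final
        # call is theirs alone: no approval step, no majority vote.
        if not self.advice:
            raise RuntimeError("Seek advice before deciding.")
        return f"{self.owner} decides: {choice}"
```

The key design choice is what's absent: there is no approval chain and no ballot, which is what lets many people act in parallel without bottlenecking on consensus.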

Of course, this ideal gets a bit complicated if astronomical stakes, infohazards, the unilateralist's curse, and the fact that EA is spread out over a variety of continents and legal entities enter the gameboard.

I don't have a clear answer yet for how to make EA at large more Advice Process-ey, and maybe what we currently have actually is the best we can get. But I'm currently bringing the way EA Berlin works closer and closer to this. And as I've already learned, this works way better when people trust each other, and when they trust me to trust them. The Advice Process is basically built on top of the types of high-trust networks that can only emerge if people with similar values are also allowed to interact in non-professional ways.

Therefore, if we optimize away from making the personal/professional overlap work, we might rob ourselves of the possibility of implementing mechanisms like the Advice Process, which might help us solve a bunch of our coordination problems but require large high-trust networks to work effectively. Other social movements have innovated on decision-making processes before EA. It would just be too sad if we didn't hold ourselves to a higher standard here than copying the established and outdated management practices of pre-startup-era 20th-century corporations.

Thanks for sharing your thoughts; I particularly appreciated you pointing out the plausible connection between experiencing scarcity and acting less prosocially / with less integrity. And I agree that experiencing scarcity in terms of social connections and money is unfortunately still sufficiently common in EA that I'm also pretty worried when people e.g. want to systematically tone down the aspects that make EA a community.

Game-theoretically, it makes total sense for people to be a bit untrustworthy while they are in a bad place in their life. If you're in a place of scarcity, it is entirely reasonable to be strategic about where you put your limited resources. Then, it's just reasonable to only be loyal to others as long as you can get something out of it yourself and to defect as soon as they don't offer obvious short-term gains.

One reservation I had: I think it'd be useful not to mix together trustworthiness with the ability to contribute to common resources and projects and to be there for friends. Trustworthiness to me captures things like being honest, only committing to things you expect to be able to do, and being cooperative as a default. Even if I experience a lot of scarcity, I aspire to stay just as trustworthy. And that e.g. includes warning others up front that I have very little extra energy and might have to stop contributing to a shared project at any time.

Yep, I agree with that point - being untrustworthy and underresourced are definitely not the same thing.

Good idea! Scarcity mindset is such an annoying tendency of human psychology. We should all become more like hippies.

How does your last point fit in there, though? Scarcity mindset because you so desperately want to become part of the friend group? Or friends as medicine for more trustworthiness, thus more stable networks, more trusting funders, and more stable salaries?

On a side note: within the community, not only the precarious salaries but also the high value placed on working at prestigious EA organisations (versus the lack of such opportunities) may contribute to the problem. People want to belong and to be respected. Let's not forget to actually encourage people to seek all sorts of opportunities - according to their theory of change - and to give a lot of respect and praise for trying and for failing.

Yep, all of you putting energy into changing the world for the better - you deserve all the recognition! Change is non-linear and opportunities very random, so putting yourself out there and taking the risk of having an impact is the way to go! <3 

"How does your last point fit in there though?"

On second thought, I covered everything that's immediately relevant to this topic in section 2.2, which I quickly expanded from the Facebook post this is based on. So yeah, section 3 should probably be a different EA Forum post entirely. Sorry for my messy reasoning here.

I'll add more object-level discussion of section 3 under Kaj Sotala's comment.

I feel like there's a large leap in the ToC between "throw more parties", "make funding less related to results", and producing more high-integrity actors.

I agree with that statement, and I didn't intend to make either of those claims.
