hbesceli
Some EA psychological phenomena

Some things that people report in EA:

  • Impostor syndrome
  • Burnout
  • Impact obsession
  • EA disillusionment

Are these EA phenomena? Also, are they psychological phenomena? 

These things (I guess excluding EA disillusionment) don’t just exist within EA; they exist within society in general, so it’s plausibly unfair to call them EA phenomena. Though it also seems to me that for each of these things, there’s a somewhat strong fit with EA and EA culture.

Taking impostor syndrome as an example: EA often particularly values ambitious and talented people. Also, it seems to me there’s something of a culture of assessing and prioritising people on this basis. Insofar as success within EA depends on being seen in a certain way by others (talented, ambitious etc.), there’s pressure to be perceived that way. In general, the stronger the pressure for people to be perceived in a certain way, the more prominent I expect impostor syndrome to be.

(I’m a bit wary of ‘just so’ stories here, but my best guess is that this is in fact explanatory.)

I think impostor syndrome and other things in this ballpark are often discussed as individual/psychological phenomena. I think such framings are pretty useful. And there’s another framing which sees them instead as ~sociological phenomena - things which happen in a social context, as a result of different social pressures and incentives within the environment.

I don’t know quite what to conclude here, in large part because I don’t know how common these things are within EA, and how this compares to other places (or even what the relevant comparison class is). Though tentatively, if I’m asking ‘What does it look like for EA to thrive?’, then part of my answer is ‘being an environment where impostor syndrome, burnout, impact obsession and EA disillusionment are less common’.

What’s going on with ‘EA Adjacents’? 

There’s a thing where lots of people will say that they are EA Adjacent rather than EA (funny post related to this). In particular, it seems to me that the closer to the core people are, the less inclined they are to identify themselves with EA. What’s going on here? I don’t know, but it’s an interesting trailhead to me. 

Plausibly there are some aspects of EA - the culture, norms, worldview, individuals, organisations etc. - that people disagree with or don’t endorse, and so they prefer not to identify as EAs.

I’m unsure how much to treat this as reflective of a substantive issue vs. a quirk, or reflective of things being actually fine. At least in terms of EA being a ‘beacon for thoughtful, sincere, and selfless’, it seems a little bit worrying to me that some of the core members of the community aren’t willing to describe themselves as EA. 

Perhaps a way of getting to the heart of this is asking people something like: Imagine you’re talking to someone who is thoughtful, sincere and selfless. Would you recommend EA to them? Which parts? How strongly? Would you express any reservations? 

Looping back to the question of ‘What is it for EA to thrive?’, one answer is: It’s the kind of community that EAs would strongly recommend to a thoughtful, sincere and selfless friend.

(Maybe this is too strong - people will probably reasonably disagree about which aspects of EA are good and which aren’t, and if everyone is very positive on EA in this way, that plausibly means there’s not enough disagreement in the community.)

Incentives within EA

Here’s a story you could tell about academia. Academia is, in some sense, supposed to be about generating knowledge. But it ends up being ineffective at doing this because of something something incentives. Eg.

  • Academic jobs are highly competitive
  • In order to get an academic job, it’s more important to have done things like original research than things like replications. 
  • Things like replications are undersupplied, and the replication crisis happens. 

What are the incentives within EA? How does this affect how well EA ends up ‘doing the most good’? I don’t have a full theory here, though I also suspect that there are ways in which incentives in EA can push against doing the most good. Professional EA group funding is one example:

  • Professional EA group organisers are often in a bit of a precarious position. Their job depends on their ability to get funding from organisations like CEA or EAIF. 
  • One of the main ways that EA group organisers are assessed is on the basis of things like how well they produce highly engaged EAs, or career plan changes or other such things (I think this is broadly true, though I don’t have a great insight into how CEA assesses groups).
  • Professional EA group organisers are incentivised to produce these kinds of things. Some potential problems here: it’s hard to assess what counts as a good career (for example), which pushes in the direction of non-standard career options being discounted; often it may make sense for someone to focus on building career capital rather than working at an EA organisation, but these kinds of things are less obviously/legibly impactful…

What is the EA project? Also, who is it for? 

There’s a comment by Saulius on an old EA Forum post: ‘[...] I see EA as something that is mostly useful when you are deciding how you want to do good. After you figured it out, there is little reason to continue engaging with it [...]’.

I found this pretty interesting, and it inspired a bunch of thoughts. Here are a couple of oversimplified models of what it is to pursue the EA project:

Model 1:

  • EA involves figuring out how to do the most good and doing it. 
  • Figuring out how to do the most good involves working out what cause area is most important and what career path to pursue. 
  • Doing it involves getting such a job and executing well on it. 

Model 2: 

  • EA involves figuring out how to do the most good and doing it. 
  • This is less of a two step process, and more of an ongoing cultivation of virtues and attitudes like scout mindset, effectiveness and so on. It involves constant vigilance, or constant attunement. It’s an ongoing process of development, personal, epistemic, moral etc. 

I see most EA community building efforts as mostly framed by model 1 - for example, EA groups and the EA Forum (perhaps EAG/EAGx as well, though this is less clear). It seems to me a common pattern for people to engage heavily with these things when getting involved in EA, and then, once they’re ‘in’, to stop engaging with them and focus on executing at their job.

Insofar as engaging with these things (EA groups, EA Forum etc.) is a key component of what it is to engage with EA, I’m inclined to agree with the above comment - once you’ve figured out what to do there’s little reason to continue engaging with EA. 

I’d like to see more community building efforts, or EA infrastructure, that’s framed around model 2 - things that give EAs who are already ‘in’ a reason to continue engaging with EA, things that provide them with value in pursuing this project.

I don’t think models 1 and 2 necessarily have to come into conflict. Or at least, I think it’s fine and good for there to be people who see EA as mostly being relevant to a career decision process. And for people who want to treat the EA project as more like model 2 (an ongoing process of cultivating virtues like scout mindset, effectiveness and so on), I’d be excited to see more community building, or infrastructure, designed to support them in these aims.

What is it for EA to thrive? 

EA Infrastructure Fund's Plan to Focus on Principles-First EA includes a proposal:

The EA Infrastructure Fund will fund and support projects that build and empower the community of people trying to identify actions that do the greatest good from a scope-sensitive and impartial welfarist view.

And a rationale (there's more detail in the post):


  • [...] EA is doing something special. 
  • [...]  fighting for EA right now could make it meaningfully more likely to thrive long term.
  • [...]  we could make EA much better than it currently is - particularly on the “beacon for thoughtful, sincere, and selfless” front. [...]

Here I’m spending some time thinking about this, in particular:

  • What does it mean for EA to thrive? 
  • What projects could push EA in the direction of thriving? 


(I work at EAIF. These are my personal views/thoughts. I’m not speaking on behalf of EAIF here.)

Thanks for writing this up - lots of interesting ideas for retreat activities which I hadn't previously seen/thought of!

Yes, we're open to accepting grants covering projects in which people intend to work either full-time or part-time, and both joint and individual applications. We don't have a strong preference for receiving any particular type of application within this.

What more can a paid organizer do?

It may be that paid organisers simply increase the scale of the things they do already - eg. putting on more discussion groups, talks, workshops etc. Though it could also be that having increased capacity enables groups to test promising strategies that they wouldn't previously have been able to.

One reason for thinking that it should be possible for organisers to increase the scale of their activities (and for this to result in an increase in the value that the group produces) is that even the largest groups seem to reach only a fraction of their target audience. If groups aren't limited by the available target audience, and the grants process means that groups aren't limited by organiser time or funding, it seems likely that groups will be able to increase the value they produce.

I'd be curious to know more about how people to message were selected

There weren't any strong guidelines for selecting people, just encouragement to talk to their friends. I chose people to message based on a combination of 1) how interested I thought they'd be (either based on previous conversations about EA or my knowledge of their interests) and 2) how close we are, and I'd imagine others used similar heuristics.

and how the messages were crafted.

Here's a message I used that I also put up as an example for others, but there was an emphasis on making the messages personal rather than using a stock message, and so I expect that the messages people sent varied quite a bit.

'Hey, last year I took the 'GWWC pledge' - a commitment to donate 10% of my income to the charities I believe are most effective at improving the world. I'd be really interested in hearing what you think about the idea and whether it's something you'd consider - what do you think? And do you fancy hearing a quick spiel about it? Anyway, what are you up to over New Year’s, when am I going to see you next?'

10% of messages converting to pledges is incredible, but potentially so incredible as to be suspicious.

The success that people have with this probably varies a lot. In particular, having spoken to the person about effective altruism before made success a lot more likely. I think there was probably a fairly strong self-selection effect, with those who have a lot of potentially interested friends being the people who decided to do the messaging and report their successes, and so I don't think the average GWWC member would be as successful (but probably still successful enough to make it worth doing).

Also, the data from messaging friends seems consistent with the 1/25 message-to-pledge ratio from GWWC's previous attempts at messaging people - I'd expect messaging friends to convert at a higher rate than this, as the personal connection with the person you're talking to about the pledge seems to be quite an important factor.