Thomas Kwa

Researcher @ MIRI
Berkeley, CA, USA · Joined Feb 2020



Doing alignment research with Vivek Hebbar's team at MIRI.


Edit: I forgot to add, OP could have phrased this differently, saying that people with productive things to say (which I assume is what they may have meant by "better takes") would be busier doing productive work and have less time to post here. Which I don't necessarily buy, but let's roll with it. Instead, they chose to focus on EA orgs in particular.

The causal reason I worded it that way is that I wrote down this list very quickly, and I'm in an office with people who work at EA orgs and would write higher quality posts than average, so it was salient, even if it's not the only mechanism for having better things to do.

I also want to point out that "people who work at EA orgs" doesn't imply infinite conformity. It just means they fit in at some role at some organization that is trying to maximize good and/or is funded by OpenPhil/FTX (who fund lots of things, including lots of criticism). I frequently hear minority opinions like these:

  • Biosecurity is more pressing than alignment due to tractability
  • Chickens are not conscious and can't suffer
  • The best way to do alignment research is to develop a videogame as a testbed for multi-agent coordination problems
  • Alignment research is not as good as people think due to s-risk from near misses
  • Instead of trying to find AI safety talent at elite universities, we should go to remote villages in India

"Quality of person" sounds bad to me too. I also find it weird that someone already gave the same feedback on the shortform and the OP didn't change it.

Thanks for pointing this out. I just edited the wording.

I agree that this list is "lazy", and I'd be excited about someone doing a better analysis.

Of the 15 people other than me who commented on the shortform, I only remember ever meeting 4. I would guess that for shortforms in general most of the attention comes from the feed.

Pablo made a survey for the first 8 points, and people seem to agree most with 1 (newer EAs have worse takes on average) and 5 (meta/community stuff gets more attention), with mixed opinions about the rest.

I wouldn't be quick to dismiss (3-5) and (7) as factors we should pay attention to. These sorts of memetic pressures are present in many communities, and yet communities vary dramatically in quality. This is because things like (3-5) and (7) can be modulated by other facts about the community:

  • How intrinsically susceptible are people to clickbait?
  • Have they been taught ideas like "politics is the mind-killer" and the dangers of platforms where controversial ideas outcompete broadly good ones?
  • What is the variance in how busy people are?
  • To what degree do people feel like they can weigh in on meta? To what degree can they weigh in on cause areas that are not their own?
  • Are the people on EA Forum mostly trying for impact, or to feel like they're part of a community (including instrumentally towards impact)?

So even if they cannot be solely responsible for the changes, they could have been necessary to produce any declines in quality we've observed, and they could be important for the future.

Promoting shortforms to top-level posts while preserving replies. I wanted to do that with this one, because reposting it as a top-level post wouldn't preserve the existing discussion.

Having to wear masks would reduce the value of EAG by >20% for me, mostly due to making 1-1s worse.

EA forum content might be declining in quality. Here are some possible mechanisms:

  1. Newer EAs have worse takes on average, because the current processes of recruitment and outreach produce a worse distribution than the old ones.
  2. Newer EAs are too junior to have good takes yet. It's just that the growth rate has increased so there's a higher proportion of them.
  3. People who have better thoughts get hired at EA orgs and are too busy to post. There is anticorrelation between the amount of time people have to post on EA Forum and the quality of person.
  4. Controversial content, rather than good content, gets the most engagement.
  5. Although we want more object-level discussion, everyone can weigh in on meta/community stuff, whereas they only know about their own cause areas. Therefore community content, especially shallow criticism, gets upvoted more. There could be a similar effect for posts by well-known EA figures.
  6. Contests like the criticism contest decrease average quality, because the type of person who would enter a contest to win money on average has worse takes than the type of person who has genuine deep criticism. There were 232 posts for the criticism contest, and 158 for the Cause Exploration Prizes, which combined is more top-level posts than the entire forum in any month except August 2022.
  7. EA Forum is turning into a place primarily optimized for people to feel welcome and talk about EA, rather than impact.
  8. All of this is exacerbated as the most careful and rational thinkers flee somewhere else, expecting that they won't get good quality engagement on EA Forum.