Doing alignment research with Vivek Hebbar's team at MIRI.
Should the EA Forum team stop optimizing for engagement?
I heard that the EA Forum team tries to optimize the forum for engagement (tests features to see whether they improve engagement). There are positives to this, but on net it worries me. Taken to the extreme, this is a destructive practice, as it would mean optimizing for whatever keeps users clicking rather than for quality of discussion.
I'm not confident that EA Forum is getting worse, or that tracking engagement is currently net negative, but we should at least avoid failing this exercise in Goodhart's Law.
I was thinking of reasons why I feel like I get less value from EA Forum. But these are not the same as reasons EAF might be declining in quality, so the original list would miss the (to me) more insidious mechanisms by which EAF could actually be getting worse. For example, I often read something like: "EA Forum keeps accumulating more culture/jargon; this is of questionable usefulness, but posts not written in the EA dialect are received increasingly poorly." There are probably more mechanisms I can't think of, and these are harder for me to judge...
The epistemic spot checker could also notice flaws in reasoning; I think Rohin Shah has done this well.
Note that people in the US/UK (and presumably other places) can buy drugs on the grey market (e.g. here) for less than standard prices. Although I wouldn't trust these sources 100%, they should be fairly safe because the drugs are certified in other countries like India; gwern wrote about this here for modafinil, and the basic analysis seems to hold for many antidepressants. The advertised shipping times are fairly long, but this is potentially still less hassle than waiting for a doctor's appointment for each one.
Thanks. It's reassuring that the correlations aren't as large as I thought. (How much of the variance is in the first principal component in log-odds space, though?) And yes, I now think the arguments I had weren't so much for the arithmetic mean as against total independence / the geometric mean, so I'll edit my comment to reflect that.
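To make the arithmetic-vs-geometric distinction concrete, here is a minimal sketch (not from the original exchange; the forecast matrix is made-up illustrative data) comparing the two pooling rules and computing how much variance the first principal component captures in log-odds space:

```python
# Sketch: two ways of pooling probability forecasts, plus the share of
# variance explained by the first principal component in log-odds space.
# The forecast matrix is hypothetical, purely for illustration.
import numpy as np

def logit(p):
    return np.log(p / (1 - p))

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Rows = forecasters, columns = questions; made-up probabilities.
probs = np.array([
    [0.70, 0.20, 0.90, 0.55],
    [0.60, 0.30, 0.80, 0.50],
    [0.80, 0.10, 0.95, 0.65],
])

# Arithmetic mean of probabilities.
arith = probs.mean(axis=0)

# Geometric mean of odds, i.e. averaging in log-odds space; this treats
# forecasters as more independent evidence, so the pooled probabilities
# are more extreme.
geom_odds = sigmoid(logit(probs).mean(axis=0))

print("arithmetic mean of probs:", np.round(arith, 3))
print("geometric mean of odds: ", np.round(geom_odds, 3))

# Variance explained by the first principal component in log-odds space:
# a large share suggests the forecasts mostly co-vary along one axis.
X = logit(probs)
X_centered = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(X_centered, full_matrices=False)
explained = s**2 / (s**2).sum()
print("variance explained by PC1:", round(explained[0], 3))
```

The geometric mean of odds coincides with averaging in log-odds space, which is why the case for it rests on how independent the forecasters are; the PC1 share gives one rough measure of how correlated the forecasts actually are.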
Manifold markets related to this: