Team Lead for LessWrong


I think I agree with your clarification and was in fact conflating the mere act of speaking with strong emotion with speaking in a way that felt more like a display. Yeah, I do think it's a departure from naive truth-seeking.

In practice, I think it is hard, though for the second-order reasons you give and others. Perhaps the ideal is that people share strong emotion when they feel it, but in some kind of format/container/manner that doesn't shut down discussion or get things heated. "NVC" style, perhaps, as you suggest.

Hey Shakeel,

Thank you for making the apology, you have my approval for that! I also like your apology on the other thread – your words give me hope that CEA is going in a good direction.

Some feedback/reaction from me that I hope is helpful. In describing your motivation for the FLI comment, you say that it was not to throw FLI under the bus, but because of your fear that some people would think EA is racist, and you wanted to correct that. To me, that is a political motivation, not much different from a PR motivation.

To gesture at the difference (in my ontology) between PR/political motivations and truth-seeking motivations:


  • you want people to believe a certain thing (even if it's something you yourself sincerely believe), in this case, that EA is not racist
  • it's about managing impressions and reputations (e.g. EA's reputation as not racist)

Your initial comment (and also the Bostrom email statement) both struck me as "performative" in how they demonstrated really harsh and absolute condemnation ("absolutely horrifying", "[no] place in this community", "recklessly flawed and reprehensible" – granted that you said "if true", but the tone and other comments seemed to suggest you did think it was true). That tone and manner of speaking as the first thing you say on a topic[1]  feels pretty out of place to me within EA, and certainly isn't what I want in the EA I would design.

Extreme condemnation pattern-matches to someone signaling that they too punish the taboo thing (to be clear, I agree that racism should not be tolerated at all), as seen across a lot of the Internet, and it feels pretty toxic. It feels like it's coming from a place of needing to demonstrate "I/we are not the bad thing".

So even if your motivation was "do your bit to make it clear that EA isn't racist", that does strike me as still political/PR (even if you sincerely believe it).

(And I don't mean to doubt your upsetness! It is very reasonable to be upset if you think something will cause harm to others, and harm to the cause you are dedicating yourself to, and harm to your own reputation through association. Upsetness is real and caring about reputation can come from a really good place.)

I could write more on my feelings about PR/political stuff, because my view is not that it's outright "bad/evil" or anything, more that caution is required. 

Truth-seeking / info-propagation
Such comments focus more on sharing the author's beliefs (not performing them)[2] and explaining how they were reached, e.g. "this is what I think happened, and this is why I think it", the inferences being made, and what makes sense. They acknowledge uncertainty and leave room for the chance they're mistaken.

To me, the ideal spirit is "let me add my cognition to the collective so we all arrive at true beliefs" rather than "let me tug the collective beliefs in the direction I believe is correct" or "I need to ensure people believe the correct thing" (and especially not "I need people to believe the correct thing about me").

My ideal CEA comms strategy would conceive of itself as having the goal of causing people to have accurate beliefs foremost, even when that makes EA look bad. That is the job – not to ensure EA looks good, but to ensure EA is perceived accurately, warts and all. 

(And I'm interested in attracting to EA people who can appreciate that large movements have warts, who can tolerate weirdness in beliefs, and who get that movement leaders make mistakes. I want the people who see past that to the ideas and principles that make sense, and to the many people (including you, I'd wager) who are working very hard to make the world better.)

I don't want to respond to a step in the right direction (a good apology) with something that feels negative, but it feels important to me that this distinction is deeply understood by CEA and EA in general, hence me writing it up for good measure. I hope this is helpful.

ETA: Happy to clarify more here or chat sometime.

  1. ^

    I think that after things have been clarified and the picture is looking pretty clear, then indeed, such condemnation might be appropriate.

  2. ^

    The LessWrong frontpage commenting guidelines are "aim to explain, not persuade".

I came to the comments here to also comment quickly on Kathy Forth's unfortunate death and her allegations. I knew her personally (she sublet in my apartment in Australia for 7 months in 2014, but more meaningfully in terms of knowing her, we also overlapped at Melbourne meetups many times and knew many mutual people). Like Scott, I believe she was not making true accusations (though I think she genuinely thought they were true).

I would have said more, but will follow Scott's lead in not sharing more details. Feel free to DM me.

Those accusations seem of a dramatically more minor and unrelated nature and don't update me much at all that allegations of mistreatment of employees are more likely.

The couple arguments against this do not likely hold up against the vast utility discrepancies from resource allocations...

This kind of utilitarian reasoning seems not too different from the kind that would get one to commit fraud to begin with. I don't think whether returning the money is legally required or not makes the difference – morality does not depend on laws. If someone else steals money from a bank and gives it to me, I won't feel good about using that money even if I don't have to give it back and would use it much better.

Sounds an awful lot like LessWrong, but competition can be healthy[1] ;) 

  1. ^

    I think this is less likely to be true of things like "places of discussion", because splitting the conversation erodes common knowledge, but I think it's fine/maybe good to experiment here.

I didn't scrutinize it, but at a high level, the new intro article is the best I've seen yet for EA. Very pleased to see it!

I think 20% might be a decent steady-state but at the start of their involvement I think I'd like to see new aspiring community builders do something like six months on intensive object-level work/research.

Fwiw, my role is similar to yours, and granted that LessWrong has a much stronger focus on Alignment, I currently feel that a very good candidate for the #1 reason I will fail to steer LW to massive impact is that I'm not and haven't been an Alignment researcher (and perhaps Oli hasn't been either, but he's a lot more engaged with the field than I am).

Again, thanks for taking the time to engage.

I think this post is maybe a format that the EA Forum hasn't done before: it's intended to be a crowd-sourced repository of advice. This is also maybe not obvious because I "seeded" it with a lot of content I thought was worth sharing (and also to make it less sad if it didn't get many contributions – so far it's gotten a few).

As I wrote:

I've seeded this post with a mix of advice, experience, and resources from myself and a few friends, plus various good content I found on LessWrong through the Relationships tag. The starting content is probably not the very best content possible (if it was, why make this a thread?), but I wanted to launch with something. Don't anchor too hard on what I thought to include! Likely as better stuff is submitted, I'll move some stuff out of the post text and into comments to keep the post from becoming absurdly long.

I also solicit disagreement:

Please disagree with advice you think is wrong! (It probably makes sense to add notes/links about differing views next to advice items in the main text, so worth the effort to call out stuff you disagree with.)

If you're okay with it, I will add your points of disagreement into the main post.

It is definitely not comprehensive! I put this together within a few hours over the weekend; I did not aim to start off with everything that's relevant. (Somehow it still reached 10k words.) If someone has good content on childbirth, pregnancy, etc., I think that would be great to add. On reflection, I'm in favor of it being a behemoth, with people hunting for the sections relevant to them and/or later distillation.

it's not clear to me that a lot of this is actually very good advice for a lot of people.

I agree, because giving universal advice is extremely hard. The approach I'd advise for this is for people to read it, and if it seems like a good idea for them, try it. But also "consider reversing all advice you hear".

I'm also not sure why you linked to a list of 'negative expectation value' 'infohazard' questions that you don't recommend people do?

Because it's funny and fun. Note that I didn't write the text around it. Like most of the text, it's stuff I copied in. Also, it's not a major world-ending infohazard. It's clearly marked. And as a commenter wrote on LW, it's only an infohazard if your relationship is bad (I think his bar for good relationships is too high, but I agree that healthier relationships/people aren't at as much risk).

And finally, most bizarrely... why is 50% of the 'sex' advice section a survey on what it is like to have sex with one particular guy? 

Going back to how I was just seeding the crowd-sourced post with content I had, that was something I had on hand. I didn't have other material and didn't feel like going hunting for advice, but thought it'd be good if other people had things they wanted to recommend adding to that section. As I write in that section:

How to have good and healthy sex is beyond the intended scope for this thread, but I welcome people to add links to external resources here (or submit them via comments with spoiler text/warning, or the Google Form).

I agree that it'd be much better if that one link weren't 50% of the list! But I do think it's a helpful read for people who don't find it TMI.
