howdoyousay?

Comments

Buck's Shortform
I feel like an easy way to get lots of upvotes is to make lots of vague critical comments about how EA isn’t intellectually rigorous enough, or inclusive enough, or whatever. This makes me feel less enthusiastic about engaging with the EA Forum, because it makes me feel like everything I’m saying is being read by a jeering crowd who just want excuses to call me a moron.

Could you unpack this a bit? Is it the originating poster who makes you feel there's a jeering crowd, or the people upvoting the OP who make you feel the jeers?

As counterbalance...

Writing, and sharing your writing, is often how you come to know your own thoughts. I often recognise the kernel of truth someone is getting at before they've articulated it well, both in written posts and in conversation. I'd rather encourage someone for getting at something, even if the execution was lacking, and then guide them to do better. I'd especially prefer to do this given I know personally how difficult it is to find time to perfect a post alongside a job and other commitments.

This is even more the case on topics that haven't been explored much, such as biases in thinking common to EAs, or diversity issues. I accept that in liberal circles, being critical on the basis of diversity and inclusion or cognitive biases is a good signalling win, and you might expect the same to hold in EA. But I'm reminded of something Will MacAskill said on an 80k podcast about 8 months ago: that he lay awake thinking his reputation would be in tatters after posting on the EA Forum, that his post would be torn to shreds (it wasn't). For quite some time I was surprised at the diversity elephant in the room in EA, and welcomed it when these critiques came forward. But I was in the room and not pointing out the elephant for a long time because, like Will, I feared being torn to shreds for putting myself out there, and I don't think this is unusual.

I also think that criticisms of underlying trends in groups are really difficult to make in a substantive way, and though they often come across as put-downs from someone who wants to feel bigger, it's not always clear whether that's down to authorial intent or the reader's perception. I still think there's something to be taken from them. I remember a scathing article about yuppies who listen to NPR to feel educated and connected to the world, purely for signalling purposes. It was very mean-spirited, but it definitely gave me food for thought about my media consumption and what I am (not) achieving from it. A healthy attitude for a community is a willingness to find usefulness in seemingly threatening criticism. Since all groups are vulnerable to the effects of polarisation and fractiousness, this attitude could be a good protective element.

So in summary: even if someone could have articulated their 'vague critical comments' better, I think it's good to encourage the start of a conversation on a topic which is important but not easy to bring up or articulate. So I would say go ahead and upvote that criticism whilst giving feedback on ways to improve it. If that person hasn't nailed it, they've at least started the conversation, and maybe someone else will deliver the argument better. And I think there is a role for us as a community in being curious and open to 'vague critical comments' and finding the important message in them; that will prove more useful than the alternative of shunning them.

Should we think more about EA dating?

Hypocrite 101 here as I am dating / have dated EAs, but anyway...

The problem this post is trying to solve is "EAs are a bit too weird for other people", and the proposed solution is "let's pair up romantically". This solution would, in my opinion, aggravate another significant problem, which is best laid out by this post here about the risks of insularity within the community from excessive value alignment. The writer makes a much more rigorous case than I am about to, but I think one element of it applies here: the quote "EA will miss its ambitious goal by working with only an insular subset of the people it is trying to save."

Having friendships / relationships outside EA would diversify your own thinking as well as potentially diversifying the pool of people interested in EA / EA thinking. So if you accept the post's argument that insularity / strong value alignment is a threat, then friendships / relationships outside of EA are intrinsically valuable. Dating other EAs does not in itself create cultural and community insularity, but encouraging it as the solution to EAs not being great at external social integration would entrench that insularity.

The best counter-argument is that promoting friendships / relationships / any social interaction outside of EA won't go far enough: the real problem is insularity at leadership levels within EA, and that's what we should break, leaving the non-dating alone. Which I think is fair. But notwithstanding that, there are still benefits for individuals or local groups (e.g. city-based) in external integration.

Other counter-arguments:

Most liberals marry liberals; most cultists marry cultists; people marry within their own religion (and hunt for each other on dating apps). Isn't it normal for humans to mate assortatively?

Or why can't we have friends who bring us diversity instead?

Longtermism ⋂ Twitter

Some words of caution here, which I want to keep brief so that (ideally) someone can steel-man them and take them down.

The tl;dr version: Twitter excels at turning nuanced things into misinformed, outraged memes.

First off, EA, and long-termism in particular, has some vocal detractors who do not seem to share the norms of most people on the EAF.

Second, Twitter is a forum which people who dislike an event / idea can easily weaponise to discredit both the thing and the poster, often through (sometimes deliberate) misinterpretation. So it's plausible that long-termist posts on Twitter - if not steel-manned rigorously beforehand - would be vulnerable to this. For example, any post not triple-checked could be retweeted with a misinterpreting comment arguing that long-termism is a bad ideology, provoking a negative meme-and-outrage cascade / pile-on.

Third, even with excellent codes of conduct in place (and I agree with disseminating the EAF CoC more widely where possible), an actor who wants to misinterpret something can and will. There is a fairly substantial risk that, should this happen, it would skew the discourse on long-termism outside EA for quite some time, and it may prove very challenging to reset this.

The above are some hot takes which I genuinely considered *not* posting, because I haven't had time to mull them over much, but I thought it better to post than not.

Also, I genuinely hope I'm wrong (especially because I hate being the Helen Lovejoy "won't someone please think of the (future) children?!" voice!). I think it would be helpful for someone to give some arguments against these points or propose some potential mitigations, perhaps ones seen in other Twitter communities.

Please use art to convey EA!

Kurzgesagt communicates some complex ideas using visualisations and reframing which are quite effective, and which we could possibly learn from. Their video on time is a good example of this.

Please use art to convey EA!

Thank you for posting this. I massively laud giving slightly 'left field' approaches a go, and I think you've raised an important issue about how we communicate the EA movement and its thinking generally.

My reply rests on a few assumptions, which I hope are not too unfair - happy for critique / challenge on them.

The OP's point about art is worth considering in the context of another question: how can we communicate our thinking (in all its diversity and complexity) accurately and effectively to people outside the community?

Whilst I laud the OP's ambition, it's worth thinking about an intermediate step between logical reasoning (which I observe is our default) and art: using metaphor and analogy to illustrate points. (To note: I believe some animal charities do this already, using the Schindler's car example to influence actions regarding factory farming.)
Before giving arguments in favour, here's an example: a video explaining a new type of cancer treatment, CAR-T cell therapy.

Some brief arguments in favour:

1) Metaphors / analogies can create an 'aha' moment where the outline of a complex idea is grasped easily and retained by the listener, which they can then layer nuance on top of. People might otherwise not grasp certain complex EA ideas so easily.

2) Whilst explaining a position in logical sequence with great attention to detail is often effective for influencing (and is the main communication approach observed on this forum), I assume that lots of people are not 'hooked' by that approach, or find the line of reasoning too abstract to want to change their mindset or behaviour in response to it.

3) Metaphors / analogies can be more memorable, and therefore transfer from person to person or 'spread' better than prosaic reasoning.

4) If you assume that people often have limited attention spans and imperfect recall, then points 1-3 are even stronger arguments in favour of using metaphors more.

The examples the OP chooses (e.g. Dr Strangelove) show that communicating an idea through art requires the artist's ambition to be matched with huge skill, so this strikes me as 'high risk, high gain' territory. But we can probably make some decent gains by developing metaphorical or allegorical ways of communicating EA thinking, testing them out and iterating... and THEN seeing whether the people we want to reach take our messages in better.