Topic Contributions


What psychological traits predict interest in effective altruism?

Fair. In that case, though, this seems like a necessary prerequisite for doing that deeper investigation, so it's valuable in that respect.

What psychological traits predict interest in effective altruism?

At least for myself, it wouldn't have been obvious in advance that there would be exactly two factors, as opposed to (say) one, three or four.

What psychological traits predict interest in effective altruism?

Perhaps more educated people are more happy with their career and thus more reluctant to change it?

Or just more invested in it - if you've spent several years acquiring a degree in a topic, you may be quite reluctant to go do something completely different.

For future studies, it might be worth rephrasing this item so that this doesn't act as a confounder for the results? I'd expect people in their early twenties to answer it quite differently than people in their early forties.

FLI launches Worldbuilding Contest with $100,000 in prizes

I was thinking that if they insist on requiring it (and I get around to actually participating), I'll just iterate on some prompts on wombo.art or similar until I get something decent.

Sasha Chapin on bad social norms in EA

Because it also mentions woo, so I think it's talking about a broader class of unjustified beliefs than you think.

My earlier comment mentioned that "there are also lots of different claims that seem (or even are) irrational but are pointing to true facts about the world." That was intended to touch upon "woo"; e.g. meditation used to be, and to some extent still is, considered "woo", but there nonetheless seem to be reasonable grounds to think that there's something of value to be found in meditation (despite there also being various crazy claims around it).

My above link mentions a few other examples (out-of-body experiences, folk traditions, "Ki" in martial arts) that have claims around them that are false if taken as the literal truth, but are still pointing to some true aspect of the world. Notably, a policy of "reject all woo things" could easily be taken to imply rejecting all such things as superstition that's not worth looking at, thus missing out on the parts of the woo that were actually valuable.

IME, the more I look into them, the more I find that "woo" things I'd previously rejected as not worth looking at, because they seemed obviously false, are actually pointing at significantly valuable things. (Even if there is also quite a lot of nonsense floating around those same topics.)

I agree, but in that case you should make it clear how your interpretation differs from the author's.

That's fair.

Sasha Chapin on bad social norms in EA

What makes you think it isn't? To me it seems both like a reasonable interpretation of the quote (private guts are precisely the kinds of positions you can't necessarily justify, and it's talking about having beliefs you can't justify) and like a dynamic that I recognize as having occasionally been present in the community. Fortunately, posts like the one about private guts have helped push back against it.

Even if this interpretation wasn't actually the author's intent, choosing to steelman the claim in that way turns the essay into a pretty solid one, so we might as well engage with the strongest interpretation of it.

Sasha Chapin on bad social norms in EA

There are a few different ways of interpreting the quote, but there's a concept of public positions and private guts. Public positions are ones that you can justify in public if pressed on, while private guts are illegible intuitions you hold which may nonetheless be correct - e.g. an expert mathematician may have a strong intuition that a particular proof or claim is correct, which they will then eventually translate to a publicly-verifiable proof. 

As far as I can tell, lizards probably don’t have public positions, but they probably do have private guts. That suggests those guts are good for predicting things about the world and achieving desirable world states, as well as being one of the channels by which the desirability of world states is communicated inside a mind. It seems related to many sorts of ‘embodied knowledge’, like how to walk, which is not understood from first principles or in an abstract way, or habits, like adjective order in English. A neural network that ‘knows’ how to classify images of cats, but doesn’t know how it knows (or is ‘uninterpretable’), seems like an example of this. “Why is this image a cat?” -> “Well, because when you do lots of multiplication and addition and nonlinear transforms on pixel intensities, it ends up having a higher cat-number than dog-number.” This seems similar to gut senses that are difficult to articulate; “why do you think the election will go this way instead of that way?” -> “Well, because when you do lots of multiplication and addition and nonlinear transforms on environmental facts, it ends up having a higher A-number than B-number.” Private guts also seem to capture a category of amorphous visions; a startup can rarely write a formal proof that their project will succeed (generally, if they could, the company would already exist). The postrigorous mathematician’s hunch falls into this category, which I’ll elaborate on later.

As another example, in the recent dialogue on AGI alignment, Yudkowsky frequently referenced having strong intuitions about how minds work that come from studying specific things in detail (and from having "done the homework"), but which he does not know how to straightforwardly translate into a publicly justifiable argument.

Private guts are very important and arguably the thing that mostly guides people's behavior, but they are often also ones that the person can't justify. If a person felt like they should reject any beliefs they couldn't justify, they would quickly become incapable of doing anything at all.

Separately, there are also lots of different claims that seem (or even are) irrational but are pointing to true facts about the world.

Isaac Asimov: The Last Question

This is indeed a wonderful story!

This version has nicer line breaks, in my opinion.

Here's an audio version read by Leonard Nimoy.

How to succeed as an early-stage researcher: the “lean startup” approach

Draft and re-draft (and re-draft). The writing should go through many iterations. You make drafts, you share them with a few people, you do something else for a week. Maybe nobody has read the draft, but you come back and you’ve rejuvenated your wonderful capacity to look at the work and know why it’s terrible.

Kind of related to this: giving a presentation about the ideas in your article is something you can use as a form of draft. If you can't get anyone to listen to a presentation, or don't want to give one quite yet, you can pick some people whose opinion you value and just make a presentation where you imagine that they're in the audience.

I find that if I'm thinking of how to present the ideas in a paper to an in-person audience, it makes me think about questions like "what would be a concrete example of this idea that I could start the presentation with, that would grab the audience's attention right away". And then if I come up with a good way of presenting the ideas in my article, I can rewrite the article to use that same presentation.

(Unfortunately, I myself have mostly taken this advice in its reverse form: I've first written a paper and then given a presentation of it afterwards, at which point I've realized that this is what I actually should have said in the paper itself.)
