
reallyeli's Comments

If you value future people, why do you consider near term effects?

I suppose an example would be that increasing economic growth in a country doesn't matter if the country later gets blown up or something.

If you value future people, why do you consider near term effects?
Like how would I know if the world was more absorber-y or more sensitive to small changes?

I'm not sure; that's a pretty interesting question.

Here's a tentative idea: using the evolution of brains, we can conclude that whatever sensitivity the world has to small changes, it can't show up *too* quickly. You could imagine a totally chaotic world, where the whole state at time t+(1 second) is radically different depending on minute variations in the state at time t. Building models of such a world that were useful on 1 second timescales would be impossible. But brains are devices for modelling the world that are useful on 1 second timescales. Brains evolved; hence they conferred some evolutionary advantage. Hence we don't live in this totally chaotic world; the world must be less chaotic than that.

It seems like this argument gets less strong the longer your timescales are, as our brains perhaps faced less evolutionary pressure to be good at prediction on timescales of like 1 year, and still less to be good at prediction on timescales of 100 years. But I'm not sure; I'd like to think about this more.
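To make the contrast concrete, here's a toy sketch (my own illustration, not anything rigorous; the maps and parameters are just stand-ins): compare a chaotic map, where a perturbation of one part in a billion at time t becomes macroscopic within a few dozen steps, with a contracting map, where the same perturbation gets absorbed. A brain could plausibly build useful short-timescale models of the second kind of world, but not the first.

```python
# Toy comparison of an "amplifier" world and an "absorber" world (illustrative only).
# The logistic map at r = 4 is chaotic: two states that start a billionth apart
# end up on totally different trajectories. The contracting map pulls every state
# toward the same point, so the same tiny difference is damped away.

def logistic(x, r=4.0):
    # Chaotic at r = 4; nearby starting points diverge quickly.
    return r * x * (1 - x)

def contracting(x, target=0.5, rate=0.5):
    # Moves the state halfway toward a fixed target each step, damping differences.
    return x + rate * (target - x)

def final_gap(step, x0, eps=1e-9, n=30):
    # How far apart are two trajectories after n steps, if one starts eps away?
    a, b = x0, x0 + eps
    for _ in range(n):
        a, b = step(a), step(b)
    return abs(a - b)

print("chaotic map, gap after 30 steps:    ", final_gap(logistic, 0.3))
print("contracting map, gap after 30 steps:", final_gap(contracting, 0.3))
```

In the chaotic case the billionth-of-a-unit difference grows to order one; in the contracting case it shrinks to essentially nothing. The empirical question is which of these regimes better describes the parts of the world our actions touch, and on what timescales.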

If you value future people, why do you consider near term effects?

Hey, glad this was helpful! : )

To apply this to conception events - imagine we changed conception events so that girls were much more likely to be conceived than boys (say because in the near term that had some good effects, e.g. women tended to be happier at the time). My intuition here is that there could be long-term effects of indeterminate sign (e.g. from increased/decreased population growth) which might dominate the near-term effects. Does that match your intuition?

Yes, that matches my intuition. This action creates a sweeping change in a really complex system; I would be surprised if there were no unexpected effects.

But I don't see why we should believe all actions are like this. I'm raising the "long-term effects don't persist" objection, arguing that it seems true of *some* actions.

What would a pre-mortem for the long-termist project look like?
I'd maybe give a 10% probability to long-termism just being wrong.

What could you observe that would cause you to think that longtermism is wrong? (I ask out of interest; I think it's a subtle question.)

What are some historical examples of people and organizations who've influenced people to do more good?

Florence Nightingale? Martin Luther King Jr.? Leaders of social movements? It seems to me that a lot of "standard examples of good people" are like this; did you have something else in mind?

If you value future people, why do you consider near term effects?

(Focusing on a subtopic of yours, rather than engaging with the entire argument.)

All actions we take have huge effects on the future. One way of seeing this is by considering identity-altering actions. Imagine that I pass my friend on the street and I stop to chat. She and I will now be on a different trajectory than we would have been otherwise. We will interact with different people, at a different time, in a different place, or in a different way than if we hadn’t paused. This will eventually change the circumstances of a conception event such that a different person will now be born because we paused to speak on the street.

I'm not so sure "all actions we take have huge effects on the future." It seems like a pretty interesting empirical question. I don't find this analogy supremely convincing; it seems that life contains both "absorbers" and "amplifiers" of randomness, and I'm not sure which are more common.

In your example, I stop to chat with my friend vs. not doing so. But then I just go to my job, where I'm not meeting any new people. Maybe I always just slack off until my 9:30am meeting, so it doesn't matter whether I arrive at 9am or at 9:10am after stopping to chat. I just read the Internet for ten more minutes. It looks like there's an "absorber" here.

Re: conception events — I've noticed that discussion of this topic tends to use conception as a stock example of an amplifier. (I'm thinking of Tyler Cowen's Stubborn Attachments.) Notably, it's an empirical fact that conception works that way (e.g. with many sperm, all with different genomes, competing to fertilize the same egg). If conception did not work that way, would we lower our belief in "all actions we take have huge effects on the future"? What sort of evidence would cause us to lower our belief in that?

Now, when the person who is conceived takes actions, I will be causally responsible for those actions and their effects. I am also causally responsible for all the effects flowing from those effects.

Sure, but what about the counterfactual? How much does it matter to the wider world what this person's traits are like? You want JFK to be patient and levelheaded, so he can handle the Cuban Missile Crisis. JFK's traits seem to matter. But most people aren't JFK.

You might also have "absorbers," in the form of selection effects, operating even in the JFK case. If we've set up a great political system such that the only people who can become President are patient and levelheaded, it matters not at all whether JFK in particular has those traits.

Looking at history with my layman's eyes, it seems like JFK was groomed to be president by virtue of his birth, so it did actually matter what he was like. At the extreme of this, kings seem pretty high-variance. So affecting the conception of a king matters. But now what we're doing looks more like ordinary cause prioritization.

What is the average EA salary?

I don't know — sounds like you might have stronger views on this than me! : )

What is the average EA salary?

This is gonna vary a lot because there's not a "typical EA organization" — salary is determined in large part by what the market rate for a position is, so I'd expect e.g. a software engineer at an EA organization to be paid about the same as a software engineer at any organization.

Is there a more specific version of this question you could ask? Why do you want to know / what's the context?
