Linch

"To see the world as it is, rather than as I wish it to be."

I work for the EA research nonprofit Rethink Priorities. Despite my official title, I don't really think of the stuff I do as "research." When I think of the word "research," I think of people expanding the frontiers of the world's knowledge, whereas I'm often more interested in expanding the frontiers of my own knowledge, and/or disseminating that knowledge to the relevant parties.

I'm also really interested in forecasting.

People may or may not also be interested in my comments on Metaculus and Twitter:

Metaculus: https://pandemic.metaculus.com/accounts/profile/112057/

Twitter: https://twitter.com/LinchZhang

Comments

Is this a good way to bet on short timelines?

To the extent funding constraints are real, betting on short timelines can be seen as just a special case of valuing money now over money later more than other people do.

In that regard, it'd be reasonable to figure out mechanisms for borrowing against your future income. I don't know how difficult this is in practice, but it's plausible you could arrange it with other EAs if, for various reasons (e.g., counterparty risk), standard financial institutions won't let you do this.
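
As a toy illustration of why borrowing works as a timelines bet (all numbers below are hypothetical, and this ignores default risk and the lender's side of the trade):

```python
# Toy sketch with made-up numbers: borrowing as a bet on short timelines.
# If you assign probability p to a transformative event happening before
# repayment comes due (after which repayment matters far less to you),
# the loan's expected cost to you shrinks accordingly.

principal = 10_000  # hypothetical loan size, in dollars
rate = 0.05         # hypothetical annual interest rate
years = 5           # hypothetical time until repayment
p_short = 0.3       # your credence in "short timelines" before repayment

repayment = principal * (1 + rate) ** years
# Crude model: repayment only "counts" in long-timelines worlds.
expected_cost = (1 - p_short) * repayment

print(f"You receive now: ${principal:,.2f}")
print(f"Repayment due:   ${repayment:,.2f}")
print(f"Expected cost:   ${expected_cost:,.2f}")
```

Symmetrically, someone confident in long timelines should be happy to take the lender's side at a suitable interest rate.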

Linch's Shortform

I'm interested in a collection of backchaining posts by EA organizations and individuals that trace back from what we want -- an optimal, safe world -- to specific actions that individuals and groups can take.

These can be at any level of granularity, though the more precise, the better.

Interested in this for any of the following categories:

  • Effective Altruism
  • Longtermism
  • General GCR reduction
  • AI Safety
  • Biorisk
  • Institutional decision-making

Introducing High Impact Athletes

Threading etiquette is confusing! It was unclear to me whether the right person to respond to was Ben, Marcus, or you. So I went for the most top-level comment that seemed reasonable. 

In retrospect I should have just commented on the post directly. 

Introducing High Impact Athletes

We all have different beliefs and intuitions about the world, including about how other people see the world. 

Compared to the rest of us, Marcus has a strong comparative advantage in a) having an intuition for which messages work for professional athletes and are easier for them to relate to, and, more importantly, b) having access to a network for testing different messages.

I would personally be excited if, rather than debating at length about what will or won't appeal to a hypothetical audience, Marcus just went out and experimented with different messages on the actual audience that he has.

The results may or may not surprise us.

If Causes Differ Astronomically in Cost-Effectiveness, Then Personal Fit In Career Choice Is Unimportant

OK, so we have three different interpretations of what the confidence tag means for a very simple conditional.

If Causes Differ Astronomically in Cost-Effectiveness, Then Personal Fit In Career Choice Is Unimportant

I'm a little confused about your confidence tag. Your thesis is of the form

If P, then Q.

where P = "Causes Differ Astronomically in Cost-Effectiveness" and Q = "Personal Fit In Career Choice Is Unimportant."

You've tagged your post "Unlikely." It's not clear to me from the confidence tag whether you mean to imply that you think Q is unlikely by itself, or whether the conditional "if P, then Q" is unlikely. From context, I think it's the former, but the latter seems like a reasonable reading as well.
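
To make the two readings concrete (the notation is mine, not the original post's), treating credence in a conditional as a conditional probability:

```latex
% Two readings of the "Unlikely" tag (my notation, not the post's):
\[
  \text{Reading 1: } \Pr(Q) \text{ is low}
  \qquad\qquad
  \text{Reading 2: } \Pr(Q \mid P) \text{ is low}
\]
% where P = "causes differ astronomically in cost-effectiveness"
% and   Q = "personal fit in career choice is unimportant".
```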

What posts do you want someone to write?

I'd be interested in a post by a historian (or very serious amateur historian) on what EAs can learn from the rise and fall of Mohism, the earliest proto-consequentialist school of philosophy/social movement that I'm aware of*.

*I'd also be interested in a more general summary post detailing other proto-consequentialist schools of philosophy and social movements.

Make a Public Commitment to Writing EA Forum Posts

If anybody finds minor monetary commitments more helpful than (or helpful in addition to) minor reputational commitments, I'm happy to accept promises of relatively small amounts of money (I recommend more than $10 and less than $150), payable conditional upon your not writing the posts you said you'd write.

I further commit to counterfactually spending the money on pure hedonism, so you don't have to worry about being improperly incentivized by the thought of me donating it instead.

(I've been on the other side of these commitments in the past and found them quite motivating.)

nickmatt's Shortform

Semi-aside: Have you read A Canticle for Leibowitz? I barely remember the exact details of the book, but I read it when I was very young, and plausibly it affected my priors on nuclear war somewhat, in a subtle/sneaky way.
