Mark Xu

I do alignment research at the Alignment Research Center. Learn more about me at markxu.com/about


Comments

Increasing Demandingness in EA

I expect 10 people donating 10% of their time to be less effective than 1 person donating 100% of their time, because the part-time donors never reap the benefits of accumulated experience. Example: if people work for 40 years, then 10 people each donating 10% of their time gives you 10 labor-years with 0 years of experience, 10 with 1 year, 10 with 2 years, and 10 with 3 years; whereas one person doing EA work full-time gives you 1 year with 0 years of experience, 1 with 1, 1 with 2, and so on up to 1 with 39. I expect 1 year with 20 years of experience to plausibly be as good/useful as 10 years with 3 years of experience. Caveats to the simple model:

  • labor-years might be more valuable during the present
  • if you're volunteering for a thing that is similar to what you spend the other 90% of your time doing, then you still get better at the thing you're volunteering for
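The toy model above can be made concrete with a short sketch. The scenario (40-year careers, 10% vs. 100% donation) is from the comment; the specific assumption that a labor-year's value grows linearly with prior experience is mine, purely for illustration:

```python
# Toy model: 10 people each donating 10% of a 40-year career vs.
# 1 person working full-time for 40 years. Both give 40 FTE-years total;
# they differ in how much experience stands behind each labor-year.

CAREER_YEARS = 40

# 10 part-timers: each contributes 4 FTE-years, at experience levels 0-3.
part_time = [exp for _ in range(10) for exp in range(4)]

# 1 full-timer: 40 FTE-years at experience levels 0-39.
full_time = list(range(CAREER_YEARS))

assert len(part_time) == len(full_time) == 40  # same total labor

# Illustrative assumption (not from the comment): a labor-year's value
# grows linearly with prior experience, value = 1 + experience.
def total_value(years):
    return sum(1 + exp for exp in years)

print(total_value(part_time))  # 10 * (1+2+3+4) = 100
print(total_value(full_time))  # sum(1..40) = 820
```

Under this (admittedly crude) linear-growth assumption, the full-timer's 40 labor-years are worth several times the part-timers' 40, which is the direction of the claim; steeper returns to experience would widen the gap further.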

I make a similar argument here.

'Dropping out' isn't a Plan

One key difference is that "continuing school" usually has a specific mental image attached, whereas "dropping out of school" is much vaguer, making the two options difficult to compare.

My bargain with the EA machine

Many people in EA depart from me here: they see choices that do not maximize impact as personal mistakes. Imagine a button that, if you press it, would cause you to always take the impact-maximizing action for the rest of your life, even if it entails great personal sacrifice. Many (most?) longtermist EAs I talk to say they would press this button – and I believe them. That's not true of me; I'm partially aligned with EA values (since impact is an important consideration for me), but not fully aligned.

I think there are people (e.g. me) who value things besides impact but would still press the button because of golden-rule-type reasoning. Many people optimize for impact to the point where it makes them less happy.

How Many People Are In The Invisible Graveyard?

A title like "How many lives might have been saved given an earlier COVID-19 vaccine rollout?" would have given me much more information about what the post was about than the current title, which I find very vague.

Things I recommend you buy and use.

Kindles are smaller, have backlights, and the Kindle store is a good user experience.

Consider trying the ELK contest (I am)

Note: I work for ARC.

I would consider someone a "pretty good fit" (whatever that means) for alignment research if they started out with a relatively technical background, e.g. an undergrad degree in math/CS, without having really engaged with alignment before, and were able to come up with a decent proposal after:

  • ~10 hours of engaging with the ELK doc.
  • ~10 hours of thinking about the document and resolving confusions they had, which might involve asking some questions to clarify the rules and the setup.
  • ~10 hours of trying to come up with a proposal.

If someone starts from having thought about alignment a bunch, I would consider them a potentially "pretty good researcher" if they were able to come up with a decent proposal in 2-8 hours. I expect many existing (alignment) researchers to be able to come up with proposals in <1 hour.

Note that I'm saying "if (can come up with a proposal in N hours), then (might be a good alignment researcher)" and not claiming the converse also holds, i.e. it is not the case that "if (might be a good alignment researcher), then (can come up with a proposal in N hours)".

Consider trying the ELK contest (I am)

Can confirm we would be interested in hearing what you came up with.

Announcing "Naming What We Can"!

Ben Pace, Ben Kuhn, Ben Todd, Ben West, and Ben Garfinkel should all become the same person, to avoid confusion.

Things I recommend you buy and use.

Thanks for writing this up. Just ordered a Misto, elastic laces, and a Waterpik. My own personal list of recommendations is at https://markxu.com/things, but it lacks justifications. Feel free to ask me about any of the items, though.
