
Here’s a pattern that we noticed. You probably have some large goal you’re optimising for, like “learn to build great software”. But day-to-day, you optimise for a proxy goal, like “write code which my supervisor says is good”.

Proxy goals are often useful: they’re easier to evaluate, which makes it easier to make quick decisions.

For example, if your big goal is to write a bestselling novel, that's hard to evaluate: to find out whether you're achieving it, you might need to survey readers or even publish the novel. It's easier to use the proxy of writing a novel that you yourself think is great.

 

But sometimes, it's unclear whether your proxy actually tracks your big goal.

For example, if your big goal is to help solve the alignment problem, you might find yourself using the proxy goal of doing things that other people in the EA community think are prestigious.

However, there’s uncertainty here. Prestige within the community doesn’t always align with impact. Working on an established research agenda might be immediately impressive, while building your own research agenda might be confusing or illegible to other community members. 

Examples of proxy goals that might be misaligned with big goals:

  • Big goal: doing impactful research. Proxy goal: publishing papers in prestigious venues.
    • Possible failure mode: doing research that is easier/faster to publish but less impactful.
  • Big goal: provide for my family. Proxy goal: work really hard. 
    • Possible failure mode: working so hard you are not emotionally present for your family.
  • Big goal: reduce the risk of a nuclear strike. Proxy goal: reach a powerful position in the US government.
    • Possible failure mode: to climb the ranks, you take actions that have a negative impact, and it’s unclear if this negates your hypothetical future positive impact in expectation.

What to do about this

  1. Be aware that you are using proxy goals.
  2. Periodically step back to notice how your proxy goals misalign with your big goals.
  3. Re-engineer your proxy goals or incentive environment to make sure that the work you’re doing actually promotes your big goal.

 

Thanks to Isaac Dunn and Jarred Filmer for great conversations that led to this.

Comments

So I think I agree with the general point, but having thought about this a fair bit recently too, there is something tricky missing from your presentation. You write:

  2. Periodically step back to notice how your proxy goals misalign with your big goals.

  3. Re-engineer your proxy goals or incentive environment to make sure that the work you’re doing actually promotes your big goal.

The issue is that lots of things actually do require deep, sustained effort to make progress on, or deep, sustained effort to build the necessary skills for. So to some extent you can't just periodically re-adjust: you really do need to place a bet on something and double down on it, or you may forever be tacking your trajectory and never really having any impact on anything. This is really hard to figure out.

More speculatively, there is a broader critique of EA here too, which has been mentioned recently in other posts: it just feels nicer, cooler, and more interesting to keep 'EA' as your central thing (the core part of your identity) rather than deciding 'OK, the best thing to do is to get a job in, say, this government department' and just doubling down on that. Once you do the latter, it might feel like you leave EA behind more than you wanted to.
