PabloAMC 🔸

Quantum algorithm scientist @ Xanadu.ai
1068 karma · Joined · Working (6–15 years) · Madrid, Spain

Bio

Participation: 5

Hi there! I'm an EA from Madrid. I am currently finishing my Ph.D. in quantum algorithms and would like to focus my career on AI Safety. Send me a message if you think I can help :)

Comments (124)

I am a bit confused by 2b. I would argue that the spirit of the 10% pledge is to donate part of your potential income. So if you have offers of $2X but instead take a direct-impact job that you deem highly impactful for just $X, then you are already donating close to $X. In fact, the condition

and are able to receive it at any point in the future if you wish.

can be looked at the other way round: if you can take the $2X job now but may not be able to in the future (say, because you are changing fields), you may be donating more than $X this year.
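The foregone-income argument above can be sketched numerically. This is a minimal illustration, not part of any official pledge accounting: the function name, the 10% cash rate, and the $50,000 figure are all assumptions chosen for the example.

```python
def effective_donation(market_salary: float, actual_salary: float,
                       cash_donation_rate: float = 0.10) -> float:
    """Total effective giving under the foregone-income view:
    cash donated from the actual salary, plus the salary gap given up
    by taking the lower-paid direct-impact job."""
    foregone = market_salary - actual_salary
    cash = cash_donation_rate * actual_salary
    return cash + foregone

# Taking a $X job despite a $2X offer: the foregone $X dwarfs the 10% cash pledge.
X = 50_000
total = effective_donation(market_salary=2 * X, actual_salary=X)
print(total)          # 55000.0: $5,000 cash plus $50,000 in foregone salary
```

On this view the implicit donation is over half of the market salary, which is why counting only the cash 10% understates what the pledger is giving up.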

For what it's worth, I think keeping cause neutrality is important: the spirit of the 10% pledge is to do the most good, not to choose specific causes. I would find it reasonable to highlight reasons why one might consider cause X particularly effective, but not to give a final answer on this.

Imagine you are in a couple, fully share finances, and are the only one currently earning an income. Would it be reasonable to donate half of 10%?

Suppose someone takes a direct-impact job that lowers their salary by a double-digit percentage, particularly when changing careers. What is the best rule of thumb for incorporating that into the amount pledged?

I think you should explain in this post which pledge people may take :-)

I am particularly interested in how to make the pledge more concrete. I have always thought that the 10% pledge is somewhat incomplete because it does not consider one's career. I think it would be useful to make the career pledge more actionable.

1/6 might be high, but it is perhaps not too many orders of magnitude off. There is an interview on the 80,000 Hours podcast (https://80000hours.org/podcast/episodes/ezra-karger-forecasting-existential-risks/) about a forecasting contest in which experts and superforecasters estimated AI extinction risk this century at 1% to 10%. And after all, AI is likely to dominate the overall estimate.

It’s inevitable that tulip-farmer wages will go down if we airdrop in an additional tulip farmer.

Maybe what is inevitable is that the additional person will start producing something else.
