Sometimes the high impact game feels weird, get over it.
I have been in lots of conversations recently where people expressed discomfort with the longtermist community's spending (particularly at events).
I think my general take here is "yeah, I can see why you think this, but get over it". Playing on the high impact game board when you have $40B in your bank account and only a few years to use it involves acting like you are not limited financially. If top AI safety researchers want sports cars because it will help them relax and therefore be 0.01% more productive (and I trust their judgment and value alignment), they are welcome to my money. Giving them my money is winning, and as far as I am concerned it's a far better use of money than basically anything else I could do. To be clear, I know that there are optics issues, community health issues, etc., but sometimes we can spend money without worrying about these things (e.g. retreats for people already familiar with LT).
Yes, this would feel weird, but am I really going to let my own feelings of weirdness stop me from helping billions of people in expectation? That feels much weirder.
Superforecasters can predict more accurately if they make predictions in 1% increments rather than 2% increments. Whether they can make accurate predictions at finer increments either hasn't been studied or has found negative evidence. 0.01% increments are far below anything that people regularly predict on; there's no way to develop calibration at that level. In my comment, I meant to point out that anyone who thinks they're calibrated enough to talk about 0.01% differences, or even anything close to that, is clearly not a fantastic researcher and we …