Sometimes the high-impact game feels weird; get over it.
I have been in lots of conversations recently where people expressed their discomfort with the longtermist community's spending (particularly at events).
I think my general take here is "yeah, I can see why you think this, but get over it". Playing on the high-impact game board when you have $40B in the bank and only a few years to use it involves acting like you are not limited financially. If top AI safety researchers want sports cars because it will help them relax and therefore be 0.01% more productive (and I trust their judgment and value alignment), they are welcome to my money. Giving them my money is winning, and as far as I am concerned it's a far better use of money than basically anything else I could do. To be clear, I know there are optics issues, community health issues, etc., but sometimes we can spend money without worrying about these things (e.g. retreats for people already familiar with LT).
Yes, this would feel weird, but am I really going to let my own feelings of weirdness stop me from helping billions of people in expectation? That feels much weirder.
Superforecasters can predict more accurately if they make predictions at 1% increments rather than 2% increments. Whether they can usefully predict at finer increments either hasn't been studied or has turned up negative evidence. 0.01% increments are way below anything that people regularly predict on; there's no way to develop calibration at that granularity. In my comment, I meant to point out that anyone who thinks they're calibrated enough to talk about 0.01% differences, or even anything close to that, is clearly not a fantastic researcher, and we probably shouldn't give them lots of money.
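As a rough back-of-the-envelope illustration (my own sketch, not from the original thread), the normal approximation to the binomial suggests how many resolved predictions a forecaster would need before their track record could even distinguish forecasts one increment apart:

```python
import math

def min_resolved_predictions(p: float, increment: float, z: float = 1.96) -> int:
    """Rough sample size needed so that the sampling error of an observed
    frequency around base rate p is smaller than the increment you claim
    to distinguish. Normal approximation: SE = sqrt(p * (1 - p) / n).

    Require z * SE <= increment  =>  n >= z^2 * p * (1 - p) / increment^2.
    """
    return math.ceil(z**2 * p * (1 - p) / increment**2)

# Distinguishing 1% steps around a 50% forecast: thousands of predictions.
print(min_resolved_predictions(0.50, 0.01))

# Distinguishing 0.01% steps around a 0.01% forecast: tens of thousands,
# nearly all of which would resolve negative.
print(min_resolved_predictions(0.0001, 0.0001))
```

Note that the second figure understates the problem: at a 0.01% base rate, tens of thousands of resolved questions would yield only a handful of positive outcomes, so calibration at that granularity is effectively unverifiable in practice.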
A separate point that makes me uneasy about your specific example (but not about generally spending more money on some people, on the rationale that impact is likely extremely heavy-tailed) is the following. I think even people with comparatively low dark personality traits are susceptible to corruption by power. Therefore, I'd want people to have mental inhibitions against developing tastes that are too extravagant. It's a fuzzy argument, because one could say the same thing about spending $50 on an Uber Eats order, and on that sort of example my intuition is "Obviously it's easy to develop this sort of taste, and if it saves people time, they should do it rather than spend willpower on changing their food habits."
But on a scale from $50 Uber Eats orders to spending $150,000 on a sports car, there's probably a point somewhere where someone's conduct becomes too dissimilar to the archetype of a "person on a world-saving mission." I think someone you can trust with a lot of money and power would be wise enough that, if they ever formed the thought "I should get a sports car because I'd be more productive if I had one," they'd sound a mental alarm and start worrying they'd been corrupted. (And maybe they'd end up buying the sports car anyway, but they certainly wouldn't be thinking "this is good for impact.")