I currently have something like 50% credence that the point of no return will happen by 2030. Moreover, it seems to me that there's a wager for short timelines: you should act as if short-timelines scenarios are more likely than your credence says, because you have more influence over them. I think I am currently taking short-timelines scenarios much more seriously than most people, even most people in the AI safety community. I suppose this is mostly due to having higher credence in them, but maybe there are other factors as well.
Anyhow, I wonder if there are ways for me to usefully bet on this difference.
Money is only valuable to me prior to the point of no return, so the value to me of a bet that pays off after that point is approximately zero. (In fact, it's not just money that has this property.) This means that no matter how good the odds you offer me, and even if you pay up front, I'm better off just taking out a low-interest loan instead. Besides, I don't need money right now anyway, at least not to continue my research activities. I'd only be able to achieve significant amounts of extra good if I had quite a lot more money.
What do I need right now? I guess I need knowledge and help. I'd love to have a better sense of what the world will be like and what needs to be done to save it. And I'd love to have more people doing what needs to be done.
Can I buy these things with money? I don't think so... As the linked post argues, knowledge isn't something you can buy, in general. On some topics it is, but not all, and in particular not on the topic of what needs to be done to save the world. As for help, I've heard from various people that hiring is net-negative unless the person you hire is both really capable and really aligned with your goals. But IDK.
There are plenty of people who are really capable and really aligned with my goals. Some of them are already helping, i.e. already doing what needs to be done. But most aren't. I think this is mostly because they disagree about what needs to be done, and that in turn is largely because their timelines are longer than mine. So maybe we can arrange some sort of bet. For example, I could approach people who are capable and aligned but have longer timelines, and say: "If you agree to act as if my timelines are correct for the next five years, I'll act as if yours are correct thereafter."