When comparing different problems, I found myself facing moral questions about what we should care about, whose answers have a direct impact on the scale of global problems. 80k suggests a concept called moral uncertainty, which involves maximizing the expected value of our decisions. That requires estimating the probability of moral claims (for example: non-human animal lives matter as much as human ones), but I haven't found resources for doing that. I would be very grateful if someone had a clue about how to proceed.
I think it's highly subjective and intuition-based for most people. For a very basic moral claim X, you would just ask yourself: "How likely does X seem to me?" and go with the probability that occurs to you directly.
You might consider arguments for and against, but for many claims you ultimately just pick a number directly. For other claims, you might derive the probability from others, e.g. by multiplying the (conditional) probabilities of each premise in an argument to get the probability that all the premises hold, which is a lower bound on the probability of the conclusion if the argument is valid.
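To make that multiplication step concrete, here is a minimal sketch in Python. The premises and the numbers attached to them are invented purely for illustration, not a claim about what the right values are; the point is only the mechanics of chaining conditional probabilities.

```python
from math import prod

# Hypothetical premises for an argument; each value is the probability of that
# premise conditional on the earlier premises holding. These numbers are made up.
conditional_premise_probs = {
    "Non-human animals are sentient": 0.9,
    "Sentience grounds moral status (given the above)": 0.6,
    "Equal moral status implies equal weight in decisions (given the above)": 0.7,
}

# Probability that all premises hold = product of the conditional probabilities.
p_all_premises = prod(conditional_premise_probs.values())

# If the argument is valid (the conclusion follows whenever all premises hold),
# this product is a lower bound on the probability of the conclusion, since the
# conclusion could also be true for reasons outside this particular argument.
print(f"P(all premises) = {p_all_premises:.3f}")
print(f"P(conclusion)  >= {p_all_premises:.3f}")
```

Note that this only gives a lower bound: if you have several independent arguments for the same conclusion, or some prior credence in it apart from any argument, the probability you assign can be higher than any single product.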