(Edited for clarity and rigour.)
Basically, the answer can depend on why you assign such a high probability to solipsism, and specifically on the extent to which those reasons are normative or empirical. If they're mostly empirical, that favours giving more to others; if mostly normative, that can favour something closer to an even split.
If you assign X% credence to normative views for which, conditional on each of those views, all of the following hold simultaneously:
- your leftover uncertainty about solipsism is purely empirical (not also normative),
- you assign at most, say, 95% probability to solipsism conditional on each of the normative views (with the rest going to broadly standard views about consciousness and the existence of others),
- it seems better to help, say, 40 other potentially conscious beings, even if solipsism is up to 95% likely to be true, than to definitely help yourself, and
- the normative views all agree on enough cases like the one in the previous point that are actually available to you,
then those normative views, making up X% of your credence, would recommend helping others, and on most approaches to normative/moral uncertainty, it would be better to use X% to 100% of your effort to help others.
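The expected-value comparison in the third point can be made concrete. Here's a minimal sketch, assuming a simple expectational view that weighs each being's welfare equally (an assumption for illustration; the normative views in question needn't be exactly this):

```python
# A toy expected-value comparison: help 40 potentially conscious beings,
# or definitely help yourself, under 95% empirical credence in solipsism.
# The equal-weighting expectational view here is an illustrative assumption.

def expected_beings_helped(p_solipsism, n_others):
    """Expected number of beings helped by helping others:
    with probability p_solipsism no one else exists (no one is helped);
    otherwise all n_others are helped."""
    return (1 - p_solipsism) * n_others

p_solipsism = 0.95  # empirical credence in solipsism, given the normative view
n_others = 40       # potentially conscious beings you could help instead

ev_others = expected_beings_helped(p_solipsism, n_others)  # (1 - 0.95) * 40 ≈ 2
ev_self = 1.0  # you definitely exist, so helping yourself helps exactly one being

# Helping others wins in expectation despite 95% credence in solipsism.
assert ev_others > ev_self
```

So even at 95% credence in solipsism, the expected number of beings helped by helping others (about 2) exceeds the guaranteed 1 from helping yourself, which is why purely empirical uncertainty about solipsism doesn't by itself favour egoism.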
Under something like the property rights approach to moral uncertainty, where resources are allocated in proportion to your credence in normative views, if the other (100 − X)% of your credence went to views fully committed to (near-)egoism or solipsism, then I think you would spend (100 − X)% of your resources on yourself.
For example, you could take X to be at least 99, assign exactly 95% probability to solipsism conditional on those normative views (and so around 95% probability to solipsism overall), and still be required to spend at least 99% of your effort on others.
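The property-rights-style split can be sketched as follows, with illustrative credences (the 99%/1% numbers are assumptions matching the example above, not a claim about what your credences should be):

```python
# A toy sketch of a property-rights-style allocation: each normative view
# gets a share of your resources equal to your credence in it, and spends
# that share as it recommends. Credences here are purely illustrative.

credences = {
    "other_regarding": 0.99,  # views recommending you help others (per the EV argument)
    "egoist_or_solipsist": 0.01,  # views fully committed to helping only yourself
}

effort_on_others = credences["other_regarding"]
effort_on_self = credences["egoist_or_solipsist"]

# Shares are proportional to credence and exhaust your resources.
assert abs(effort_on_others + effort_on_self - 1.0) < 1e-9

# Note: you can still assign ~95% probability to solipsism overall, yet spend
# 99% of your effort on others, because the residual uncertainty about
# solipsism is empirical rather than normative.
```

The point of the sketch is that the allocation tracks your credence in *normative views*, not your overall probability of solipsism, which is why a high empirical credence in solipsism is compatible with spending nearly all your effort on others.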
If you aren't familiar with normative/moral uncertainty (moral uncertainty is a type of normative uncertainty), then the above probably won't make sense, and I'd recommend taking a look at some of the following:
- Making decisions under moral uncertainty and Making decisions when both morally and empirically uncertain
- https://www.moraluncertainty.com/
- https://reducing-suffering.org/two-envelopes-problem-for-brain-size-and-moral-uncertainty/
- https://www.happierlivesinstitute.org/report/property-rights/ - for an approach of splitting resources according to moral/normative uncertainty, not really covered in detail elsewhere
- https://forum.effectivealtruism.org/topics/normative-uncertainty-1 and https://forum.effectivealtruism.org/topics/moral-uncertainty, and the references and tagged posts there.