TLDR: We don't have an easy-to-summarise methodology, and being rational is pretty hard. Generally we try our best, hold ourselves and each other accountable, and try to set up the community in a way that encourages rationality. If what you're looking for is a list of techniques to be more rational yourself, you could read this book of rationality advice or talk to people about why they prioritise what they do in a discussion group.
Some meta stuff on why I think you got unsatisfactory answers to the other questions
I wouldn't try to answer either of the previous questions because the answers seem big and would definitely be incomplete. I don't have a quick summary for how I would resolve a disagreement with another EA, because there are a bunch of overlapping techniques that can't be described in a quick answer.
To put it into perspective, I'd say the foundation of how I personally try to rationally approach EA is the Rationality A-Z book, but that probably doesn't cover everything in my head, and I definitely wouldn't put it forward as a complete methodology for finding the truth. For a specific EA spin, just talking to people about why they prioritise what they prioritise is what I've found most helpful, and an easy way to do that is in EA discussion groups (in person is better than online).
It's pretty unfortunate that there isn't an easy-to-summarise methodology or curriculum for applying rationality to charity. Current EA curricula are pretty focussed on just laying out our current best guesses, using those examples along with discussion to demonstrate our methodology.
How is EA rational then?
I think the main thing happening in EA is that there are strong personal, social, and financial incentives for people to approach their work "rationally". E.g. people in the community will expect you to have some reasoning which led you to do what you're doing, and they'll give feedback on that reasoning if they think it's missing an important consideration. From that, a bunch of people end up thinking about how to reason about this stuff more rationally, and we end up with a big set of techniques and concepts which seem to guide us better.
Bias and irrationality are huge problems today. Should I make an effort to do better? Yes. Should I trust myself? No – or at least as little as possible. It's better to assume I will fail sometimes and design around that. E.g. what policies would limit the negative impact of the times I am biased? What constraints or rules can I impose on myself so that my irrationalities have less impact?
So when I see an answer like "I think people [at EA] try pretty hard [… to be rational]", I find it unsatisfactory. Trying is good, but I think planning for failures of rationality…