80,000 Hours recently rewrote its approach for comparing problems against one another. This is how we give people advice on which problems are most 'pressing', and so are most promising for people aiming to have a large social impact with their career. We recommend checking it out.
This framework is a work in progress and is likely to be further iterated in future.
The biggest changes since the last version are:
- Redefining 'solvability' from a qualitative scoring system to 'the % of the problem expected to be solved by a doubling of the resources dedicated to solving it'. For a problem where progress is easy, doubling the resources allocated to fixing it might reduce the damage it does by 10-100%; for one where progress is slow, it might only solve an additional 1%. This makes the model mathematically cleaner and requires fewer assumptions (see the sketch after this list), though assigning solvability scores remains difficult.
- We downgraded the value of economic growth in the rich world, but added a new yardstick for promoting economic growth in the developing world to reflect its higher humanitarian value.
- When evaluating 'neglectedness' we now only measure the resources that are allocated with the intention of solving a problem, rather than also those which might accidentally do so (which proved impractical). The full document explains why this is OK.
- We've added more detail about how to assign scores, to ensure consistent standards across problems.
- The underlying mathematics are now properly explained.
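Roughly speaking, the framework factors the expected impact of extra work on a problem into a product of ratios whose intermediate terms cancel. The sketch below is illustrative notation rather than a quotation from the full document:

```latex
% Illustrative sketch of the scale x solvability x neglectedness decomposition
\frac{\text{good done}}{\text{extra resources}}
  = \underbrace{\frac{\text{good done}}{\%\text{ of problem solved}}}_{\text{scale}}
  \times
  \underbrace{\frac{\%\text{ of problem solved}}{\%\text{ increase in resources}}}_{\text{solvability}}
  \times
  \underbrace{\frac{\%\text{ increase in resources}}{\text{extra resources}}}_{\text{neglectedness}}
```

Defining solvability as the share of the problem expected to be solved by a doubling (i.e. a 100% increase) of dedicated resources pins down the middle ratio directly, which is what makes the new definition cleaner to work with.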
In designing the framework, we've benefitted in particular from the work of Owen Cotton-Barratt at the Future of Humanity Institute.
You could potentially use this process to write your own profiles of problems you or others in the community might work on, and we would be interested to see the results.
We also recently rewrote our profile of global priorities research - that is, prioritising different global problems as a profession. We hope it's now easier to take action after reading it. If you can see yourself conducting that research in your career, let us know and we might be in touch.
Creating a well-defined mathematical underpinning for the neglectedness-tractability-importance framework is a really cool non-trivial accomplishment. Thanks for helping further arm all of us cause prioritizers. :)
If the funding for a problem with known total funding needs (e.g. creating drug x which costs $1b) goes up 10x, its solvability will go up 10x too - how do you resolve that this will make problems with low funding look very intractable? I guess the high neglectedness makes up for it. But this definition of solvability doesn't quite capture my intuition.
Don't the shifts in solvability and neglectedness perfectly offset one another in such a case? Can you write out the case you're considering in more detail?
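For concreteness, here is one way to write out that case: a hypothetical problem with a fixed $1b price tag that is solved in proportion to spending (all numbers are illustrative, not from the framework document).

```python
# Hypothetical problem: fixed total cost of $1b, solved in proportion to spending.
TOTAL_COST = 1_000_000_000  # dollars needed to fully solve the problem (assumption)

def solvability(current_funding):
    """Fraction of the problem solved by doubling current resources."""
    # Doubling funding adds `current_funding` dollars, each solving 1/TOTAL_COST of the problem.
    return current_funding / TOTAL_COST

def neglectedness(current_funding):
    """Proportional increase in resources per extra dollar."""
    return 1 / current_funding

for funding in (10_000_000, 100_000_000):  # before and after a 10x funding increase
    marginal = solvability(funding) * neglectedness(funding)
    print(f"funding ${funding:>11,}: solvability={solvability(funding):.2%}, "
          f"neglectedness per $={neglectedness(funding):.2e}, "
          f"fraction solved per extra $={marginal:.2e}")
```

Under these assumptions the tenfold rise in solvability is exactly cancelled by the tenfold fall in neglectedness, so the expected fraction of the problem solved per extra dollar (1/$1b) is unchanged; a problem doesn't look worse overall just because its current funding is low.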