[ Question ]

What would EAs most want to see from a "rationality" or similar project in the EA space?

by Davis_Kingsley · 13th Sep 2019 · 3 comments

One area that has often been discussed as an EA or meta-EA cause area is rationality development, whether that be in the form of "raising the sanity waterline", providing relevant training to certain key people in order to empower them, or something else entirely.

What aspect of this strikes you as most interesting or relevant? What would you be most excited about seeing out of a new group or project in this area?

3 Answers

I would like to see calibration training offered to people running EA projects. It would help push those projects in a more strategic direction by having people lay out predictions about outcomes at the outset, similar to what Open Phil does with its grants.
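(As a minimal sketch of what scoring such outset predictions could look like: the snippet below computes a Brier score over forecast/outcome pairs. The example forecasts are hypothetical illustrations, not from the post or Open Phil.)

```python
from typing import List, Tuple

def brier_score(predictions: List[Tuple[float, bool]]) -> float:
    """Mean squared error between forecast probabilities and realized outcomes (0 = perfect)."""
    return sum((p - (1.0 if outcome else 0.0)) ** 2
               for p, outcome in predictions) / len(predictions)

# Hypothetical forecasts a project lead might log at a project's outset,
# paired with whether each outcome actually occurred.
forecasts = [
    (0.9, True),   # "90% chance we hire a researcher by Q3" -- happened
    (0.6, False),  # "60% chance we publish two reports" -- didn't happen
    (0.2, False),  # "20% chance we get >1000 newsletter signups" -- didn't happen
]

print(f"Brier score: {brier_score(forecasts):.3f}")  # lower is better
```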

I'd love to see a tool that people enjoy using and that measurably teaches rationality.

Perhaps an app or a novel that leaves people making better decisions on common tests of bias.

I would be particularly interested in seeing this in regard to elections. How do you teach people to vote more in line with their own interests?

I am currently reading The Righteous Mind by Jonathan Haidt, and I think the theories and research on moral psychology he discusses could be applied to this topic to produce some interesting research and studies!