Originally this was a thread for coordinating conversations at EA Global 2019. In the end, I was the only one who used the thread for top-level comments, and it turned out that much of the value was getting to quickly hash out ideas I hadn't felt ready to turn into fully fledged posts.
I'll probably continue using this for EA-related shortform posts, as a parallel to my LessWrong shortform feed.
Integrity, Accountability and Group Rationality
I think there are particular reasons that EA should strive not just for exceptionally high integrity, but for an exceptionally high understanding of how integrity works.
Some background reading for my current thoughts includes habryka's post on Integrity and my own comment on competition here.
What about Paul's Integrity for Consequentialists?