Added Sep 26 2019: I'm not going to do an analysis or summary of these responses – but I and others think it would be worthwhile. If you'd like to do one, I'd welcome that and will link your summary/analysis at the top of this post. All the data is accessible in the Google Spreadsheet below.
Submit your answers anonymously here: https://docs.google.com/forms/d/e/1FAIpQLSfiUmvT4Z6hXIk_1xAh9u-VcNzERUPyWGmJjJQypZb943Pjsg/viewform?usp=sf_link
See the results here: https://docs.google.com/forms/d/e/1FAIpQLSfiUmvT4Z6hXIk_1xAh9u-VcNzERUPyWGmJjJQypZb943Pjsg/viewanalytics?usp=form_confirm
And you can see all responses beyond just the first 100 here: https://docs.google.com/spreadsheets/d/1D-2QX9PiiisE2_yQZeQuX4QskH57VnuAEF4c3YlPJIA/edit?usp=sharing
Inspired by: http://www.paulgraham.com/say.html
Let's start with a test: Do you have any opinions that you would be reluctant to express in front of a group of your peers?
If the answer is no, you might want to stop and think about that. If everything you believe is something you're supposed to believe, could that possibly be a coincidence? Odds are it isn't. Odds are you just think what you're told.
Why this is a valuable exercise
Some would ask, why would one want to do this? Why deliberately go poking around among nasty, disreputable ideas? Why look under rocks?
I do it, first of all, for the same reason I did look under rocks as a kid: plain curiosity. And I'm especially curious about anything that's forbidden. Let me see and decide for myself.
Second, I do it because I don't like the idea of being mistaken. If, like other eras, we believe things that will later seem ridiculous, I want to know what they are so that I, at least, can avoid believing them.
Third, I do it because it's good for the brain. To do good work you need a brain that can go anywhere. And you especially need a brain that's in the habit of going where it's not supposed to.
Great work tends to grow out of ideas that others have overlooked, and no idea is so overlooked as one that's unthinkable. Natural selection, for example. It's so simple. Why didn't anyone think of it before? Well, that is all too obvious. Darwin himself was careful to tiptoe around the implications of his theory. He wanted to spend his time thinking about biology, not arguing with people who accused him of being an atheist.
Thanks to Khorton for the suggestion to do it as a Google form.
Hi Ben,
Thanks for this; it's useful (upvoted).
1. I think we disagree on the empirical facts here. EA seems to me unusually open to considering rational arguments for unfashionable positions. In my experience, people lose points for bad arguments, not for weird conclusions. I'd be very perplexed if someone were unwilling to discuss whether utilitarianism is false (or whether remote working is bad, etc.) in front of EAs, and would think anyone who declined to do so was overcome by irrational fear. Michael Plant believes one of the allegedly taboo opinions here (mental health should be a priority) and is currently on a speaking tour of EA events across the Far East.
2. This is a good point and updates me towards the usefulness of the survey, but I wonder whether there is a better way to achieve this that doesn't carry such clear reputational risks for EA.
3. The issue is not whether my colleagues have sufficient publicly accessible reason to believe that EA is full of good people acting in good faith (which they do), but how heavily this survey will weigh in the evidence they actually consider. That is, it might lead them not to consider the rest of the evidence that EA is mostly full of good people acting in good faith. I think there is a serious risk of that.
4. As mentioned elsewhere in the thread, I'm not saying that EA should embrace political-level self-restraint. What I am saying is that there are sometimes reasons to refrain from airing all of your opinions in public when you represent a community of people trying to achieve something important. The respondents to this poll implicitly agree with that, given that they want to remain anonymous. For some of these statements, the reputational risk of airing them anonymously does not transfer to the EA movement as a whole; for others, it does transfer to the community as a whole.
Do you think anyone in the community should ever self-censor for the sake of the reputation of the movement? Do you think scientists should ever self-censor their views?