
Question inspired by this comment. Other relevant discussion (from the same thread).

There have been a few posts lately on topics that some people seem reluctant to discuss publicly (sometimes there is a temporal element) - money in EA, political candidates, etc. I have seen this expressed in threads and to me personally in PMs. It seems like we may be missing out on valuable skepticism. 

Would non-public forum posts or comments (visible only to logged-in members, perhaps even with a minimum karma requirement) be a solution to this?

Obviously, there is huge value in openness and keeping most things public, but if a discussion never happens, or is skewed in one direction because people don't want to publicly criticize, that's not good either. 

Certainly it would be bad if a lot of content was made non-public by default, or unnecessarily. Transparency is good most of the time.

The perception that there is a bunch of "secret content" is probably not good either. How do we weigh that against the benefit of people feeling freer to discuss things?

There is some precedent - some EA Facebook groups are private, for example. 

Maybe, rather than the poster deciding, there could be a way for logged-in users to vote on switching a post to non-public?

Maybe people could have a choice of checking a box to make their comment (and presumably all replies under it) non-public?

I suppose there is also the risk it could make people feel too comfortable and recklessly discuss info hazards, etc. 

Not sold either way on this.

[Edit, adding...]

A milder approach could be to just make posts less findable:

  • There could be a check box for authors that adds a noindex tag to the post (a rough sketch of what this could look like follows this list).
  • If someone doesn't want to draw attention from outside the community, they could use a codeword (and request that others do as well) for obvious search keywords - initials of a politician, etc. This is probably not all that reliable, as commenters could do whatever they want.
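For the noindex idea, here is a minimal sketch of what such a check box could do behind the scenes, assuming a hypothetical post object with a `noindexRequested` flag (the names here are illustrative, not from the actual Forum codebase): when the flag is set, the rendered page head includes a standard robots noindex meta tag, which asks search engines not to index the page.

```typescript
// Minimal sketch (hypothetical names): render a robots meta tag for a post
// whose author has checked a "noindex" box, so compliant search engines
// skip the page.

interface ForumPost {
  title: string;
  noindexRequested: boolean; // set by the author's checkbox (hypothetical field)
}

// Returns the <head> markup fragment for a post page.
function renderHeadTags(post: ForumPost): string {
  const tags = [`<title>${post.title}</title>`];
  if (post.noindexRequested) {
    // Standard robots directive; crawlers that honor it will not index the page.
    tags.push(`<meta name="robots" content="noindex">`);
  }
  return tags.join("\n");
}

// Example usage:
console.log(renderHeadTags({ title: "Money in EA", noindexRequested: true }));
// <title>Money in EA</title>
// <meta name="robots" content="noindex">
```

Note that noindex only reduces findability via search engines; it does nothing against someone who already knows the URL or browses the Forum directly.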

3 Answers

I've considered this as a feature for a while. I haven't really made up my mind on it; some considerations:

  • I am worried about a Streisand effect: saying something mildly controversial in public is actually less bad than trying at all to keep it private, since the attempt at secrecy makes it sound a lot more juicy (compare "I found the following by infiltrating the Effective Altruists' private forum and here is a screenshot" vs. "I found the following on the Effective Altruism subreddit").
  • I do think it could be quite valuable for stuff that isn't really controversial, but that in some sense... is an advanced topic? But I am not sure whether logged-in status is actually the right barrier here. There are many topics I would like to be able to discuss, but I would just kind of prefer that they're a bit inconvenient to discuss, and that they don't cloud the impression newcomers get when they first show up to the forum.
  • I am kind of worried that lots of people would then make lots of posts for logged-in users only, forgetting the large group of readers who actively follow a lot of content on the EA Forum and are pretty plugged into things, but aren't usually logged in. I haven't found a good way to make that tradeoff salient to authors, and I currently think giving people the option would cause a lot of them to make a reflectively non-endorsed mistake.
  • The first point is very good and I hadn't thought of it. I guess hiding something lessens the chance you get discovered, but always makes you appear more guilty if/when you are. I guess that is more relevant the more people you think you have digging around for dirt.
  • Second point: karma? But that does require you to be logged in, of course.
  • Third: This could be addressed by not making it the choice of the poster, but by requiring a certain number of readers to click a "make this non-public" button. Then it's more of a community-decides kind of thing. Of course...
JP Addison, 2y:
I haven’t advertised this, but authors can request noindex status and moderators can set it.

This option should be chosen only after alternatives that neither increase risk nor decrease transparency in EA have been assessed in more detail.

So while one of my comments prompted this discussion, I don't think it's a great example - now I'd say it was good that it was public.

I would like to see private areas experimented with, to see whether more substantive discussion happens on the forum. If less, it would be a failure.

Comments (1)

Summary: Excessive privacy measures on a forum for a movement that otherwise claims to be among the most transparent of all movements may easily provoke increased efforts by, for example, journalists to gain access to that information. A process by which some sensitive information is trusted to be secure may also create a false sense that information hazards of global significance are secured by the same means. That would be insufficient, so there should be multiple procedures or protocols for securing the different kinds of information actors in EA seek to keep private or secret. Discussions that actors in EA seek to keep private should, in general, not be conducted on the EA Forum.

This is similar to the growing concern I've noticed being expressed in the last year over the ambiguous relationship effective altruism has with news media and (various interest groups among) the public as the movement has become more prominent.

One potential problem to bear in mind is the Streisand effect. It's "a phenomenon that occurs when an attempt to hide, remove, or censor information has the unintended consequence of increasing awareness of that information, often via the Internet." The more effective altruism is suspected of hiding, the harder people may try to gain access to that information.

Nothing prevents anyone outside of EA from starting an account on the EA Forum. Limiting access to certain sensitive discussions on the EA Forum to only some users with accounts may send the wrong signal, incentivizing those seeking private information to keep trying. It may also arbitrarily limit transparency of discourse within EA itself.

Another problem is that information hazards and other kinds of information many in EA prefer to keep private get conflated with each other. I do respect concerns that negative publicity around some information privy to EA-affiliated organizations may have a dire impact on their capacity to do good. Yet a potential scandal about a political candidate or large sums of money can be resolved, while the exposure of information that dramatically increases existential risk probably can't be.

Any process by which some sensitive information in EA is trusted to be secure may generate a risky and false sense that it will also be sufficient to secure genuine information hazards. To minimize that risk, there should be distinct sets of procedures and protocols for securing information hazards versus all other kinds of information actors in EA may prefer to keep private.
