
trevor1

172 karma · Joined Sep 2019

Comments (252)

If EA currently

  1. is in the middle of a dark forest (e.g. news outlets systematically following emergent consumer interest in criticizing EA and everything it stands for),
  2. perceives itself as currently being in the middle of a dark forest, or at risk of already being in one (which might be hard to evaluate, e.g. due to the dynamics described in Social Dark Matter), or
  3. expects to enter a dark forest at some point in the near future (or expects the world around it to turn into a dark forest, e.g. if China invades Taiwan and a wide variety of norms go out the window),

then I imagine it would be pretty difficult to design institutional constraints that are resilient to observation and exploitation by a wide variety of possible adversaries, while simultaneously balancing those same constraints to be visible and credible/satisfying to a wide variety of observers.

Ah, my bad, I did a Ctrl+F for "sam"! Glad that it was nothing.

That's interesting; it still doesn't show anywhere on my end. I took this screenshot around 7:14 pm, so maybe it's a screen size or aspect ratio thing.

Important to note: I archived the Washington Post homepage here and it showed Robinson's op-ed, but when I went to https://www.washingtonpost.com itself immediately after, at ~5:38 pm San Francisco time, it was nowhere to be found! (I was not signed in either time.)

[This comment is no longer endorsed by its author]

This entire thing is just another manifestation of academic dysfunction (philosophy professors using their skills and experience to think up justifications for their pre-existing lifestyle, instead of the epistemic pursuit that justified the emergence of professors in the first place).

It started with academia's reaction to Peter Singer's "Famine, Affluence, and Morality" essay in 1972, and hasn't changed much since. The status quo had already hardened, and the culture had become so territorial that whenever someone had a big idea, everyone with power (people who had already optimized for social status) had an allergic reaction to the idea's memetic spread rather than engaging with the epistemics behind the idea itself.

The Dark Forest Problem implies that people centralizing power might face strong incentives to hide, act through proxies, and/or disguise their centralized power as decentralized power. The question is to what extent high-power systems are dark forests vs. the usual quid-pro-quo networks and stable factions.

Changing technology and changing applications of power, starting in the 1960s, meant that factions would not be stable and that iterated trust would be less reliable, and therefore that a dark forest system was more likely to emerge.

Yep, that's the way it goes! 

Also, figuring out what's original and what's memetically downstream is an art. Even more so when it comes to dangerous technologies that haven't been invented yet.

Ah, I didn't know about the EA Handbook, and I would not have found out if not for this post, thanks! It looks pretty good, and as with the CFAR Handbook, I wish I had known about it many years ago.

Yeah, a lot of them are not openly advertised, for good reasons. One example that's probably fine to talk about is NunoSempere's claim that EAforum is shifting towards catering to new or marginal users.

The direct consequence is a reduction in the net quality of content on EAforum, but it also lets the forum steer people towards events as they get more interested in various EA topics, where they can talk more freely without worrying about saying controversial things, or get directly involved, face to face, with people working in those areas. And it doesn't stop EAforum from remaining a great bulletin board for orgs to publish papers and updates and get feedback.

At first glance, though, catering to marginal users makes you think they're just doing classic user retention. That's not what's happening; this is not a normal forum, and that's the wrong way to think about it.

My thinking about EAforum over the years has typically been "Jesus, why on earth would these people deliberately set things up like that?" and then, maybe a couple of months later, maybe a couple of years later, I start to notice a possible explanation and go "oooooooooooohhhhhh, actually, that might make a lot of sense, I wish I had noticed that immediately".

Large multi-human systems tend to be pretty complicated and counterintuitive, but they become way, way more so when most of the people involved are extremely thoughtful. Plus, the system changes in complicated and unprecedented ways as the world changes around it, or as someone here or there discovers a game-changing detail about the world, meaning that EAforum is entering uncharted territory and tearing down Schelling fences rather frequently.
