NicholasKross

Pursuing an undergraduate degree
Working (0-5 years experience)

Bio

CS student, blogging and editing at https://www.thinkingmuchbetter.com/. PM me your fluid-g-increasing ideas.

How others can help me

Looking for opportunities to do technical and/or governance work in AI alignment/safety.

How I can help others

Can help with strategic planning, operations management, social media marketing, graphic design.

Comments

Agree, I don't see many "top-ranking" or "core" EAs writing exhaustive critiques (posts, not just comments!) of these critiques. (OK, they would likely complain that they have better things to do with their time, and they often do, but I have trouble recalling any aside from (debatably) some of the responses to AGI Ruin / Death With Dignity.)

Agreed. When people require literally everything to be written in the same place by the same author/small-group, it disincentivizes writing potentially important posts.

Strong agree with most of these points; the OP seems to not... engage with some of its changes on the object level. Like, not in proportion to how big the change is, or how good the authors think it is, or anything?

Reminder for many people in this thread:

"Having a small clique of young white STEM grads creates tons of obvious blindspots and groupthink in EA, which is bad."

is not the same belief as

"The STEM/techie/quantitative/utilitarian/Pareto's-rule/Bayesian/"cold" cluster-of-approaches to EA, is bad."

You can believe both. You can believe neither. You can believe just the first one. You can believe just the second one. They're not the same belief.

I think the first one is probably true, but the second one is probably false.

Thinking the first belief is true is nowhere near strong enough evidence to think the second one is also true.

(I responded to... a couple similar ideas here.)

Who should do the audit? Here are some criteria I think could help:

  • Orgs that don't get a high/any % of their funding from the individuals/groups under scrutiny.
  • People who've been longtime community members and have some level of good reputation within the community.
  • Orgs that do kinda "meta" things about the EA movement, like CEA or Nonlinear (disclosure: I used to volunteer for Nonlinear).

Couple thoughts:

In society generally, a fundamental problem is the tradeoff between effort spent and information gained.

I could imagine a cursory "audit" that would catch blatant badness, while anything more subtle could take (for instance) experienced lawyers, forensic accountants, and other experts.

Not to mention, the access they'd need to these figures' businesses, organizations, relationships, communications... potentially anything and everything.

Most people wouldn't grant such access, but I think you're right that, given the unusual situation (1-2 people being a key nexus of funding/influence inside a movement that tries to be more self-correcting than most), it makes more sense here.

Good point, I think I've heard this perspective before but forgot.

My thoughts on "both": in that case, I wonder if it's more like a merge, or more like a Jekyll/Hyde thing.

Agree, but leaning more towards Option B. Wish this were discussed more explicitly, since it's the question that determines whether this was "naive utilitarian went too far" (bad) or "sociopath using EA to reputation-launder" (bad). Since EA as a movement is soul-searching right now, it's pretty important to figure out exactly which thing happened, as that informs what we should change about EA to prevent this from happening again.

A comment I initially posted elsewhere in private: I have to wonder how much was reputation-laundering from the beginning... maybe it was just reputation-laundering among his friend group?

Like, if I were a competitive sociopath who landed in an EA social group for auxiliary reasons but wanted to launder my reputation with them, it wouldn't be as easy as reputation-laundering in the eyes of the general public.

Think: putting your name on a university building vs. pretending to be a semi-competent longtermist.
