NicholasKross

Pursuing an undergraduate degree
Working (0-5 years experience)

Bio

CS student, blogging and editing at https://www.thinkingmuchbetter.com/. PM me your fluid-g-increasing ideas

How others can help me

Looking for opportunities to do technical and/or governance work in AI alignment/safety.

How I can help others

Can help with strategic planning, operations management, social media marketing, graphic design.

Comments

Point 1: I said "Different from MIRI but correlated with each other". You're right that I should've done a better job of explaining that. Basically: "Yudkowsky approaches" (MIRI) vs. "Christiano approaches" (my incomplete read of most of the non-MIRI orgs). I concede 60% of this point.

Point 2: !!! Big if true, thank you! I read most of johnswentworth's guide to being an independent researcher, and the discussion of grants was promising. I'm getting a visceral sense of this from seeing (and entering) more contests, bounties, prizes, etc. for alignment work. I'm working towards the day when I can 100% concede this point. (And, based on other feedback and encouragement I've gotten, that day is coming soon.)

Good point about the secrecy, I hadn't heard of the ABC thing. The secrecy is "understandable" to the extent that AI safety is analogous to the Manhattan Project, but less useful to the extent that AIS is analogous to... well, the development of theoretical physics.

Not sure how relevant, but this reminds me of stories from inside Valve, the noted semi-anarchically organized game developer. People can move to any project they want, and there are few or no formal position titles. However, some employees have said that, because decision-making is roughly by consensus, some people have seniority, and people can organize informally anyway, the result is a "shadow clique/cabal" with disproportionate power. Which, come to think of it, would probably happen in the average anarchist commune of sufficient size.

TLDR just because the cliques don't exist formally, doesn't mean they don't exist.

Oh yeah, there are clustering networks showing mutual followers of e.g. Twitch streamers; it shouldn't be too hard to make this for the EA sphere on Twitter.
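To gesture at how simple the core of this is: a minimal sketch in plain Python, assuming you've already pulled follow lists from the Twitter API. The account names and follow data below are invented placeholders, and grouping by connected components of the mutual-follow graph is just one (crude) choice of clustering.

```python
from collections import deque

# follows[a] = accounts that a follows (toy placeholder data)
follows = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "carol"},
    "carol": {"alice", "bob"},
    "xan": {"yuri", "zoe"},
    "yuri": {"xan", "zoe"},
    "zoe": {"xan", "yuri", "alice"},  # one-way: alice doesn't follow back
}

# Keep an edge only when both accounts follow each other (mutual follows).
mutual = {u: {v for v in fs if u in follows.get(v, set())}
          for u, fs in follows.items()}

def clusters(adj):
    """Connected components of the mutual-follow graph, via BFS."""
    seen, out = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            node = queue.popleft()
            if node in seen:
                continue
            seen.add(node)
            comp.add(node)
            queue.extend(adj[node] - seen)
        out.append(comp)
    return out

print(clusters(mutual))  # two clusters: {alice, bob, carol} and {xan, yuri, zoe}
```

A real version would swap the toy dict for API-fetched follow lists and probably use a proper community-detection algorithm (e.g. modularity-based methods in networkx) instead of connected components.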

Somebody ought to start an independent organization specifically dedicated to red-teaming other people's and groups' ideas.

I could start this after I graduate in the Fall, or potentially during the summer.

DM me if you want to discuss organization / funding.

The Gates documentary was part of what pushed me towards "okay, earning-to-give is unlikely to be my best path, because there seems to be a shortage of people smart enough to run massive (or even midsized) projects well." I guess the lack of red-teaming is a subset of constrainedness (although is it more cognitive bias on the funders' part, or a lack of people/orgs who can independently red-team ideas? Prolly both).

FWIW, Elon Musk famously kiiiiiiinda had a theory-of-change/impact before starting SpaceX. The biography (and the WaitButWhy posts about him) note how he thought about funding a smaller mission of sending mice to Mars, and used a material-cost spreadsheet to estimate the adequacy of existing space-travel technology. He also aggressively reached out to experts in the field to look for the "catch", i.e. whether he was missing something.

This is still nowhere near good red-teaming/proving-his-hunch-wrong, though. He also didn't seem to do nearly as much talking-to-experts knowledge-base-building for his other projects (e.g. Neuralink).

And most groups don't even do that.

Find your old student's house, catch them escaping out a window during a drug bust, recruit them into your RV.
