
HenryStanley

1562 karma · Joined Jun 2017

Comments (146)

This is wonderful – thank you so much for writing it.

“Mutual dedication to one another’s ends seems like a thing commonly present in religious and ethnic communities. But it seems quite uncommon to the demographic of secular idealists, like me. Such idealists tend to form and join single-focus communities like effective altruism, which serve only a subset of our eudaemonic needs.”

Agree about secular, single-purpose communities – but I'm not sure EA is quite the same.

I've found my relationships with other EAs tend to blossom into more than just EA; those principles provide a good set of shared values from which to build other things: a sense of community, shared houses, group meals, playing music together, and just generally supporting each other. Then again, I don't consider EA to be the core of my identity, so YMMV.

(I ask because I think burnout is a serious problem, and one whose seriousness is generally under-appreciated in this community.)

It's bizarre, isn't it?

Very much hoping the board makes public some of the reasons behind the decision.

This is wonderful.

I'd love to see a writeup of what happened in 2018 if you're willing to share.

One thing I hadn't realised is that Ilya Sutskever signed this open letter as well (and he's on the board!).

An open letter from 500 of ~700 OpenAI employees to the board, calling on them to resign (also on The Verge).

Suggests there's an enormous amount of bad feeling about the decision internally. It also seems like a bad sign that the board was unwilling to provide any 'written evidence' of wrongdoing, though maybe something will appear in the coming days.

But all told, it looks pretty bad for EA. There seems to be an enormous backlash online – initially against OpenAI for firing everyone’s favourite AI CEO, and now against “EA”, “woke”, “decelerationist” types.[1][2]

It also seems to have triggered a flurry of tweets from Nick Cammarata, saying that EAs are overwhelmingly self-flagellating and self-destructive and that EA caused him and his friends enormous harm. I think his claims are flatly wrong (though they may be true of him and his friends), and some of the replies seem to agree, but the thread has 500K views as I write this.

Seems like the whole episode (combined with at least one prominent EA seemingly saying it’s emblematic of EA being dreadful and toxic) has the potential to cause a lot of reputational damage, especially if the board chooses not to clarify its actions (although it's possibly too late for that).

  1. ^ https://x.com/brian_armstrong/status/1725924114190536825?s=46
  2. ^ https://x.com/atroyn/status/1725937945444757720?s=46

“I don't understand the strategy of creating a lower risk business in order to fund a higher risk business though: if you are aligned with your investors (and if your goal is ‘make money’ then you probably are aligned), then it seems strictly better to use their money instead of your own?”

Second-time founders (at least in my experience, and in the UK/Europe) have a much easier time raising funding for their businesses. Certainly, as first-time founders, our experience of raising has been like pulling teeth, despite decent traction and ARR. With greater access to capital I'd expect a higher chance of building a very large company. So in essence, the goal of one's first venture might be just to get to some form of exit that provides the cachet for starting the next thing, rather than going for as big an exit as possible.

Generally speaking, should tech people start startups and earn to give (EtG)?

For those who have done so: would you advise going for broke and trying to make those startups as big as possible? Or optimise for something more sustainable that can be exited to generate cash, and then start something else higher-risk?

Since you offered: tretinoin 0.015% + niacinamide 4% + urea 5% moisturiser at night; SPF 50 in the morning. What else should I be doing?

“There's no life bad enough for us to try to actively extinguish it when the subject itself can't express a will for that”

Agreed that this seems nonsensical on its face.
