Gordon Seidoh Worley

@ PAISRI
1813 karma · Joined Aug 2017 · Working (15+ years) · Oakland, CA, USA
paisri.org/

Bio

aka G Gordon Worley III

Comments (299)

Still short on details. Note that this says "in principle" and "initial board". Finalizing this deal could still require finding three more board members to replace Helen, Tasha, and Ilya, ones they'd still be happy to have as EA/safety voices. We'll have to wait and see what shakes out.

Yes. It's hard to find people who are poorer because of automation once we smooth over short-term losses.

What's easier to find is people who felt poorer because they lost status, even though they actually had more purchasing power and could afford more and better goods. They weren't actually economically poorer; they just felt poorer because other people got richer faster than they did.

I guess it depends on what kind of regulation you're thinking of.

While it's true that the US and EU value individual liberty highly, these countries are also quite motivated to regulate arms to maintain their technological lead over other countries, for example by regulating the export of cyber, nuclear, and conventional weapons and putting restrictions on who can be part of their supply chains. Smaller countries have been more willing to treat other countries as equals when it comes to arms and not worry about the possibility of attack, since they feel little threat from countries they don't share borders with, whereas the US and EU have global concerns.

Based on this, I expect the US and EU to be more likely to engage in the type of regulation that is relevant for controlling and limiting the development of TAI that poses a potential threat to humans. You're right to point out, though, that countries like China are more likely to impose regulations to control the near-term social harms of AI, whereas the US and EU are more likely to take a hands-off approach there.

So, tl;dr: the US and EU will impose regulations where it matters to slow the acceleration of progress so they can maintain control, but other countries might care more about social regulation that's comparatively less relevant to time to TAI.

Mostly seems like a good thing to me. The more the chips needed to build AI depend on supply chains that run through countries amenable to regulating AI, the safer we are. To that end, routing as much of the needed chip supply chain as possible through the US and EU seems most likely to create conditions where, if we impose regulations on AI, it will take years to build up supply chains that could circumvent those regulations.

Personally I downvoted this post for a few reasons:

  • it provides insufficient detail to evaluate its claims
  • the claims are not stated clearly
  • it's hard to follow because it uses various abbreviations without explaining them
  • it assumes the reader has context that is not presented in the post

To me this reads like content written with only an audience of folks working at CEA or similar orgs in mind, then posted publicly. So I downvoted because it doesn't seem worth many people's time to read: it's unclear what value is in it for them. This isn't to say the intended message isn't worthwhile, only that the presentation in this particular post is insufficient.

I'd very much like to read a post providing evidence that there were many instances of sexual assault within the community, if that's the case, especially if the rate is above the baseline of the surrounding context (whether that be people of similar backgrounds, living in similar places, etc.). And if CEA has engaged in misconduct, I'd like to know about that, too. But I can't make any updates based on this post because it doesn't provide enough evidence to do so.

This is a short note advising against using something called CouponBirds.

I don't know much about them other than that they're creating lots of spam in a bid to soak up folks who are bummed about the loss of Amazon Smile. They've sent me emails and posted spammy comments on posts both here and on Less Wrong (I report them each time; they keep creating new accounts to post).

If you were thinking of using them, I encourage you not to: we should not support spammers if we want to live in a world with less spam.

And EA is the social life, not the professional one, for a lot of us.

I want to say something specifically in response to this.

It's great that EAs can be friends with each other. But EA has a mission. It's not a social club. We're here to do good better. Things that get in the way of doing good better should be dropped if we actually care about the mission.

The trouble is that I know lots of people have impoverished social lives. They have maybe one or two communities to build social bonds within. When that's your situation, the natural thing is to try to extract as much value as you can from the one community you have, whether or not that is well advised.

The better strategy is to get some more communities!

I've seen this a lot within rationalist spaces. People come in and want to make rationality their whole identity. This isn't a unique phenomenon. People try to do it with religion, hobbies, all kinds of stuff. It's almost always a mistake. We have the common wisdom against putting all your eggs in one basket for a reason.

Be friends with fellow EAs. Have a social life with some of them. But don't let that be your whole social scene! That's why we're in this mess in the first place! We've got people whose only personal and professional settings have gotten mixed up together, and now trouble in one means trouble in all of it. People's whole lives fall apart because one part goes bad and they have nowhere to turn. And that's just how it is; there's nothing unusual about it. It would happen anywhere and anytime someone wraps their entire life around a single thing. That's not the way to have resilient social bonds that enable a person to do the most good in the world. It's a way to do some good for a while until something goes wrong, and then burn out or be ostracized and do less good.

(I know this is perhaps a bit ranty, but I see people fucking this up all the time in EA and I just want to shake some sense into everyone because this is extremely obvious stuff that nerd-like people mess up all the time.)

I've not tried to quantify this, but I've lived in a bunch of rationalist/EA houses. I've seen the dynamics up close. The downsides are very large and massively outweigh the upsides based on what I've seen. The only thing that makes people think the upsides outweigh the downsides is, I suspect, that they are desperate.

This isn't really weird, though. This is just what seems to happen in lots of communities: dating within a community is usually dangerous to it. This is a fully general phenomenon among humans: exogamy is the strategy most often adopted by societies that grow, expand, progress, and make the world better; endogamy, by societies that isolate and rarely change. Given that the goal of EA is to make the world better, we should have a strong prior against endogamy being a successful strategy.

"Conflicts of interest" is one way to put it, but I think it massively undersells the problem.

To me, the point of not dating in EA (or at your company) is that it upholds an extremely valuable professional norm of keeping your personal and work lives from colliding in messy ways. The trouble is that breakups happen, they are messy, people do and say things they will regret, and in such a situation you want as much separation as possible between the personal and professional parts of your life. That way only a part of your life is on fire. If you're like many people and only really have personal and professional lives (and don't have, say, a religious life), then you may find your whole world has fallen apart at once.

The downside risk is high. The upside is low. The world is full of people. Go out and find some of them who aren't going to put you at risk of wrecking both parts of your life at once.

To be clear, I am mostly saying don't date other EAs most of the time, especially if you are doing more than small-scale earning to give. If you plan to work in EA, then EA is your office. EA is too small to think of it as an ecosystem where people can find other opportunities; there's one EA game in town. That's the place where I think it's fraught to date.
