461 karma · Joined · Working (6-15 years)


EtG @ Google




I'll ask the obvious awkward question: 

Staff numbers are up ~35% this year but the only one of your key metrics that has shown significant movement is "Job Vacancy Clickthroughs".

What do you think explains this? Delayed impact, impact not caught by metrics, impact not scaling with staff - or something else?

What does 'Big 3' refer to in "'Big 3' podcast engagement hours"?

>Cosmetics and skincare for those who (want to) look masculine

Any good resources in general? The obvious requirement is that I don't want to look like I'm wearing makeup.

Currently all I do is moisturise.

Is it at all fair to say you're shifting your strategy from a "marathon" to a "sprint"? I.e., prioritising work that you expect to help soon rather than later.

Is this move due to your personal timelines shortening?

I definitely think it's an (the most?) important argument against. Some of this comes down to your views on timelines which I don't really want to litigate here. 

I guess I don't know how much research leading to digital people is likely to advance AI capabilities. A lot of the early work was of course inspired by biology, but it seems like not much has come of it recently. And it seems to me that we can focus on the research needed to emulate the brain, and try not to understand it in too much detail.

That could happen. I would emphasise that I'm not talking about whether we should have digital minds at all, just when we get them (before or after AGI). The benefit in making AGI safer looms larger to me than the risk of bad actors - and the threat of such bad actors would lead us to police compute resources more thoroughly than we do now.

Digital people may be less predictable, especially if "enhanced", but I think the trade-off is still pretty good here: they approximate human values almost entirely, whereas AI systems (by default) do not at all.

I agree shooting for digital people is a bad plan if timelines are short. I guess I'm not sure how short they would need to be for it not to be worth trying.

I think if we wanted to produce BCIs we should just shoot for that directly - doesn't seem like the best plan for getting to digital people is also the best plan for getting BCIs.

I think that insofar as neuroscience helps make AI, that just speeds up progress and is probably bad.

That person is Oliver Yeung, and he has given a two-part talk where he discusses this - main talk, Q&A.

(I checked with him before sharing these; if any interviewer wants to speak to him, DM me and I can put you in touch.)

>there will be someone in the world whose full-time job and top-priority it is to figure out how to write a proposal, or give you a pitch at a party, or write a blogpost, or strike up a conversation, that will cause you to give them money, or power, or status


IMO, a reasonable analogy here is to the relationship between startups and VCs.

What do VCs do to weed out the lemons here? Market forces help in the long run (which we won't have to the same degree) but surely they must be able to do this to some degree initially.
