Comments for shorter Cold Takes pieces

I think this is one of your best posts. I learned a lot, built new models of art, and laughed out loud multiple times.

lincolnq's Shortform

"it is unclear whether" can sometimes mean "I am skeptical that" or "I don't think". It annoys me when people use it this way. Unclear already has a good and useful meaning. We shouldn't dilute it.

The proper use of "unclear" is a sentence like this: "it's still unclear if the intervention worked". A quick heuristic: if the "unclear" is, or could be, prefixed by "still" without changing the meaning, it's probably fine :). Another way to view it -- if more information is likely to come out soon, then it's probably fine.

Some examples of usages of "unclear" I'd like to see less of:

  • "How do you think the willingness of key actors such as governments to tackle bio risks will change...? It's unclear whether we will see the right levels of political competence and focused engagement..." 1
  • "It's unclear how significant the extrinsic, welfare-oriented value of biodiversity even is" 2
  • "it is unclear how OpenPhil are comparing different causes, rather than looking out for giving opportunities across a variety of causes" 3
Why do you find the Repugnant Conclusion repugnant?

A world which supports the maximum number of people has no slack. I instinctively shy away from wanting to be in a world with resource limits that tight.

What would you do if you had a lot of money/power/influence and you thought that AI timelines were very short?

Hm, if I felt timelines were that short, I would probably feel like I knew which company or government was going to be responsible for actually building AGI (or could at least narrow it to a few). The plan is to convince such orgs to ask me for advice, have a team ready to research and give them the best possible advice, and hope that is good enough.

To convince them: I would try to leverage my power/influence to get to a position where leaders of the AGI-building organization would see me as someone to consult if they had a promising AGI-looking thing and were trying to figure out how best to deploy it.


  • If rich: donate lots of money to causes that such people care about, thus buying invitations to conferences and parties where they might hang out.
  • If otherwise influential: use my influence to get their attention, to similar effect.
  • There might be other leveraged projects (like blogs) that could generate influence and admiration among the leaders of AGI-building orgs.

Simultaneously, I would try to join (or create, if necessary) some sort of think tank comprising the people best able to advise on short-term AGI strategy. Again, power and money seem useful for putting together such a group -- you should be able to recruit the best possible people with star power, and/or pay them well, to think about such things full time. The hard part is shaping the group in the right way, so that it is both smart and thoughtful about high-stakes decisions, and its advice will be listened to and trusted by the AGI-building organization.

Assumptions / how does this strategy fail?

  • I cannot build the influence required:
    • I have to influence too many AGI builders (because I don't know which one is most likely to succeed), so my influence is too diluted
    • They are not influenceable in this way
  • AGI builders don't ask for the advice even if they want to:
    • maybe the project is too secret
  • advice can't solve the problem:
    • maybe there is an internal deadline - things are moving too fast and they don't have time to ask
    • maybe there are external deadlines, like competition between AGI builders, such that even if they get the advice they choose not to heed it
    • maybe the AGI building leadership doesn't have sufficient control over the organization, so even if they get advice, their underlings fail to heed it
  • advice is too low quality
    • I wasn't able to recruit the people for the think tank
    • They just didn't come up with the answer
Business Coaching/Mentoring For EA Organisations

This post says "EA orgs" but the linked page specifically says "charities". I assume this offer is restricted to non-profits, but I am commenting to check that was your actual intent!

Give Me Career Advice

(Founder of Wave here; we employ Ben Kuhn, whom you linked to.)

You should probably try to find early-stage projects locally that you can contribute to. If you find projects you like, you can help make them succeed, and it will be a rewarding experience. Don't index too heavily on expected EA impact early on -- it's worth considering whether the thing you're putting time into could be big/impactful someday, but I think it's better to just focus on things that resonate with you and where you can make a big difference to the project's or company's success. Look specifically for team fit: you should enjoy working with your colleagues, but also complement their skills in a useful way.

(I'm writing all this based on our experience hiring Ben -- I think he complements our founding/exec team's skills in a really unique way, and that's why Wave is so resonant for him.)

If you can't do that, or it isn't satisfying, you should at least consider working remotely for Wave :). You won't make as much impact as Ben, but you will get to work on satisfying things for real people. Depending on what you like/don't like about remote work, you might get a lot of the social connection you're looking for from our retreats (every couple of months you see teammates for a fairly intense week).

Comments for shorter Cold Takes pieces

Assuming the high happiness reports from the Hadza are "real" (and not noise, sampling bias, etc.), what might explain them?

They have dramatically worse health and nutrition, and also worse "creature comforts" like cozy beds, Netflix, and mulled wine. But maybe some combination of the following overcomes those drawbacks.

In the category of lifestyle/how you spend your time:

  • Social structure (small communities, much stronger social connection, more social time)
  • Work structure (more cooperation, more "meaning" in work due to knowing you're supporting your family directly / avoiding starvation for yourself and your loved ones)
  • Non-social leisure structure (no Reddit, no TV; no street noise; you're always out in nature)

Or internal experience:

  • Perhaps you'd have different dreams or fantasies?
  • No Instagram, no "keeping up with the Joneses" or social-status stress beyond your immediate community
  • Climate change, nuclear war, and x-risk presumably aren't a worry
  • Could sexual and romantic relationships be more fulfilling in such a small community?

Other ideas?

An update in favor of trying to make tens of billions of dollars

a mindset for getting money has drawbacks; for example, it might promote patterns where people ultimately rationalize small, marginal projects for $1M. Instead, maybe a useful alternative is to get genuinely interested in building something big and awesome, so a product mindset helps?

Yes. (Sorry for the bad writing, it was late and I was tired.)

I think the best entrepreneurs get a bit of a boost in motivation from the idea of becoming rich, but the more "rich"-oriented you are, the less likely it is that you will make billions of dollars.

Was there some feeling or realization that caused you to think you were more likely to be successful at entrepreneurship than others?

Hmm, I wanted to be an entrepreneur from the moment I understood it was possible. I don't think this is a necessary condition for success, but I think it gives you a lot of energy for trying (and especially for trying multiple times if you fail). A small project I've had for the last 5+ years is figuring out which of my friends should be entrepreneurs, and trying to inject a mind-virus to get them to actually do it. My instinct is that you should be motivated by "impact" of some kind (it's ok if it's not purely altruistic) and willing to work hard. You should be good at something too, but I think that can come with time if you are sufficiently motivated. In my case I was good at both coding and self-improvement; these things definitely compounded.

do you have some area or projects that might be interesting for people to know about?

Not as such; nothing to announce right now.

under what circumstances would you consider mentoring or giving feedback on a deck?

I don't really like reviewing decks. I'm generally happy to answer a few questions/give entrepreneurship advice over email; my email is pretty easy to find!

An update in favor of trying to make tens of billions of dollars

Hi! It's neat to be mentioned!

My motivation was not, and has never been, about the money. If I had been looking to get paid a bunch of money, it would have been too easy to be distracted by early <$5m acquihire opportunities. I realize that is not the point you are making (and that shooting for the moon might be worth it even if you are motivated by money), but I do think a lot of people might see $1B and think "gosh, that sounds hard, maybe I could do $1m" -- and the answer is you can, sorta, but it's dumb and not worth trying.

My motivations were more about the, um, "glory" -- I always wanted to build something big, I thought it was possible when few others did, and I also thought it would be really fun to try. I didn't perceive starting a company as "risky" at all, since I was otherwise a successful software developer, plus my parents were rich enough to support me. It might be harder to dive into building something big if you don't have that kind of support. I did also have altruistic motivations, although they were fairly weak in the first few years - they've grown a lot since!

Things I learned:

  • I spent my first 2 years in startups working on failed ideas, but I learned an enormous amount about product, culture, teamwork, and even coding, which translated directly into early effectiveness once we hit on an idea that worked. Don't be afraid to pivot.
  • When we reached product-market fit in 2014, I wrote this post about startup skills, which might be useful for others (note: it's quite old now, and I don't endorse everything in it).

And my pledge is 10%, although I expect more like 50-75% to go to useful world-improving things. I don't want to pledge that much, because then I'd be constrained by what other people think is effective.

Which non-EA-funded organisations did well on Covid?

Patrick McKenzie isn’t a “group”, and he probably doesn’t need your money, but he did successfully get ahead of coronavirus impacts in Japan:
