Thanks for writing! It sounds like part of your pitch is that there are some types of therapy which are much more effective than the types in common use. Scott's book review of all therapy books makes me pretty pessimistic about that. If you've read that post, do you have any thoughts?

More EAs should consider “non-EA” jobs

Hi Sarah! I broadly agree with the post, but I do think there's a marginal value argument against becoming a doctor that doesn't apply to working at EA orgs. Namely:

Suppose I'm roughly as good at being a doctor as the next-doctor-up. My choosing to become a doctor brings about situation A over situation B:

Situation A: I'm a doctor, next-doctor-up goes to their backup plan
Situation B: next-doctor-up is a doctor, I go to my backup plan

Since we're equally good doctors, the only difference is in whose backup plan is better—so I should prefer situation B, in which I don't become a doctor, as long as I think my backup plan will do more good than their backup plan. This seems likely to be the case for anyone strongly motivated to do good, including EAs.

To make a similar case against working at an EA org, you would have to believe that your backup plan is significantly better than other EAs' backup plans.

EDIT: I should say I agree it's possible that friction in applying for EA jobs could outweigh any chance you have of being better than the next candidate. Just saying I think the argument against becoming a doctor is different—and stronger, because there are bigger gains on the table.

In favor of more anthropics research

I had the opposite takeaway from the podcast. Ajeya and Rob definitely don't come to a confident conclusion. Near the end of the segment, Ajeya says (referring specifically to the simulation argument, but also, I think, to anthropics generally):

I would definitely be interested in funding people who want to think about this. I think it is really deeply neglected. It might be the most neglected global prioritisation question relative to its importance. There’s at least two people thinking about AI timelines, but zero people [thinking about simulation/anthropics], basically. Except for Paul in his spare time, I guess.

Don't we need political action rather than charity?

When I first read it, I assumed that "meaningful, lasting change" meant "all the kinds of changes we want," rather than "any particular change." Maybe that's what the authors intended. But on rereading I think your interpretation is more correct.

How I got an entry-level role in Congress

Congrats! I don't know you but I'm very happy for you!

The networking was hard for me, and I often felt thrown off or wired up after my networking calls. It took me a long time to send each email.

I'm impressed you were able to persist in your job search while feeling this way. Did you have a particularly strong motivation toward your long-term goal, or were there other strategies you used to overcome these mental blockers?

An inner debate on risk aversion and systemic change

Just broaden your conception of the team to the whole EA community, and stop worrying about how much of the “credit” is yours.

To me, this is the crux. If you can flip that switch, problem (practically) solved—you can take on huge amounts of personal risk, safe in the knowledge that the community as a whole is diversified.

Easier said than done, though: by and large, humans aren’t wired that way. If there’s a psychological hurdle tougher than the idea that you should give away everything you have, it’s the idea that you should give away everything you have for an uncertain payout.

What if you and your friend bring the same skills and effort to the team, each of you taking big bets on cause areas, but your friend’s bets pay out and yours don’t? All credit goes to your friend, and you feel like a failure. Of course you do!—because effort and skill and luck are all hopelessly tangled up; your friend will be (rightfully) seen as effortful and skilled, and no one will ever be able to tell how hard you tried.

What can make that possibility less daunting?

  1. Notice when you’re thinking in terms of moral luck. Try to appreciate your teammates for their efforts, and appreciate them extra for taking risks.
  2. Get close with your team. There’s a big difference, I expect, between knowing you’re a cog in a machine and feeling the machine operating around you. A religious person who goes to church every day is a cog in a visceral machine. An EA who works in a non-EA field and reads blogs to stay up to date on team strategy might feel like a cog in a remote, nebulous machine.

Why I'm concerned about Giving Green

This was helpful to me (knowing nothing about climate policy) in terms of ideas about how to break down TSM's "transformative change" into more tractable parts. I guess I'd been treating "transformative change" and what Dan said about "fundamental uncertainty" as something like semantic stopsigns.

One thing I'm confused about:

Indeed, insofar as mass mobilization and climate grassroots activism are strongly tied to the Democratic party and making Democrats more ambitious on climate, it seems likely that the value of this advocacy has decreased due to the relative underperformance of Democrats in Congressional races and the likely less Democratic-leaning environment in the midterm elections.

I feel like I'm missing something—can you explain the mechanism here? Is this based on the possibility that TSM could hurt Democrats' election chances ("A stronger TSM could make it more likely that pressure on Democrats [...] leads to shifts towards the left that lead to losses in the House in 2022 and the loss of the trifecta"), and so it would have positive impact only when Democrats have a strong majority?

Money Can't (Easily) Buy Talent

Thanks for that clarification—maybe the $1m/year figure is distracting. I only mentioned it as an illustration of this point:

The post argues that the kind of talent valuable for direct work is rare. Insofar as that's true, the conclusion ("prefer direct work") only applies to people with rare talent.

Money Can't (Easily) Buy Talent

Thanks, Mark! I've been struggling to figure out what career goals I myself should pursue, so I appreciated this post.

Those considering EtG as their primary career path might want to consider direct work instead

I think this advice is missing a very important qualification: if you are a highly talented person, you might want to consider direct work. As the post mentions, highly talented people are rare—for example, you might be highly talented if you could plausibly earn upwards of $1m/year.

Regularly talented people are in general poor substitutes for highly talented people. As you say, there is little demand for them at EA organizations: "[Open Philanthropy is] not particularly constrained by finding people who have a strong resume who seemed quite aligned with their mission." (More anecdotal evidence: "It is really, really hard to get hired by an EA organisation.")

In other words, EA orgs value regularly talented people below the market rate—that’s one reason those people should prefer earning to give over direct work. (On the other hand, maybe there are opportunities for direct work at non-EA organizations that constitute sufficient demand?)

As a probably regularly-talented person myself, I'm particularly interested in the best course of action here. Rather than "earn to give" or "do direct work," I think it might be "try as hard as you can to become a highly talented person" (maybe by acquiring domain expertise in an important cause area).

One more thing:

Most people suffer extremely sharp diminishing returns to large sums of money [...] As people have more money, their desires shift: work-life balance, passion, location, etc. If someone is passionate about their work, no amount of money may be sufficient.

The flip side is that if you value money/monetary donations linearly—or more linearly than other talented people—then you’ve got a comparative advantage in earning to give! The fact that "people don't value money" means that no one's taking the exhausting/boring/bad-location jobs that pay really well. If you do, you can earn more than you "should" (in an efficient market) and make an outsize impact.