
I was recently listening to a podcast discussion that included two people who had been involved in military special operations units -- one in the Navy SEALs and one in the US Army Special Forces. I was struck by their extremely high level of training, dedication, commitment, and overall ability -- but also by how this had in large part been squandered on fighting a destructive and unproductive war in Afghanistan, supporting questionable CIA operations, and so on.

It occurs to me that people in the EA community are often working on much more important causes, but with far less training, commitment, personal skill, etc.

This leads to my question -- what would it look like if similar levels of training effort, funding, selection, etc. were going into preparing EAs to do as much good in the world as possible as is currently going into preparing elite military units? (I don't think that this would necessarily look much like elite military training, to be clear!)

If this first exercise comes up with anything that seems promising -- are there ways that we could potentially 80/20 this and get much of the goods without high costs?

4 Answers
I think this will vary a lot depending on what kind of work you're aiming to do, but I could imagine a training programme for e.g. promising young grantmakers being very helpful.

Charity Entrepreneurship tries to do this for entrepreneurs.

Adding to that, Lucia Coulter of the Lead Exposure Elimination Project had high praise for Charity Entrepreneurship when I interviewed her:

Charity Entrepreneurship has [...] made a big difference – their support from the incubation program to now has helped with pretty much every aspect of our work. [...] Firstly they provided a two-month full-time incubation program, which I went through (remotely) in the summer 2020. This was where I decided to work on lead exposure (which was an idea researched and recommended by Charity Entrepreneurship), where I paired up with my co-founder Jack, and from where we received our initial seed grant. During the program we learnt a huge amount of extremely relevant and practical material – for example, how to make a cost-effectiveness analysis, how to make a six-month plan, how to develop a monitoring and evaluation strategy, how to hire, and a lot more. Since then Charity Entrepreneurship has provided LEEP with weekly mentoring and wider support through the community of staff, previous incubatees, and advisors. I highly recommend checking out the Charity Entrepreneurship incubation program if anyone is interested!

(emphasis mine; source)

Good point re: Charity Entrepreneurship.

I'm somewhat more skeptical of the grantmaking thing though because there are few enough positions that it is not very legible who is good at it, whether others currently outside the field could do better, etc.

I could be wrong -- I can point to specific things from some grantmakers that I thought were particularly good, for instance -- but it doesn't feel to me that it's the most amenable field for such a program. 

(Note that this is low-confidence and I could be wrong -- if there are more objective grantmaking skill metrics somewhere I'd be very interested to see more!)

Kirsten
Some trainable things I think would help with grantmaking:

- knowledge of the field you're making grants in
- making a simple model to predict the expected value of a grant (looking for a theory of change, forecasting the probability of different steps, identifying the range of possible outcomes)
- best practices for identifying early signs a grant won't be worth funding, to save time, without being super biased against people you don't know or who come from a different background to you but could eventually do good work
- giving quality feedback to successful and unsuccessful applicants
- engaging with donors (writing up summaries of why you gave different grants, talking to people who are considering donating through your fund)
- evaluating your grants to learn how closely what really happened matched your model

It doesn't seem to me obviously less trainable than being a Navy SEAL.
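For concreteness, the "simple model to predict the expected value of a grant" described above can be sketched in a few lines. This is a toy illustration only -- the step names, probabilities, and dollar figures are all invented, not drawn from any real grantmaking process:

```python
def grant_ev(steps, value_if_success, cost):
    """Toy expected-value estimate for a grant.

    steps: list of (step_name, probability) pairs in the theory of change;
           the grant pays off only if every step succeeds.
    value_if_success: estimated value (in $) if all steps succeed.
    cost: size of the grant (in $).
    """
    p_success = 1.0
    for _name, p in steps:
        p_success *= p
    return p_success * value_if_success - cost

# Hypothetical theory of change with forecasted step probabilities.
steps = [
    ("charity launches successfully", 0.8),
    ("intervention works as planned", 0.5),
    ("results get adopted at scale", 0.3),
]

# A hypothetical $50k grant that unlocks $1M of impact if everything works.
ev = grant_ev(steps, value_if_success=1_000_000, cost=50_000)
print(f"Expected value: ${ev:,.0f}")
```

In practice one would also want a range of outcome scenarios rather than a single success value, but even this crude version forces the forecasts in the theory of change to be made explicit.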

There's the CFAR workshop, but it's just a four-day program. (Though it would take longer to read all of Yudkowsky's writing.)

I'm no expert, but on one plausible reading, US military training is primarily about cultivating obedience and conformity. Of course some degree of physical conditioning is genuinely beneficial, but when's the last time a Navy SEAL got into a fist fight?

For most of the EA work that needs to get done (at the moment), having an army of replaceable, high-discipline drones is not actually that useful. A lot of the movement hinges on a relatively small number of people acting with integrity and thinking creatively.

Instead of intense training processes, EA at the moment relies on a really intense selection process. So the people who end up working in EA orgs have mostly already taught themselves the requisite discipline, work ethic and so on.

My impression is that the people who end up working in EA organizations are not on the same tier of discipline, work ethic, commitment, etc. as elite military forces and are not really even very close?

I don't say that to disparage EA direct workers -- I'm involved in direct work myself -- but my sense is that much more is possible. That said, as you mention, the amount of discipline needed may simply not be as high.

AppliedDivinityStudies
Yeah again, for highly creative intellectual labor on multi-decade timescale, I'm not really convinced that working super hard or having no personal life or whatever is actually helpful. But I might be fooling myself since this view is very self-serving.

I used to listen to the podcast of a former Navy SEAL, and he argues that the idea of obedient drones is totally off for SEALs; I got the impression they learn a lot of specialized skills for strategic warfare. Here's an article he wrote about this (I haven't read it myself): https://www.businessinsider.com/navy-seal-jocko-willink-debunks-military-blind-obedience-2018-6

I recently learned about Training for Good, a Charity Entrepreneurship-incubated project, which seems to address some of these problems. They might be worth checking out.

I think this is a great exercise to think about, especially in light of somewhat-recent discussion on how competitive jobs at EA orgs are. There seems to be plenty of room for more people working on EA projects, and I agree that it’s probably good to fill that opportunity. Some loose thoughts:

There seem to be two basic ways of getting skilled people working on EA cause areas:
1. Selectively recruiting people who already have the skills.
2. Recruiting promising people who might not yet have the needed skills and training them.

Individual organizations can choose both options, depending on their level of resources. But if most organizations choose option 1, the EA community might be underutilizing its potential pool of human resources. So we might want the community in general to use option 2, so that everyone who wants to be involved with EA can have a role—even if individual EA organizations still choose option 1. For this to happen, the EA community would probably need a program whereby motivated people can choose a skillset to learn, are taught that skillset, and are matched with a job at the end of the process. 

Currently, motivated people who don’t yet possess skills are placed into a jumble of 1-on-1 conversations, 80k advising calls, and fellowship and internship listings. Having those calls and filling out internship and fellowship applications takes a ton of time and mental energy, and might leave people more confused than they were initially. A well-run training program could eliminate many of these inefficiencies and reduce the risk that interested people won’t be able to find a job in EA. 

We can roughly rank skill-building methods by the number of people they reach ("scale") and the depth of training that they provide. In the list below, "high depth" skill development could lead to being hired for that skill (when one would not have been hired for it otherwise), "medium depth" could warrant a promotion or increase in seniority level, and "low depth" enhances knowledge in ways that help someone perform their job better, but probably won't lead to new positions or higher status.

  • Internal development within organizations, like Aaron Gertler mentioned (small scale, medium depth)
  • Internship/fellowship programs (medium scale, medium depth)
  • One-off workshops and lectures (small scale, low depth)
  • Cause area-specific fellowships, like EA Cambridge's AGI Safety Fellowship (large scale, low depth) 
  • A training program like the one I described above (large scale, high depth)
  • An EA university, as proposed here (large scale, high depth)

If we choose option 2, we probably want large scale, high depth ways to train people. I’m interested in hearing people’s thoughts on whether this is a good way to evaluate skill-building methods.

One caveat: there’s a lot more interest in working for the military than there is in working for EA orgs. Since this interest already exists, the military just needs to capitalize on it (although they still spend lots of money on recruitment ads and programs like ROTC). The EA community doesn’t even have great name recognition, so it’s probably premature to assume that we’d have waves of people signing up for such a training program—but it’s possible that we could get to that point with time.

Rethink Priorities has an internship program.

If I recall correctly, they got 700 applications in the first round, so there could be a lot more funding for internship programs that have the capacity to absorb it.

4 Comments

I recently read Can't Hurt Me by David Goggins, as well as Living with a SEAL about him, and found both pretty appealing. I also wondered whether EA could learn anything from this approach, and am pretty sure that this is indeed the case, at least for a subset of people. There is surely some risk of his "no bullshit / total honesty / never quit" attitude being very detrimental to some, but I assume it can be quite helpful for others.

In a way, CFAR workshops seem to go in a similar-ish direction, don't they? Just much more compressed. So one hypothetical option to think about would be scaling them up into a multi-month program for highly ambitious and driven people who prioritize maximizing their impact to an unusually high degree. Thinking about it, this does indeed sound at least somewhat like what Charity Entrepreneurship is doing. Although that's a pretty particular and rather "object-level" approach, so I can imagine that alternatives requiring similarly high levels of commitment but with a different focus could be very valuable.

I think a related question is:

"How much less effective would a project have to be for it to be worth it in terms of possible effectiveness and training value?"

i.e. is it worth funding moonshots with lower EV than, say, GiveDirectly, if they might find great leaders?

This seems related to Ben Todd's recent comment that EA has a leadership bottleneck. If that's true, why is training more leaders not a top priority? Maybe I'm misunderstanding something. https://twitter.com/ben_j_todd/status/1423318856622346249

What makes you think it isn't a top priority to train more leaders?

Put another way, what is your current impression of EA's "top priorities", on a community building / professional development level?

(CEA is keen on giving people opportunities to run projects, they pay for books and other resources on professional development, and overall seem to care a lot about helping staff prepare for future leadership if they want to do so. I'd guess that Open Phil and other longstanding orgs are similar?)
