
smithee

121 karma · Joined May 2018

Posts: 1


Comments: 11

Thanks! My understanding of why CC is controversial: Lomborg was once a member of Greenpeace, then became disillusioned with popular environmentalism and wrote the extremely controversial The Skeptical Environmentalist, arguing against most popular environmental causes. The Economist and The Wall Street Journal celebrated it as a fresh new look, while Scientific American lambasted Lomborg as wrong and even scientifically dishonest. One Danish government commission accused Lomborg of fabricating data and plagiarism, while another criticized the first commission's investigation.

I've read the book and tried to form my own view, but the rabbit hole is too deep. If anyone's interested in the object level question, try Slate Star Codex.

In any case, Lomborg is highly controversial in some circles and has had extremely high reputational stakes in the environmentalism debate for over a decade. So on second thought, I largely agree with Jan: EA should be wary of close association with controversial figures (not to mention possibly unethical ones).

Separately, I think this gets at a more central question about EA's nature: Will we always demand truth at all costs, or is good enough really good enough? Will we work with pragmatic allies who don't share all of our underlying motivations? EA evolved in very high-fidelity, academic-style circles, where truth-seeking and intelligence are paramount. But if doing good is the single objective, then while truth is clearly extremely important, so is influence. GiveWell claims to have moved ~$500m at this point; CC works in an arena with tens of billions at stake. Should we accept lower intellectual rigor if it means increasing our scale 100x?

I default to a commitment to truth, if only because lowering your standards is always possible at a later date, while regaining intellectual rigor likely is not. But it's certainly a question worth discussing.

More generally, it seems really strange that EA and the Copenhagen Consensus haven't been in closer contact. Their mission is very EA: "to address a fundamental, but overlooked topic in international development: In a world with limited budgets and attention spans, we need to find effective ways to do the most good for the most people." And importantly, they're very legitimate, established, and influential.

Bjorn Lomborg, head of the Copenhagen Consensus (CC), has a relatively high public profile and has been named in several rankings of top public intellectuals. Even better, CC has quite a bit of political influence: it claims some responsibility for or influence over Denmark's $2.9 billion anti-HIV program and George W. Bush's $1.2B Malaria Initiative, and its research was cited by David Cameron in a $4B global nutrition pledge.

A quick brainstorm of ways EA could connect with CC:

  • 80,000 Hours could direct readers to work with CC.
  • GiveWell shares an interest in cost-benefit analysis of development interventions, and each organization likely has insights the other has missed. They could also refer job candidates to each other.
  • The Global Priorities Institute, being (a) located in academia and (b) interested in cause prioritization, seems like a great match for CC. GPI wants economists to work on prioritization, and CC has relationships with hundreds of academic economists who have written its papers.
  • CC has political influence that EA seems not to; EA (maybe) receives more public attention and can more easily publicize ideas.

To be clear, there has been EA contact with CC before. Will MacAskill is featured on CC's Testimonials page, and 80k has spoken to CC several times in years past. But it seems like there should be more awareness of CC as a massively influential, EA-aligned organization with strong inroads in politics (i.e., where EA seems weakest).

Does anyone know of further collaboration with CC? Any good ideas on what collaborations could work?

Hi! Do you happen to know about the current AI Impacts hiring process?

1) I think it's important to try to specify exactly what 80k can improve. They're an extremely busy organization that doesn't have time for everything they'd like to do, so they can only improve if we can identify specific high-leverage uses of their time. General hopes for higher accuracy or helpfulness are likely not actionable.

2) I definitely agree with the worries about competition. I've been quite surprised to see how difficult it is to get hired at many EA orgs, often with <5% of applicants receiving offers. Because people often make years of plans based on thinking they have a realistic chance of working at these organizations, it's important that they understand their true chances. I think 80k should try to better publicize acceptance rates for certain jobs and, if possible, the kinds of resumes and experience that are actually necessary to be accepted.
