Archived version: https://archive.ph/2wG0k#selection-1743.0-1743.242

Edit: this is getting lots of downvotes. I don't mind this, because the article isn't great, and probably isn't a good use of most people's time. I just thought it was a relevant article to flag, given it explicitly mentions the possibility of an opportunistic power grab in the EA space.


The most relevant quote:

"This means our faction (more conservative, pronatalist, long-termist-civilization-building-focused, likely to self fund) is now 100X more likely to become a real, dominant faction in the EA space," Simone wrote in a text message on November 12.

Today, as I was doomscrolling through my favorite good-faith EA criticism from Timnit Gebru, I saw this article shared. Of course, I took the bait. It's a long article, and EA/longtermism are only mentioned a few times, but I have pasted a few relevant quotes below.

"I do not think humanity is in a great situation right now. And I think if somebody doesn't fix the problem, we could be gone," Malcolm half-shouted as he pushed his sniffling 18-month-old, Torsten, back and forth in a child-size Tonka truck.

Along with his 3-year-old brother, Octavian, and his newborn sister, Titan Invictus, Torsten has unwittingly joined an audacious experiment. According to his parents' calculations, as long as each of their descendants can commit to having at least eight children for just 11 generations, the Collins bloodline will eventually outnumber the current human population. 

If they succeed, Malcolm continued, "we could set the future of our species."
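(Presumably the calculation behind this is simple compounding: 8^11 ≈ 8.59 billion descendants at generation eleven, slightly more than the roughly 8 billion people alive today. The article doesn't spell it out, so this is an inference.)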

 

Malcolm, 36, and his wife, Simone, 35, are "pronatalists," part of a quiet but growing movement taking hold in wealthy tech and venture-capitalist circles. People like the Collinses fear that falling birth rates in certain developed countries like the United States and most of Europe will lead to the extinction of cultures, the breakdown of economies, and, ultimately, the collapse of civilization. It's a theory that Elon Musk has championed on his Twitter feed, that Ross Douthat has defended in The New York Times' opinion pages, and that Joe Rogan and the billionaire venture capitalist Marc Andreessen bantered about on "The Joe Rogan Experience." It's also, alarmingly, been used by some to justify white supremacy around the world, from the tiki-torch-carrying marchers in Charlottesville, Virginia, chanting "You will not replace us" to the mosque shooter in Christchurch, New Zealand, who opened his 2019 manifesto: "It's the birthrates. It's the birthrates. It's the birthrates."

 

While pronatalism is often associated with religious extremism, the version now trending in this community has more in common with dystopian sci-fi. The Collinses, who identify as secular Calvinists, are particularly drawn to the tenet of predestination, which suggests that certain people are chosen to be superior on earth and that free will is an illusion. They believe pronatalism is a natural extension of the philosophical movements sweeping tech hubs like the Silicon Hills of Austin, Texas. Our conversations frequently return to transhumanism (efforts to merge human and machine capabilities to create superior beings), longtermism (a philosophy that argues the true cost of human extinction wouldn't be the death of billions today but the preemptive loss of trillions, or more, unborn future people), and effective altruism (or EA, a philanthropic system currently focused on preventing artificial intelligence from wiping out the human population).

 

In February, the PayPal cofounder Luke Nosek, a close Musk ally, hosted a gathering at his home on Austin's Lake Travis to discuss "The End of Western Civilization," another common catchphrase in the birth-rate discourse.

 

These worries tend to focus on one class of people in particular, which pronatalists use various euphemisms to express. In August, Elon's father, Errol Musk, told me that he was worried about low birth rates in what he called "productive nations." The Collinses call it "cosmopolitan society." Elon Musk himself has tweeted about the movie "Idiocracy," in which the intelligent elite stop procreating, allowing the unintelligent to populate the earth.

 

Musk was echoing an argument made by Nick Bostrom, one of the founding fathers of longtermism, who wrote that he worried declining fertility among "intellectually talented individuals" could lead to the demise of "advanced civilized society." Émile P. Torres, a former longtermist philosopher who has become one of the movement's most outspoken critics, put it more bluntly: "The longtermist view itself implies that really, people in rich countries matter more."

A source who worked closely with Musk for several years described this thinking as core to the billionaire's pronatalist ideology. "He's very serious about the idea that your wealth is directly linked to your IQ," he said. The source, who spoke on the condition of anonymity for this article, also said Musk urged "all the rich men he knew" to have as many children as possible. 

Musk's ties to the EA and longtermist communities have been gradually revealed in recent months. In September, text logs released as part of Musk's legal battle with Twitter showed conversations between Musk and the prominent longtermist William MacAskill, who works at Oxford's Future of Humanity Institute, where Musk is a major donor. In the messages, MacAskill offered to introduce Musk to Sam Bankman-Fried, a now-disgraced cryptocurrency entrepreneur who had donated millions of dollars to longtermist organizations.

MacAskill has never explicitly endorsed pronatalism, and he declined to be interviewed for this article. He did, however, devote a chapter of his best-selling book, "What We Owe the Future," to his fear that dwindling birth rates would lead to "technological stagnation," which would increase the likelihood of extinction or civilizational collapse. One solution he offered was cloning or genetically optimizing a small subset of the population to have "Einstein-level research abilities" to "compensate for having fewer people overall."

Malcolm said he was glad to see Musk bring these issues to the forefront. "He's not as afraid of being canceled as everyone else," Malcolm told me. "Any smart person with a certain cultural aesthetics of their life is looking at this world and saying, 'How do we create intergenerationally, durable cultures that will lead to our species being a diverse, thriving, innovative interplanetary empire one day that isn't at risk from, you know, a single asteroid strike or a single huge disease?'"

 

The Collinses worry that the overlap between the types of people deciding not to have children with the part of the population that values things like gay rights, education for women, and climate activism — traits they believe are genetically coded — is so great that these values could ultimately disappear.

 

She also weighed in on the stunning implosion of Sam Bankman-Fried's crypto exchange FTX, which represented one of the largest financial hubs for the effective-altruism movement. The Collinses, who never directly associated with the top Democratic donor Bankman-Fried, spied an opportunity in his demise.

"This means our faction (more conservative, pronatalist, long-termist-civilization-building-focused, likely to self fund) is now 100X more likely to become a real, dominant faction in the EA space," Simone wrote in a text message on November 12.

I personally want nothing to do with a faction of people focused on genetic improvement and on low birth rates in "Western Civilization". I can see how longtermist rhetoric can be easily co-opted for this, and how this might implicate EA as a result. If this is a group of well-funded people who see the recent FTX events as an opportunity, that's another reason EA/longtermist priorities might shift.

Can people who are more involved in, or have insight into, the Austin and Bay Area EA communities give us a better sense of what this is about? Can people with a better understanding of longtermism clarify the extent of the overlap here? And can Will, or EA, or longtermism clearly dissociate from this "faction" of people, if their goal is to opportunistically use EA to further their ends, or if there's significant disagreement in terms of values and ideology?


Comments

Hi! I strongly endorse pronatalism, and I will readily admit to wanting to reduce x-risk in order to keep my family safe.

Reviving this old thread to discuss the animal welfare objection to pro-natalism, which I think is changing my mind on the subject. I'm a regular listener to Simone and Malcolm Collins's podcast. Since maybe 2021 I've gone through an arc: first fairly neutral, then strongly pro-natalist, then pro-natalist but not rating it as an effective cause area, and now entering a fourth phase where I might reject pro-natalism altogether.

I value animal welfare, and at least on an intellectual level I care equally about animals' welfare and humanity's. For every additional human we bring into existence, at a time in history when humans have never eaten more meat per capita, you will get, in expectation, years or--depending on their diet--perhaps even hundreds of years of animal suffering induced by the additional consumer demand for meat. This is known as the meat-eater problem, but I haven't seen anyone explicitly connect it to pro-natalism yet. It seems like an obvious connection to make.
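To give a rough sense of scale, here is a minimal back-of-the-envelope sketch (the consumption and lifespan figures are placeholder assumptions of mine, counting chickens only, not figures from any source above):

    # Toy estimate of farmed-animal-years induced per additional
    # meat-eating human. All numbers are illustrative placeholders.
    chickens_per_person_per_year = 25  # assumed annual consumption
    broiler_lifespan_years = 0.12      # roughly six to seven weeks
    human_lifespan_years = 80

    animal_years = (chickens_per_person_per_year
                    * broiler_lifespan_years
                    * human_lifespan_years)
    print(animal_years)  # -> 240.0

Under these placeholders, one additional lifelong meat eater corresponds to a couple of hundred chicken-years of factory-farmed life, which is the "hundreds of years" order of magnitude mentioned above.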

There are significant caveats to add:

  • This is not an argument against the value of having your own kids, whom you then raise with appropriate respect for the welfare of other sentient creatures. While you can't control their choices as adults, if you raise them right, the amount of suffering you expect them to cause will be substantially reduced, potentially enough to make having them a net-positive choice. However, pro-natalism as a political movement aimed at raising birth rates at large will likely cause animal suffering that outweighs the value of the human happiness it creates.
  • In the long term, we will hopefully invent forms of delicious meat, like cultured meat, that do not involve sentient animal suffering. The average person might still eat some farmed meat even then, but with delicious cultured options available, public opinion may come to support welfare standards high enough that farmed animals' lives are at least net positive. When that happens, pro-natalism might make more sense. But we don't know when cultured meat will arrive: widespread adoption could be several decades away in a slower AGI-timeline world, or a cultural or legal turn could block it even if it is technically possible.
  • I anticipate some people will argue that more humans now will make the long-term future go well, because in expectation this creates more people over the long term. I think this is a reasonable position, but I don't find it convincing because of the problem of moral cluelessness: there is far too much random chaos (in the butterfly-effect sense of the term) for us to have any idea what the effect of more people now will be on even the next few generations.

I might make a top level post soon to discuss this, but in the meantime I'm curious if you have any clear response to the animal welfare objection to pro-natalism.

Great! I also want to reduce x-risk to keep my family safe. But do you also strongly endorse the claims listed in the article that are attributed to pronatalism, and do you consider yourself an EA / a longtermist?

i.e.
"fear that falling birth rates in certain developed countries like the United States and most of Europe will lead to the extinction of cultures, the breakdown of economies, and, ultimately, the collapse of civilization."

"worry that the overlap between the types of people deciding not to have children with the part of the population that values things like gay rights, education for women, and climate activism — traits they believe are genetically coded — is so great that these values could ultimately disappear."

Do you think focusing on birth rates in "Western Civilization" is a good way of creating 'intergenerationally, durable cultures that will lead to our species being a diverse, thriving, innovative interplanetary empire one day that isn't at risk from, you know, a single asteroid strike or a single huge disease?', and do you think it's something that longtermists should focus on?

I think the crux here might be differences around the definition of what pronatalism is. I'm clearly not advocating for antinatalism, and you can argue that the article is misrepresenting pronatalism.

But call it whatever you prefer; it doesn't change the concern about this group of people holding this set of views, the explicit claim about their "faction becoming a real, dominant faction in the EA space", or their well-funded nature. It should be seen as a clear potential risk for the EA movement going forward, if the movement does not in fact endorse their ideology. If you're a pronatalist and don't think these views represent your movement, this concern applies to you too (perhaps even more so).
 

Re: "fear that falling birth rates [...] collapse of civilization."

No, this is not one of the things that scares me. Also, birth rates decline predictably once a nation is developed, so if this were a significant concern, it would end up hitting China and India just as hard as it is currently hitting the US and Europe.

Re: "worry that the overlap [...] could ultimately disappear."

No. Adoption of Progressive ideology is a memetic phenomenon, with mild to no genetic influence. (Update, 2023-04-03: I don't endorse this claim, actually. I also don't endorse the quoted "worry".)

Do you think focusing on birth rates in "Western Civilization" is a good way of creating 'intergenerationally, durable cultures that will lead to our species being a diverse, thriving, innovative interplanetary empire one day that isn't at risk from, you know, a single asteroid strike or a single huge disease?', and do you think it's something that longtermists should focus on?

I guess this intervention would be better than nothing, strictly speaking. The mechanism of action here is "people have kids" -> {"people feel like they have a stake in the future", "people want to protect their descendants"} -> "people become more aligned with longtermism". I don't think this is a particularly effective intervention.

Do you consider yourself a longtermist?

Yes.

Do you consider yourself an EA?

Eh, maybe.

Then it sounds like your idea of pronatalism and the Collinses' idea of pronatalism look quite different; if the article had been written about the set of views you've expressed, I probably wouldn't be sharing it.

[comment deleted]

You might be interested to know that they wrote an EA Forum post. The post seems to be quite long and has probably not been read fully by many people. They also mostly ignore immigration. I consider https://astralcodexten.substack.com/p/slightly-against-underpopulation to be generally a good rebuttal.

Oh wow, thanks! Well, that ticks off most of the bullet points from my earlier comment, I guess. Yeah, I think they definitely have some takes I wouldn't want EA to be associated with.

Thanks for saying what you think and raising concerns that matter to you. My concern is that this kind of post pushes on what is and isn't acceptable in EA in ways that are symmetric rather than asymmetric with respect to the truth, and doesn't support truth-seeking.

I wish this post were more about disagreements with specific claims, or arguments about how longtermism does not support pronatalism (or perhaps not this kind of pronatalism?), or digging into concerns that the community including pronatalist elements (or these ones) would be bad even if it made sense under longtermist priorities (or regardless of whether it did).

I appreciate people raising concerns and saying what they think, but I also want to push us to engage with many parts of strange or uncomfortable or bad-seeming (or indeed bad!) worldviews when we evaluate them. (I think Nathan does a good job here of saying what evidence he does and doesn't see for concern around money in EA, and what his feelings are regardless of whether they fit the evidence. I think this post could benefit from that kind of orientation.)

In particular, I don't want it to be intra-EA politically unacceptable to explore questions raised by pronatalists or find their arguments compelling, that seems like a really bad outcome to me.

The main reason for the post is not to start a discussion on whether the Collinses' brand of pronatalism is appropriate or a logical conclusion of longtermism. I already have a fairly settled view on this, and if we end up sitting here discussing the merits of this type of pronatalism and suggesting that it is a natural conclusion of longtermism, I'm simply going to reject longtermism, or at least many attempts to implement it in practice.

The main reason for the post is to serve as a PSA: to bring attention to a faction that may be at least opportunistically looking to gain influence within the EA space for their own goals, not for the truth-seeking goals that you deem important, and to let others decide whether this is what they want for EA, or whether this is the type of EA they want to be associated with. I'll note that as a result of this post, someone kindly pointed me (and other readers) to that post's existence. It has since been heavily downvoted, and a comment engaging with the object-level points was left. (This has additional benefits for those of us who want nothing to do with that brand of pronatalism, like having a place to point to the next time some journalist associates these viewpoints with EA.)

If a flat-earther faction (especially one now successfully funded by the SFF) expressed a desire to become a dominant faction in the EA space to further their aims, I would make a similar post about it, and I don't think I should be expected to debunk flat-earth viewpoints before doing so. It sounds like you disagree?

I really appreciate your straightforwardness and honesty here. It would be very easy for you to pay lip service to Chana's goals, but you said what you believe, and I respect that. ... However, I very much disagree with your conclusion. Most issues are not like flat-earthism. Most of the time you will have a much better time debating the ideas you disagree with than writing PSAs about them.

This link explains some of my thinking in the area. Some of the ideas are applicable, but please don't take the obvious analogy too directly. (Also: apologies for the length of the piece. In an ideal world I would write more of my own thoughts.)

Seconding JP's point that I appreciate you being clear about your goals. Not sure what organization on that list is a flat earth one?

Thanks!

Pronatalist.org is the Collinses' organization that received funding from the SFF. I can see how that was unclear; apologies.

expressed a desire to become a dominant faction in the EA space to further their aims, I would make a similar post about this, and I don't think I should be expected to engage with debunking flat-earth viewpoints before making this post. It sounds like you disagree?

Interested in your view here!

I think the crux here is that everyone involved thinks it's obvious flat-earthers are wrong, so we're working from that shared (implicit) assumption.

I think that's not the case here (I don't even know what pro-natalism specifically claims or involves other than "more babies is good, pushing society in that direction is good", and I assume there's a huge range of thinking within that, some more libertarian, some more statist, some more racist, some more futurist, etc), and so I don't want to base a discussion on that as a shared assumption, and don't want to port in that assumption without addressing it.

Maybe you think it is equally obvious, and that we should all think so, and that if EA started debating flat earth you'd be (correctly) very concerned because some things are just obvious. But I've never figured out how to have a good conversation around "this is so obvious we shouldn't have a conversation about it" even though I think that's sometimes a reasonable point to make.

On my read, this post is not about whether having children (literal 'pro' 'natalism') is correct or not. I think having a debate about that is great, and I'm inclined towards the 'yes' side.

It's about pointing to signs suggesting the existence of a power-seeking faction within EA that (by their own admission) is attempting to coopt the movement for their own aims.

(Why the hedging in the previous paragraph: stating that your faction is "now 100X more likely to become a real, dominant faction" is not quite stating your intention to make it dominate; it just suggests it.)

I think coopting a broad altruism movement for particular people's pet causes (or selfish aims) is bad. There is a very real possibility that EA will be coopted by power-seekers and become another club for career-conscious people who want to get ahead. I support good attempts to prevent that.

This pointing is asymmetrical with respect to the question of whether the purported 'faction' is in fact a faction, and whether it is in fact trying to coopt the movement.

Sorry I missed this comment; I just got a notification on this post and realized it was here.

I don't even know what pro-natalism specifically claims or involves other than "more babies is good, pushing society in that direction is good", and I assume there's a huge range of thinking within that, some more libertarian, some more statist, some more racist, some more futurist, etc), and so I don't want to base a discussion on that as a shared assumption, and don't want to port in that assumption without addressing it.

I am specifically talking about the Collinses' brand of pronatalism here, as reported, as well as the possibility of a faction that is opportunistically seeking to co-opt the goals of the EA movement, rather than pronatalism as broad as you describe ("more babies is good, pushing society in that direction is good").

In the link (as well as in the comments above), there is discussion of some of these views. Are you happy to defend these views as things that EAs should spend more time discussing and funding on the margin? 

"fear that falling birth rates in certain developed countries like the United States and most of Europe will lead to the extinction of cultures, the breakdown of economies, and, ultimately, the collapse of civilization."

"worry that the overlap between the types of people deciding not to have children with the part of the population that values things like gay rights, education for women, and climate activism — traits they believe are genetically coded — is so great that these values could ultimately disappear."

"What is really happening is that individuals from those families with sociological profiles amenable to movements like effective altruism, progressivism, or broad Western Civilisational values are being selected out of the gene pool."

"Current birth rate trends suggest traits on which the EA community relies, such as prosociality, are being differentially selected out of populations."

Do you think focusing on birth rates in "Western Civilization" is a good way of creating "intergenerationally, durable cultures that will lead to our species being a diverse, thriving, innovative interplanetary empire one day that isn't at risk from, you know, a single asteroid strike or a single huge disease?"

To be clear, I'm not likely to engage on the object level even if you are happy to defend these points; I'm just not sure it's useful or relevant for me to spell out all the versions and parts of pronatalism I do support in order to make a post like this. I'm not even claiming that any part of pronatalism beyond what is reported here is bad!

I'm just indicating that if there's a faction of people in the EA community focused on genetic improvement and on low birth rates in "Western Civilization", I can see how longtermist rhetoric can be easily co-opted for this, and how this might implicate EA as a result. I stand by that, and I also believe it should be seen as a clear potential risk to the EA community's ability to fulfill its potential for impact! And if you're just a "more babies is good" pronatalist and don't think these views represent your movement, then this concern applies to you too (perhaps even more so).

If you do think individual EAs, funders, or EA orgs should be spending more time, on the margin, on ways to ensure Western Civilizational values remain competitive in the gene pool, that's entirely your prerogative! In that case, consider this post, as well as comments like these, an indication of the kinds of tradeoffs your movement should take into account when asking people to engage with arguments like this. (I'm reminded of similar conversations and disagreements around Bostrom's letter and "truth-seeking" here.)

I've never figured out how to have a good conversation around "this is so obvious we shouldn't have a conversation about it" even though I think that's sometimes a reasonable point to make.

Some things worth considering:

What kinds of harms could these discussions cause? For example, should we dedicate more EA Forum discussions to Bostrom's use of a racial slur and whether that's appropriate in the pursuit of truth-seeking? How action-guiding are these discussions? Should we spend more time on racial differences in IQ and their implications for EA community building? Are discussions of these topics actually a good indicator of people who deeply value truth-seeking, or just of people who are edgy and want to signal that in a community where doing so is rewarded? Is this a strong signal, or a noisy one? Not everyone values outward signals of "truth-seeking" above all, especially if those signals can also be cover for harmful ideas. Being open-minded doesn't mean giving the same space to every idea. Which groups of people are harmed, and which might (wrongly) never join EA, because of an unnecessary focus on these discussions?

I think if you believe a topic is likely to be action-guiding in ways that will benefit more than harm in expectation, then it's worth talking about. Unfortunately, on priors, cause areas that sound like ensuring the survival of Western Civilization combined with strong genetic determinism do not have a strong track record here, and I'm happy to dismiss them by default unless there's good reason to believe that's misguided (whereas talking about feeling anxious about a sharp increase in EA funding does not have the same issue).

I've stumbled here after getting more interested in the object-level debate around pronatalism. I am glad you posted this because, in the abstract, I think it's worthwhile to point out where someone may not be engaging in good faith within our community.

Having said that, I wish you had framed the Collinses' actions with a little more good faith yourself. I do not consider that one quoted text message to be evidence of an "opportunistic power grab". I think it's probably a bit unhealthy to see our movement in terms of competing factions, and to seek wins for one's own faction through strategic means rather than through open debate.

But I'm not sure Malcolm Collins is quite there, on the evidence you've presented. It seems like he's happy that (according to him) his own favored cause area will get more attention (in the months since this was posted, I don't think his prediction has proven correct). I don't think that's the same as actively seeking a power grab--it might just be a slightly cynical, though realistic, view that even in a community that tries to promote healthy epistemics, sociological forces are going to influence what we do.

This article is not very well-reported and feels to me like it flits between subjects without making it clear how those people actually relate to each other and how much they influenced each other. Several of the most damning things it attributes to the Collinses (who I have never heard of) are paraphrased, so I am somewhat reserving judgment until I know whether that's actually what these Collinses said. 


That said:

According to his parents' calculations, as long as each of their descendants can commit to having at least eight children for just 11 generations, the Collins bloodline will eventually outnumber the current human population. 

If they succeed, Malcolm continued, "we could set the future of our species."



This is ridiculous. There are no subcultures which average eight children except ones like Quiverfull, which have incredible attrition and are also very damaging to the people raised in them. To my knowledge no historical society has averaged eight children. 'As long as each of their descendants can commit to having at least eight children for just 11 generations' is not any more plausible than 'as long as we win the lottery at least one Friday every year'. And I think 'planning out biological children for eleven generations' is suggestive of being pretty much incompetent at thinking about the future. But since this is paraphrased, I do want to be open to the possibility that the Collinses are pursuing something much less stupid than the author implies.
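To put a rough number on the attrition point: if only a fraction r of each generation actually keeps the pact, the bloodline grows like (8r)^n rather than 8^n. A toy calculation (the retention rates below are made-up placeholders, not data):

    # Expected bloodline size after n generations if every retained
    # descendant has f children and a fraction r of children keep the pact.
    def bloodline_size(f: float, r: float, n: int) -> float:
        return (f * r) ** n

    print(bloodline_size(8, 1.0, 11))   # full compliance: ~8.59 billion
    print(bloodline_size(8, 0.5, 11))   # 50% attrition per generation: ~4.2 million
    print(bloodline_size(8, 0.25, 11))  # 75% attrition per generation: 2,048

Full compliance only just clears the current world population; even 50% per-generation attrition leaves the projection short by more than three orders of magnitude.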

The text message claiming they intend to deliberately gain influence in the effective altruist movement does give me pause, because it's one of the only parts of the article that actually quotes them in their own words. I think effective altruism is doing stuff wildly more valuable than this, and should continue to do so, and should continue to not give the time of day to reasoning of the quality on display in this article. 

But I also think they're just hilariously wrong about the odds that an altruistic movement focused on either making the world better for people alive today or on surviving the next century will be coopted by 'well, if your children commit to having eight children...' 

I have kids, and want more of them. I think the gap between people's desired fertility (generally above 2 kids per woman) and their achieved fertility (generally below) points to an important problem for some people to think about and work on, and I think EAs who want kids should have them. I think pro-natalism is fine, and most pro-natalists don't believe any of the stuff attributed to the Collinses in this article. But I'm not optimistic these people would be worth working with, unless the article grossly misrepresented the quality of their thinking.

I think the “for just 11 generations” thing is obviously a joke. Obviously they can’t influence the culture of their kids by that much.

Same thing with the old Epstein “impregnate 20 women in a day” thing. It’s obviously impossible.

Yeah I basically agree with all of this!

But I also think they're just hilariously wrong about the odds that an altruistic movement focused on either making the world better for people alive today or on surviving the next century will be coopted by 'well, if your children commit to having eight children...' 

Nitpick: longtermism does do work beyond the next century, right? But yeah, I agree. I don't think EA will be co-opted like this, but I also don't think the co-opting would look like a group of people randomly coming up to EA and saying "hey, you all should have 8 children, let's sign a pact".

But it could look something like:

  • Funding research and advocacy that advocates for increasing birth rates/demographics-related topics
  • Funding genetic engineering or IVF research or reproductive technologies
  • Proposing population collapse as a new cause area
  • Using shared language like "preserving future generations", "preventing technological stagnation", "preserving longtermist values/EA values", and "ensuring moral progress" to justify things that might pass the bar on longtermist grounds, but that are also beneficial for population growth

It could also be the case that a large influx of funding means that the longtermism funding bar becomes much lower, such that this seems fine compared to a lot of other things that are being funded. After all, the other areas are talent constrained anyway, so it's not like funding this is harmful.

Of course, these things could be good to fund or research regardless, and I'm not suggesting they shouldn't be funded on principle. But the point is that if there's a plan to co-opt the EA/longtermist movement or piggyback off its influence, it's not going to be obvious. This is getting a little conspiratorial (I probably would have dismissed it if it weren't a literal quote), and none of these claims are particularly falsifiable, so it's probably not worth too much discussion time anyway. I'm just bringing this to the attention of the people who should care about it, and inviting people who might have more reliable information about the overlaps in these subcultures to chime in.

One solution [MacAskill] offered was cloning or genetically optimizing a small subset of the population to have "Einstein-level research abilities" to "compensate for having fewer people overall."

Is this real? It sounds awful. (I haven't read the book)

From What We Owe the Future, near the end of Chapter 7 on Stagnation:

Advances in biotechnology could provide another pathway to rebooting growth. If scientists with Einstein-level research abilities were cloned and trained from an early age, or if human beings were genetically engineered to have greater research abilities, this could compensate for having fewer people overall and thereby sustain technological progress. But in addition to questions of technological feasibility, there will likely be regulatory prohibitions and strong social norms against the use of this technology—especially against the most radical forms, which would be necessary to multiply effective research efforts manyfold. Human cloning is already within technological reach, but as a global society we’ve decided not to go forward with it—which may well be for the best, as human cloning could plausibly increase the risk of bad value lock-in.

In sum, if we neither develop and deploy breakthrough technology in time nor see a renewed population boom, it doesn’t look like we’ll be able to keep quadrupling research effort. In that case, stagnation seems likely.

Thanks. It's not as awful as the partial quote made it sound, but in my eyes it's still bad, and it will make me think twice about associating with MacAskill.

Did you want to elaborate on this in any way?
