
This article was written as part of the EA Cambridge Project-Based Fellowship.

 

Biosecurity, biosafety, and biorisk traditionally referred to bioterrorism, laboratory containment, and the management of biological hazards within lab settings (Renault et al., 2021; Gao et al., 2025; CDC, 2024; ISO 35001, 2019). These concepts have since expanded to encompass biological risk in agriculture, the environment, and public health. This expansion now includes biotechnology, where democratization has enabled consumer access to DNA synthesis, CRISPR kits, and other biological tools (Jackson et al., 2019; Wheeler et al., 2024; Guerrini et al., 2022). This has produced a diverse landscape of actors, applications, and risk profiles that traditional frameworks, designed for formal lab settings, were not built to govern. One such gap is DIY biology, which I examine here as an emerging and under-governed frontier of biotechnology democratization.

In this article, I first provide an overview of DIY biology and how the regulatory environment has shaped its growth so far. I then make two arguments:

  1. Structural barriers are eroding, and we should expect dramatic growth in DIY biology and in the consumerization of biological technologies more generally.
  2. Similar 'democratization' developments in adjacent fields offer policy lessons that can be applied to DIY biology.

     

What is DIY Biology?

DIY biology is a grassroots movement that promotes the accessibility of biological knowledge and biotechnological practices outside formal institutions (Landrain et al., 2013; Scheifele & Burkett, 2016; Eireiner, 2025). It is broadly classified into three practices: community biology, garage biology, and biohacking.

The movement has roots in the early 2000s and has grown over time, punctuated by a few notable catalyst events. The publication of the human genome in 2003 made biological data publicly accessible at an unprecedented scale. Around the same time, the cost of lab equipment and DNA sequencing was falling rapidly, making entry far more accessible. Rob Carlson, a synthetic biologist, argued in Wired magazine that the era of garage biology had arrived and that a basic DNA lab could be assembled cheaply from surplus equipment (Carlson, 2005). The community became more organized in 2008, when two synthetic biologists founded DIYbio.org, an open platform for sharing protocols, techniques, and ideas (Meyer, 2013; Landrain et al., 2013). By the early 2010s, physical community labs — Genspace in Brooklyn and BioCurious in Silicon Valley — were being established to bring institutional-grade capabilities into non-institutional settings (Meyer & Vergnaud, 2020; Scheifele & Burkett, 2016).

Today, the DIYbiosphere — an open-source database maintained by DIYbio.org — lists 64 community labs and over 30 active groups worldwide, though the real number is likely higher. Notably, the community is no longer a ragtag group of amateur biologists, but has a serious depth of knowledge. Surveys indicate that many members hold advanced degrees in natural sciences (Eireiner, 2025; Landrain et al., 2013). In other words, what began as a handful of garage enthusiasts has become a well-educated, globally networked community — and it is still growing.

 

Interplay between regulatory environments and the growth of DIY communities

The growth, however, is uneven across countries and has been shaped by many factors, including regulatory environment, infrastructure, funding, and culture. The United States is the most mature ecosystem and, not surprisingly, also the least regulated. Canada is moderately permissive, with the Public Health Agency of Canada (PHAC) actively engaging DIY communities (Eireiner, 2025). But the ecosystem remains smaller, in part because DIY labs lack access to federal research funding and remain dependent on university partnerships for space and resources. The UK is the most revealing case of how slight changes in regulatory posture can significantly affect DIY biology communities. The UK's DIY biology scene peaked around 2016–2017 and has been declining since, with several community labs closing (Eireiner, 2025). The decline coincides with a series of HSE notification requirements issued over time, beginning with the requirement that anyone modifying any organism notify the regulator (HSE, 2014).

This uneven growth has a direct implication for risk as well. DIY biorisk concentrates where regulation is lightest and infrastructure is deepest — currently the United States. However, given that biological risks are often difficult to contain (as COVID-19 demonstrated), a governance gap in one jurisdiction potentially carries risk for the whole world.

 

Structural barriers to DIY biorisk are eroding

Within the literature, the common arguments for why DIY biology does not pose a serious biorisk tend to fall into three categories:

  1. Amateurs lack the tacit knowledge — the hands-on skills, intuition, and troubleshooting ability — that only comes from formal training in the biological sciences.
  2. Establishing an operational molecular biology wet-lab outside an institution is prohibitively difficult and expensive.
  3. Regulatory and commercial gatekeeping mechanisms prevent amateurs from acquiring dangerous biological materials.

Each of these barriers is weaker than commonly assumed, and all three are eroding simultaneously.

The tacit knowledge gap within DIY biology may be narrower than assumed. One of the main arguments for why DIY biology risks might be overstated is that meaningful biological work depends on hands-on skills, troubleshooting, and familiarity with biological materials that amateurs simply do not have (Jefferson et al., 2014; Marris et al., 2014). However, there are a few issues with this argument when applied to DIY biology. First, it treats the gap as static and implicitly assumes that someone without formal training as a biologist cannot pick up the relevant skills over time by being embedded within the DIY community. Indeed, the ecosystem surrounding DIY biology can itself be a vehicle for closing the gap. The community structure of the DIY lab means that it contains both novices and trained professionals (Landrain et al., 2013; Eireiner, 2025). Furthermore, community labs offer a variety of hands-on workshops. On the theoretical side, a number of open resources (e.g., Coursera, MIT OCW) provide full courses in molecular biology, synthetic biology, and genetic engineering. Open-source project repositories walk users through protocols like E. coli GFP expression step by step. And free tools for protein and genomic analysis are publicly available, though in my assessment they require intermediate-level knowledge to use effectively. None of this guarantees sophisticated capability on its own, but it does mean that a sufficiently motivated person can progressively build the knowledge and competencies needed to carry out wet-lab work without ever entering the formal training system.

The operational wet-lab barrier has been crossed. A second prominent argument against DIY biorisk has been the difficulty of establishing an operational molecular biology wet-lab, especially in isolation. The key obstacles include acquiring permits for the legal handling of biological materials, sourcing reagents and chemicals, meeting infrastructure and containment requirements, covering financial costs, and managing waste disposal (Scheifele & Burkett, 2016; Eireiner, 2025; WHO, 2020; CDC/NIH BMBL, 2020). Based on my own research, I estimate that a plausible home wet-lab could be established for as little as $16,000 (I will not share the methodology here due to infohazard risk). And the 2026 discovery of an illegal home wet-lab in Las Vegas provides proof that these barriers — regulatory, financial, logistical — have in fact been crossed (AP, 2026; ABC News, 2026).

Nucleic acid screening as a weakening chokepoint. If the knowledge barrier is eroding, the next structural safeguard is nucleic acid synthesis screening — the point at which a supplier checks whether an ordered sequence poses a biosecurity concern. That chokepoint, however, rests largely on voluntary compliance. Independent commercial sequence suppliers are not uniformly required to screen orders, and those that do face real barriers: the financial cost of screening processes, the expertise required to run them (molecular biologists, computing personnel), and consumer confidentiality concerns (Wheeler et al., 2024; Crawford et al., 2024; Kane & Parker, 2024; Rose et al., 2024). If one supplier declines an order, the customer can turn to a less regulated one (Wheeler et al., 2024; Kane & Parker, 2024). 

A Common Mechanism framework was proposed in 2024 to address these gaps — screening nucleic acid sequences ≥200 base pairs against a list of sequences of concern, with eventual expansion to cover pathogenicity-enhancing sequences and benchtop synthesis devices (Wheeler et al., 2024; Sharkey et al., 2024). The framework was set to take effect in April 2025 as a federal funding condition. It never meaningfully landed. A May 2025 executive order paused implementation and directed a revised framework within 90 days; that deadline passed without new guidance (Arms Control Association, 2025). NIH announced it would still adhere to the 2024 version, but other agencies halted implementation. A peer-reviewed study documented the resulting "implementation gap" — finding that few institutions have screening capability, trained biosecurity reviewers, or resources to assess legacy constructs (Gillum & Moritz, 2025). Benchtop synthesis devices, meanwhile, remain largely outside regulatory reach (Arms Control Association, 2025). The screening chokepoint, in short, is not just voluntary — it is actively in regulatory limbo.
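To make the mechanics concrete, the list-based screening model at the heart of frameworks like the Common Mechanism can be sketched as follows. This is a deliberately toy illustration: the "sequences of concern" are arbitrary placeholders I invented, not real hazard data, and production screening pipelines use curated databases and alignment tools rather than naive substring matching.

```python
# Toy sketch of list-based nucleic acid synthesis screening.
# The concern-list entries below are arbitrary placeholders, NOT real
# hazard sequences; real frameworks compare orders against curated
# databases using alignment tools, not verbatim substring checks.

SEQUENCES_OF_CONCERN = [
    "ATGCGTACGTTAGC",   # placeholder entry
    "GGCCTTAAGGCCTA",   # placeholder entry
]

MIN_SCREEN_LENGTH = 200  # the Common Mechanism screens orders >= 200 bp

def screen_order(order: str) -> list[str]:
    """Return the concern-list entries found verbatim in the order."""
    if len(order) < MIN_SCREEN_LENGTH:
        return []  # short orders fall below the screening threshold
    return [s for s in SEQUENCES_OF_CONCERN if s in order]

# An order long enough to be screened, embedding a flagged sequence:
order = "A" * 200 + "ATGCGTACGTTAGC" + "T" * 50
print(screen_order(order))  # → ['ATGCGTACGTTAGC']
```

Note the threshold itself is a governance choice: anything below 200 bp, and anything synthesised on an unscreened benchtop device, never reaches this check at all.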

Another prominent gap in existing screening frameworks is that they screen only for select regulated sequences. For instance, in 2018, a group of Canadian researchers was easily able to order the horsepox genome sequence, which is closely related to the smallpox genome, from a commercial provider, after acquiring the digital template from an open repository, and used it to assemble the largest virus yet synthesized de novo (Evans et al., 2018). The authors themselves acknowledged that any method that can be used to assemble horsepox virus could be used to construct smallpox. It is alarming that a DIY practitioner could follow the same route and face no chokepoint intervention.

 

Democratization of adjacent technologies may compound DIY biorisk

The barriers examined above are not eroding in isolation. Technologies developed and democratized in adjacent fields are accelerating the erosion — and in some cases creating entirely new risk pathways. AI is the most significant of these, but it is not the only one.

The AI factor

AI is a general-purpose technology, and its effects on DIY biorisk are not confined to a single barrier. It simultaneously erodes the knowledge gap that separates amateurs from trained biologists, and undermines the supply-side controls that are supposed to prevent dangerous materials from being synthesised in the first place.

On knowledge, there is increasing evidence that access to AI can compensate for gaps in knowledge. A 2024 OpenAI internal assessment found that AI can assist with certain biology-related tasks — such as locating and synthesizing technical information that would previously have required specialist knowledge to find and apply. The effect has been more pronounced with newer generations of LLMs. A RAND evaluation of eight frontier models found that an agent built on OpenAI's o3 designed eGFP DNA segments that were then physically validated in the lab, and that newer models produced sequences that passed all scoring criteria, including expert review. Esvelt and colleagues at MIT demonstrated the knowledge risk more concretely: non-scientist students, given access to GPT-4 and other commercially available chatbots, identified four pandemic-capable pathogens, found reverse genetics methods, and located synthesis companies unlikely to screen orders — all within one hour.

On supply-side controls, AI compounds the screening problem. If the regulated sequence list is made public and open, malicious actors could work with non-regulated sequences and, particularly with AI-enabled protein design tools, potentially engineer novel biohazard sequences that fall outside the screening criteria (Rose et al., 2024). Wittmann et al. demonstrated this directly: using open-source AI protein design software, they generated over 75,000 variants of hazardous proteins and found that existing bioscreening tools could not reliably detect them. A follow-up study by the same group tested four screening tools that had since been patched and found that two of the four could still be evaded by AI-assisted sequence design.
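The brittleness that Wittmann et al. exposed can be illustrated with a toy sketch (the flagged sequence is a placeholder of my own invention, not real hazard data): a single point substitution is enough to defeat exact-match screening, while even crude mismatch tolerance recovers the hit. Real screeners use sequence alignment and, increasingly, function-level prediction, but the underlying failure mode is the same.

```python
# Toy illustration of why exact-match screening is brittle.
# FLAGGED is an arbitrary placeholder, not a real hazard sequence.

FLAGGED = "ATGCGTACGTTAGC"  # placeholder concern-list entry

def exact_hit(order: str) -> bool:
    """Naive screening: flag only verbatim occurrences."""
    return FLAGGED in order

def fuzzy_hit(order: str, max_mismatches: int = 2) -> bool:
    """Slide a window over the order, tolerating a few substitutions."""
    k = len(FLAGGED)
    for i in range(len(order) - k + 1):
        window = order[i : i + k]
        if sum(a != b for a, b in zip(window, FLAGGED)) <= max_mismatches:
            return True
    return False

# A single-base substitution is enough to evade the exact matcher:
variant = FLAGGED[:5] + "A" + FLAGGED[6:]   # one substitution
order = "G" * 50 + variant + "C" * 50

print(exact_hit(order))  # → False (the variant slips past)
print(fuzzy_hit(order))  # → True  (mismatch tolerance catches it)
```

AI-designed functional variants push this further: they can differ from any listed sequence by far more than a mismatch budget can absorb while preserving the hazardous function, which is why list-anchored screening degrades as design tools improve.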

The pattern extends beyond AI to other technologies like 3D printing and drones

AI is the most potent amplifier, but the pattern of adjacent technologies compounding biorisk is not confined to AI alone.

For example, consumerization of 3D printing has eroded some of the structural barriers to entry for DIY biology. Researchers have published open-source designs for functional wet-lab hardware — including liquid aspiration systems, micro-peristaltic pumps for microfluidics, custom reaction vessels for multistep chemical synthesis, and stackable cell culture inserts — all printable on consumer-grade 3D printers. Not all printed components are suitable for every biological application (material compatibility, sterilisation, and chemical resistance remain constraints), but the trajectory is clear: equipment that once required purchasing from specialist suppliers at significant cost can increasingly be manufactured at home for a few dollars per unit. Drones provide another example of how an unrelated technology can amplify biorisk in unforeseen ways. In China, malicious actors have used drones to drop infected material into pig farms to spread African swine fever, killing herds and disrupting international pork markets.

To summarize, biological risks can be amplified, or changed qualitatively, by developments in adjacent fields like AI, 3D printing, and drone technology. Current regulatory structures, organised around single-technology domains, are poorly equipped to see this, and governance frameworks that do not account for cross-technology amplification will systematically underestimate risk.

 

What should we expect as DIY biology scales?

The previous sections have argued that the structural barriers to DIY biorisk are weakening and that adjacent technologies are accelerating the erosion. The natural question is: what happens next? This is hard to say in definitive terms, but we can look at how other technologies — computing, 3D-printing — were transformed when they moved from institutional confinement to broad public access. The trajectories are unlikely to be identical, but the patterns are consistent enough to inform expectations about where DIY biology is heading.

Expectation 1: The user base will diversify, and safety assumptions built on institutional training will erode.

Computing provides the most salient example of this. From the 1950s through the 1970s, computers were confined to government agencies, universities, and large corporations, operated by trained specialists under strict oversight. A critical part of security was the competence, training, and trustworthiness of these professionals (FIPS 31, 1974; FIPS 41, 1975) — akin to formally trained bioscientists in their domain today. The advent of personal computing shattered this assumption. Millions of untrained or semi-trained users began operating these systems, and risks shifted from insider-dominated and institution-focused to distributed, accidental, and structurally harder to govern.

DIY biology suggests that a similar shift in user base is already underway. Community labs and online forums are widening participation, and the emergence of DIY CRISPR kits and benchtop DNA synthesisers has made it possible to carry out experiments outside any institutional setting. This matters because formal laboratories do not just provide equipment — they provide an infrastructure of safety: biosafety officers, containment protocols, training requirements, waste disposal systems, and incident reporting. A home lab or community lab typically has none of these. When something goes wrong in a formal lab — a spill, a containment failure, an accidental exposure — there are procedures, trained personnel, and reporting mechanisms. When the same thing goes wrong in a garage, there may be nothing.

The biohacking community offers early evidence that safety norms weaken as the user base expands beyond formally trained scientists. A growing subculture focused on personal health optimisation has normalised the self-injection of unregulated and unapproved peptides — often sourced from Chinese manufacturers, sold under "research purposes only" labelling, reconstituted at home, and dosed based on advice from Reddit forums and Discord groups. The FDA issued over 50 warning letters to peptide vendors and compounders in 2024–2025, and in June 2025 federal agents raided a major domestic peptide reseller. The compounds involved are not themselves high-risk pathogens — most are synthetic amino acid chains whose dangers are unknown rather than known. But the behavioural pattern of unregulated sourcing, home reconstitution, and crowd-sourced dosing is precisely the erosion of institutional safety norms that matters here.

Expectation 2: Harmful biological designs, once digitally distributed, may prove hard to recall and harder still to govern at the physical layer.

The trajectory of 3D-printed firearms is highly illustrative of this point. In 2020, a pseudonymous designer released the FGC-9, a 3D-printed firearm built entirely from non-regulated parts, which has since been recovered in more than 12 countries including the UK, Australia, and New Zealand. In California, statewide ghost gun recoveries rose from 26 in 2015 to over 11,000 per year by 2021. The state imposed licensing requirements for 3D-printed firearm manufacture, but ultimately had to go further; in 2026, California sued websites distributing printable firearm blueprints in an attempt to govern the digital distribution layer itself.

A related pattern is emerging in biotechnology. Public repositories and open scientific platforms make a wide range of sequence data, genetic constructs, and experimental protocols broadly accessible online. This openness is not itself the problem — it is foundational to how modern biological science works, and restricting it in an indiscriminate way would carry significant costs to legitimate research. But it does mean that the digital layer is difficult to govern, and any harmful application that draws on publicly available knowledge inherits that difficulty. 

The key question, then, is whether the transition from digital access to physical fabrication can still be meaningfully governed. Currently, commercial DNA synthesis providers serve as the primary chokepoint — but as discussed above, that chokepoint is voluntary, unevenly applied, and in regulatory limbo. Benchtop DNA synthesisers could weaken it further. These devices play a role analogous to 3D printers in firearms manufacturing: oversight is largely confined to the point of purchase, while what is synthesised afterwards is, in most jurisdictions, effectively unmonitored. Most current devices synthesise up to 120 base pairs and are aimed at institutional use, but at least one manufacturer claims to have a device capable of synthesising gene-length DNA sequences. And the capability to synthesise DNA independently, without institutional-level resources, appears to be growing — the Kilobaser, for instance, is a personal synthesiser that claims to require no specialist training to operate.

The ghost gun trajectory suggests where governance pressure will increasingly move: upstream from physical suppliers to the digital layer of sequences, designs, and protocols. In biology, however, that digital layer is not fringe infrastructure — it is part of the ordinary architecture of legitimate science. That makes restrictions both more consequential and more contested than in firearms.

Expectation 3: Capabilities for causing biological harm will cascade downwards.

The history of computing offers a useful precedent. In the mainframe era, the capacity to inflict serious damage on computer systems was concentrated among a small number of skilled specialists operating within institutional settings. Security depended in part on the competence, training, and trustworthiness of those professionals — a reliance codified in federal standards of the time (FIPS 31, 1974; FIPS 41, 1975). As personal computing spread and the internet connected millions of devices, that model broke down. The expansion of the user base created a vastly larger and more distributed attack surface. Offensive tools and techniques began circulating openly — through forums, shared repositories, and eventually commercialised exploit kits — and capabilities that had once been concentrated among institutionally embedded actors became available to a much wider population. Over time, this supported scalable and increasingly organised forms of cybercrime that were difficult to monitor or govern.

The parallels to biology are imperfect but instructive. The combination of AI-enabled deskilling (as the OpenAI, RAND, and Esvelt studies discussed above suggest), open-source protocols, and increasingly accessible synthesis tools suggests that some capabilities once confined to institutional laboratories are beginning to diffuse outward. This does not mean that mass-casualty bioweapons become trivially easy — the gap between a localised biological incident and a pandemic-capable pathogen remains substantial. But the threshold for causing deliberate, non-trivial biological harm appears to be moving downward. The relevant question is not whether amateurs can replicate state bioweapons programmes, but whether motivated, moderately skilled individuals can increasingly acquire capabilities that existing biosecurity frameworks assumed would remain institutionally bounded. The evidence reviewed in this article — from AI-assisted screening evasion to the 2026 discovery of an illegal home wet-lab in Las Vegas — suggests that the capability threshold is indeed shifting, even if it remains far from the level required for highly sophisticated attacks.

As with computing, the greatest risk is probably not the sudden appearance of a single catastrophic actor. It is the gradual expansion of a distributed base of actors with enough capability to conduct unsafe experimentation, evade existing screening controls, or cause localised but meaningful harm — whether through deliberate misuse, recklessness, or accident. That kind of risk surface is structurally harder to monitor, attribute, and govern than the institutional landscape for which current biosecurity frameworks were designed.

Expectation 4: Reactive governance in (DIY) biology will exact a steep price.

Regulation of new technologies is, as a rule, reactive. Governance frameworks tend to follow a predictable sequence: capability diffuses first, harm materialises second, and regulation arrives third. This is not necessarily a failure of foresight — in many domains, it is difficult to know what to regulate until a technology has been deployed and its risks have become visible. The technologies discussed in this article have followed exactly this pattern. Computing security developed through decades of malware outbreaks, data breaches, and infrastructure attacks before governance frameworks adapted. Governance of 3D-printed firearms remains similarly reactive — California's 2026 lawsuit against blueprint-distribution websites came only after ghost gun recoveries had already risen sharply.

In both cases, the lag was costly but still broadly compatible with learning through harm. Malware can often be patched or contained after discovery; compromised systems can be isolated and rebuilt; firearms can be seized and removed from circulation. The damage may be severe, but the causal mechanisms are, in most cases, more bounded after the fact than those of a released biological agent.

Biology differs in ways that sharply reduce the margin for reactive governance. A released biological agent cannot simply be recalled or rolled back. Unlike a cyber exploit, it can self-replicate. Unlike a firearm, it can spread beyond its intended target. And attribution — determining who released what, when, and from where — may be impossible after the fact. These properties mean that the damage-driven learning model through which computing and firearms governance evolved is poorly suited to biological risk. By the time visible harm reveals the inadequacy of existing safeguards, the consequences may already be uncontainable.

This problem is especially visible in existing screening frameworks. Current synthesis screening relies heavily on comparison against known sequences of concern — governance anchored to previously identified risks. That model was more defensible when synthesis was concentrated in a small number of institutional suppliers. It becomes weaker as synthesis capabilities diffuse outward and as AI-enabled design tools make it possible to generate functional sequences that fall outside existing screens. The screening chokepoint, in other words, was built for a more centralised and institutionally legible world than the one now emerging.

If biology affords less room for reactive learning than other technologies, then governance must become more anticipatory. That points toward models that assess potential functional risk rather than relying solely on matches to previously identified sequences, and toward embedding stronger safeguards into the systems and supply chains through which synthesis capability is delivered, rather than relying primarily on voluntary compliance after access has already diffused.

 

The positive case for DIY biology

The expectations outlined above paint a concerning picture. But any discussion of governance must also reckon with what DIY biology has produced — and what would be lost if the response to risk were blanket restriction.

DIY biology really does democratise biological research

Community labs like Genspace in Brooklyn, BUGSS in Baltimore, and SoundBio Lab in Seattle provide BSL-1 lab space and hands-on training to students, independent researchers, artists, and citizen scientists who would otherwise have no access to wet-lab work. Genspace, the first community biotechnology laboratory when it opened in 2009, has become a launchpad for work that would not have happened within traditional institutions. Early participants went on to found companies — OpenTrons, an open-source laboratory robotics company, started at Genspace — create bioart exhibited in galleries, and compete in the international iGEM synthetic biology competition. These are not fringe operations. Community labs have become a genuine pathway into biological science for people outside the university system — broadening who gets to do biology and what questions get asked.

Open sharing has bred innovation and created publicly beneficial biological technologies

The open-source model that characterises much of DIY biology has enabled research with direct public health relevance. The Open Insulin Project, started in 2015 at Counter Culture Labs in Oakland by Anthony Di Franco, a type 1 diabetic, aims to develop open-source protocols for producing affordable insulin analogs — bypassing the small number of pharmaceutical companies that dominate the market. The project has genetically engineered microorganisms to produce both long-acting and short-acting insulin analogs, and the project estimates that roughly $10,000 in equipment could be enough to produce insulin for 10,000 people. It remains in the R&D phase and has not yet produced FDA-approvable insulin, but it illustrates a broader dynamic: the same openness that creates biosecurity concerns also enables collaborative research that patent-protected models often cannot. BUGSS, for instance, was able to advance its own insulin work specifically because Counter Culture Labs' experimental results were freely available online. The open-source ethos has also produced tools that benefit the wider scientific community — companies like OpenTrons and Bento Bioworks emerged from the DIYbio movement, making affordable laboratory instruments available to schools and researchers in resource-constrained settings.

DIY biology extends scientific capacity into the community

Community labs have also contributed to environmental monitoring and citizen science, applying synthetic biology tools to problems that traditional institutions have been slow to address. BUGSS iGEM teams — made up of high school students from across the Baltimore region — have developed biosensors for detecting PCB contamination, winning medals at the international iGEM competition. SoundBio Lab in Seattle has hosted community projects including soil pathogen research with local gardeners in the city's P-Patch programme (a network of over 90 community gardens donating 17 tons of produce to food banks annually), prion research through journal clubs and computational modelling, and yeast evolution studies open to participants with no advanced scientific background. These projects demonstrate that DIY biology can serve as a bridge between professional research and community needs, generating locally relevant knowledge that might not emerge from conventional academic priorities.

None of this negates the risks discussed earlier in this article. But it does establish that DIY biology is not merely a threat to be governed — it is also a source of innovation, education, and public benefit that governance must be designed to preserve. The policy challenge is not whether to permit or prohibit amateur biology, but how to maintain the openness that makes projects like Open Insulin possible while preventing the recklessness and misuse that the erosion of structural barriers increasingly allows.


Governance principles for a distributed biological landscape

So far, this article has argued three things: that the structural barriers commonly cited as reasons to dismiss DIY biorisk — tacit knowledge, wet-lab difficulty, and supply-side controls — are all weakening; that adjacent technologies, particularly AI, are compounding the erosion; and that the trajectory of other democratised technologies gives us reason to expect that biological capability will continue to diffuse beyond institutions. At the same time, DIY biology produces genuine public goods. How, then, should governance of biological technologies in general, and DIY biology in particular, proceed?

This article does not propose a specific regulatory framework. But the analysis above does point toward constraints that any serious governance approach will need to navigate. The foundational challenge is that existing biosecurity frameworks were designed for a world of institutional actors. That world is changing – and to some degree, has already changed. Governance, therefore, must adapt to distributed capability without relying on institutional affiliation as a proxy for trustworthiness. At the same time, it must do so without destroying the openness that makes beneficial DIY biology possible. With that tension as the backdrop, four more specific principles emerge from the analysis.

Governance must be cross-technology, not siloed by domain. The compounding dynamics described in this article — AI enabling sequence design and screening evasion, 3D printing enabling hardware production, drones enabling delivery — cut across traditional regulatory silos. Governing biological risk effectively will require mechanisms that can see across domains — and institutions structured to match.

Biology lacks adequate international coordination — and governance gaps in one jurisdiction carry consequences for all. Biological risks do not respect borders. As the regulatory comparison earlier in this article showed, DIY biorisk concentrates where regulation is lightest and infrastructure is deepest — currently the United States. But protocols developed in American community labs circulate globally, and an accidental or deliberate release in one jurisdiction can cross borders before anyone knows it has occurred. Unilateral national regulation is necessary but insufficient.

Yet the international governance landscape for biosecurity remains fragmented and poorly matched to the current threat. Notably, the WHO's 2022 Global Guidance Framework highlighted DNA synthesis as a critical example of gaps in governance of biological risks. Some degree of binding international coordination — at minimum on synthesis screening standards and benchtop device oversight — is needed to prevent regulatory arbitrage from undermining whatever domestic safeguards individual countries put in place.

Biosecurity needs a technical layer, not just a regulatory one. As this article has argued, screening and oversight mechanisms that depend on voluntary compliance, institutional norms, or list-based sequence matching are being outpaced by the tools they are meant to govern. A complementary approach is to build biosecurity safeguards directly into the technologies themselves. There is precedent for this: modern printers and photocopiers contain pattern-recognition systems that detect and refuse to reproduce banknote designs — a technical control embedded at the hardware level, independent of the user's intent or institutional affiliation. An analogous approach for benchtop DNA synthesisers — devices that detect and flag potentially hazardous sequences before synthesis proceeds — would address one of the most significant governance gaps identified in this article: the absence of any oversight mechanism after a device is purchased. 
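To make the limitation of list-based matching concrete, here is a toy sketch of a static-list screener. All sequences, names, and the fragmentation tactic are invented for illustration; real screening systems are far more sophisticated, but the same structural weakness applies: anything not on the list, including trivially obfuscated variants of listed sequences, passes.

```python
# Toy illustration (hypothetical sequences): why exact matching against a
# static list of "sequences of concern" is easy to evade.

FLAGGED_SEQUENCES = {
    "ATGCGTACGTTAGC",  # made-up entry standing in for a listed hazard
}


def list_based_screen(order: str) -> bool:
    """Return True if the ordered sequence exactly matches a flagged entry."""
    return order.upper() in FLAGGED_SEQUENCES


def fragment_order(sequence: str, size: int) -> list:
    """Split an order into fragments, as an evader might across orders or devices."""
    return [sequence[i:i + size] for i in range(0, len(sequence), size)]


known = "ATGCGTACGTTAGC"
assert list_based_screen(known)          # exact match: caught

# Trivial evasions the static list cannot see:
assert not list_based_screen(known[:7])  # first fragment passes
assert not list_based_screen(known[7:])  # second fragment passes
assert all(not list_based_screen(f) for f in fragment_order(known, 5))
```

The point of the sketch is structural, not biological: a list can only recognise what is already on it, which is why the article argues for safeguards embedded in the device itself and, later, for screening approaches that generalise beyond known entries.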

More broadly, the emerging field of technical AI safety offers a model worth studying. Researchers in that field have invested heavily in developing alignment techniques, capability evaluations, and monitoring tools that are built into AI systems rather than layered on top of them after deployment. Biosecurity could benefit from a similar shift in orientation: from governing the behaviour of users to building safety properties into the tools and platforms through which biological capability is accessed and exercised.

Governance must proactively identify and preserve chokepoints — not wait for them to erode. A recurring theme throughout this article is the weakening of existing chokepoints: the tacit knowledge gap is narrowing, supply-side screening is in regulatory limbo, and benchtop devices are bypassing commercial synthesis providers altogether. The pattern suggests that a reactive posture — waiting for a chokepoint to fail before seeking alternatives — is structurally inadequate for biological risk. Governance should instead be conducting forward-looking assessments of which chokepoints remain viable, which are weakening, and where new ones might be established. For instance, if commercial DNA synthesis screening becomes less effective as synthesis decentralises, where does the next viable intervention point lie? Possible candidates include the digital distribution layer (sequence databases, protocol repositories), the device layer (benchtop synthesiser software and firmware), or the supply chain for key reagents and biological materials. Identifying and investing in these chokepoints before the current ones fail entirely is preferable to the alternative: discovering, after a serious incident, that no effective intervention point remains.

AI should be deployed for defensive biosecurity. This article has focused primarily on AI as a risk amplifier. But AI is also the most promising tool available for strengthening biological defence. As argued earlier, existing screening approaches rely on matching sequences against databases of known threats — a model that misses novel sequences and is increasingly vulnerable to AI-enabled evasion. AI itself can help close this gap: rather than matching against static lists, AI-enabled tools can learn features associated with hazardous function, allowing them to flag both known and novel threats. The overarching policy implication is that governance of AI in biology should not be purely restrictive. If AI is making offensive biological capability more accessible, the response should include investing heavily in AI-enabled detection, surveillance, and countermeasure development — ensuring that the defensive applications of AI keep pace with, or outrun, the offensive ones.
