We just published an interview: Alison Young on how top labs have jeopardised public health with repeated biosafety failures. Listen on Spotify or click through for other audio options, the transcript, and related links. Below are the episode summary and some key excerpts.

Episode summary

Rare events can still cause catastrophic accidents. The concern that has been raised by experts going back over time is that really, the more of these experiments, the more labs, the more opportunities there are for a rare event to occur — that the right pathogen is involved and infects somebody in one of these labs, or is released in some way from these labs.

And what I chronicle in Pandora’s Gamble is that there have been these previous outbreaks that have been associated with various kinds of lab accidents. So this is not a theoretical thing that can happen: it has happened in the past.

- Alison Young

In today’s episode, host Luisa Rodriguez interviews award-winning investigative journalist Alison Young on the surprising frequency of lab leaks and what needs to be done to prevent them in the future.

They cover:

  • The most egregious biosafety mistakes made by the CDC, and how Alison uncovered them through her investigative reporting
  • The Dugway Life Sciences Test Facility case, where live anthrax was accidentally sent to labs across the US and several other countries over a period of many years
  • The time the Soviets had a major anthrax leak, and then hid it for over a decade
  • The 1977 influenza pandemic caused by a vaccine trial gone wrong in China
  • The last death from smallpox, caused not by the virus spreading in the wild, but by a lab leak in the UK
  • Ways we could get more reliable oversight and accountability for these labs
  • And the investigative work Alison’s most proud of

Producer and editor: Keiran Harris
Audio Engineering Lead: Ben Cordell
Technical editing: Simon Monsour and Milo McGuire
Additional content editing: Katy Moore and Luisa Rodriguez
Transcriptions: Katy Moore

Highlights

The case of the found smallpox vials

Alison Young: Around the same time the CDC was having all kinds of incidents in 2014, in the middle of all of that, there was a cold storage room on the campus of the National Institutes of Health, just north of Washington DC, where they were moving around some old cardboard boxes. And they look inside and they see all of these little, tiny, very fragile vials from decades ago that are labelled in typewriter print with various pathogens’ names on them. And it’s powdered material. And as they’re going through these glass vials, they see some that are labelled as variola.

Luisa Rodriguez: Which, just to be totally clear, variola is the pathogen that causes smallpox. So go on: they found vials of smallpox in a box in a storage room?

Alison Young: Exactly. In an unlocked storage room. So this should have been incredibly concerning, because smallpox is incredibly deadly. It has been eradicated from the planet and smallpox virus is only supposed to be found under treaties in two labs in the world: one is in Russia, and the other is a specific lab on the campus at the Centers for Disease Control and Prevention in Atlanta. So these vials shouldn’t have been in this cold storage room at NIH.

What was also concerning was how they responded to it when they found these vials. Ultimately, it was one scientist, by themselves, who basically picked up the cardboard box and walked it down the corridors of this building at the NIH and across the street and into another building. All the while, they’re hearing this clink, clink of these fragile old vials hitting each other as they’re walking along.

The FBI report that I read of the incident criticised the scientist and just the whole handling of this box, because when it was properly catalogued, in the end, there was a vial that had broken inside this box — and once again, the world got lucky, and it was not smallpox virus, it was some sort of a tissue sample. But as the FBI report noted, had that been the freeze-dried smallpox specimen, there was nothing really protecting the person who was carrying it.

You would hope that everyone who is working around really very dangerous pathogens like smallpox, which should not get into other people’s hands, would handle them with the utmost care. Part of the concern that was raised is that you shouldn’t necessarily have one single person, by themselves, carrying a box that contains smallpox virus.

Luisa Rodriguez: And that’s because it’s one of the few pathogens that a single person could use as a bioweapon, basically?

Alison Young: Correct.

The Soviet anthrax leak

Alison Young: There was an accident at a lab that US intelligence believed was actually a bioweapons facility, but a lab nonetheless, and it was working with large quantities of anthrax. It appears that it spewed a giant plume of anthrax spores over a town. People downwind were sickened, animals were killed, and about 60 people in that case died. Initially, the authorities sought to claim that there was no airborne anthrax — that this was ultimately a result of anthrax food poisoning, possibly from black market meat or some sort of contaminated cattle feed or agricultural feed. And that was where things stood.

Over time, because it was such a huge and deadly outbreak, there was intense scientific community interest. And eventually, there was a group of scientists who invited officials from these former Soviet communities to come to the United States and give a presentation at the US National Academies of Sciences. There, they produced all kinds of slides and charts and told compelling stories of racing up into the mountains and how they were there to help save these people, and they showed all kinds of information that really was making the case that this was a foodborne anthrax outbreak. Coming out of that meeting, there are news clippings in The Washington Post and The New York Times and elsewhere where prominent US scientists say they’ve been incredibly transparent and they’ve made quite the case — it looks like this really was gastrointestinal anthrax, and not some sort of an airborne release.

Then it took many more years, until 1992, when then-Russian President Boris Yeltsin came out and made a very surprising statement in a Russian newspaper: that outbreak had in fact been the result of a military lab accident.

Luisa Rodriguez: So this case absolutely shocks me. One, it’s just horrific: 60 people died. Two, there was this extremely successful coverup by the Soviets, in large part because they were violating the Biological Weapons Convention and wanted to hide that. And then three, just bizarrely, Boris Yeltsin later admitted, unprompted, that this was caused by military bioweapons research.

But I wanted to talk about what happened after all of that, which was this joint effort by American and Russian scientists to find out exactly what happened. I just found this extremely moving. Can you explain what they did?

Alison Young: Yeah, it’s fascinating. Here were these Russian scientists who, at the time all of this occurred, were incredibly brave and basically hid away evidence to keep the KGB from taking it away. So they hid away their notes. They had samples from the people who died, and they kept them in jars — but they put them out in the open, almost hiding them in plain sight, so that they wouldn’t be confiscated. They had kept these for all of these years, and so, as the political situation changed in Russia, it became possible for them to actually disclose that they had this information.

And they did some remarkable investigations, where they even went and looked at other records that weren’t destroyed, such as who got compensated. They went to graveyards and looked at the death records. And ultimately, even some of the main US scientists who had been the biggest proponents of the idea that this was not some sort of a bioweapons lab, and who had absolutely believed the Soviet officials’ initial cover story that this was a meat problem, came around. Some of them even assisted with the Russians’ research showing that this was a huge anthrax plume, and that there was plenty of documentation for it.

And I think the thing that is so instructive is it took 15 years to get to that point from when the accident happened, and all of the years of coverup, and all of the years of many international scientists believing the cover story, to ultimately getting to the truth.

A culture of martyrdom

Alison Young: One of the challenges is the idea of establishing safety culture within organisations. Part of my book goes way back into the history of biological safety, and I spent a lot of time reading the papers of a man by the name of Dr Arnold Wedum, who is considered the father of modern biosafety. And part of the reason the book goes into depth about Arnold Wedum’s findings is that I think many of his concerns, about the lack of safety culture in microbiology and the difficulty of getting certain scientists to accept the importance of following safety protocols, reflect the same resistance to safety culture that he saw way back in the 1950s and that plays out today in these incidents.

Arnold Wedum talked quite a lot about this idea of being a martyr to science. Obviously, the people who went into microbiology over time are people who are very dedicated to the study of science, to trying to improve the lives of people around the planet.

One of the things that’s important to remember is that microbiology is a relatively new science. It’s a young science compared to chemistry and the radiological sciences, and Dr Wedum said that those scientists seemed much more open to scrutiny of their practices than those working in microbiology labs — who, for much of the history of microbiology, because there were not ways to keep them safe, were often catching the very diseases they were experimenting with. Dr Wedum also talked about how — again, this is back many years ago — some of these scientists took great pride in how many times they had become infected, because they were doing this for the greater good.

Luisa Rodriguez: I remember finding it striking in the book, reading about these cases where scientists, working before a bunch of better safety practices, would basically brag, as you said — like, “I’ve gotten TB four times already” — and it was almost a battle scar that they wore with pride.

Maybe the takeaway there is that this field comes from an initial foundation where getting these diseases was the norm, and even kind of a good thing. It’s like a badge of honour. So when you try to throw all these safety practices on top, people are resistant because they’re used to this; they don’t regard it as a terrible thing. And that’s part of what’s made making safety a norm a much harder problem. Does that sound right?

Alison Young: Some of that, I think, is very much the case. Also, there’s just not a culture of tracking these kinds of infections. There never has been a culture of that. To this day, there are no universal tracking systems for these kinds of illnesses in labs or accidents.

I think part of the challenge as well is that nobody likes having to do things that make it harder to do your job. And one of the realities is that the kinds of safety procedures and equipment that are required, depending on the pathogen, can make doing your work slower and more cumbersome. It can be more expensive. There may be limited access to certain kinds of equipment. All of those kinds of things — at least over time, in what Dr Wedum wrote about — created a culture where there were questions about whether any of it was necessary.

And that’s where that idea of the “martyr to science” culture comes from. So that was back in the ’50s, ’60s, and ’70s, when Dr Wedum was really writing about those kinds of things. Here we are in 2023: What is the culture inside individual labs? It’s hard to say, but you can see in incident after incident that there are individuals and institutions that are not paying the attention to safety that they should be.

No one wants to regulate biolabs

Alison Young: This is a topic I’ve now been covering for 15 years, and it’s important to know that going back at least 10 years, the US Government Accountability Office started issuing reports raising concern that as more of these kinds of biological research facilities are built and doing more experiments with more risky pathogens, there is an increase in the aggregate risk of a catastrophic accident. So I’ve been covering hearings in Congress going back over time, and back then, it was not one political party or another that was interested in this: it was a bipartisan concern.

And as I wrote Pandora’s Gamble, it was a huge reminder, as I went back and read through some of the transcripts of hearings that I’d sat in on as a reporter, to see both Democrats and Republicans asking really important questions about the policy issues of how we deal with the safety of these labs. There was a recognition of the importance of conducting biological research. I mean, we all need this — I don’t want it lost in any of this that the world has benefited greatly from the COVID-19 vaccines and from all kinds of work that these labs do. But we also need that work to be done safely. And how many labs do we actually need?

And Congress was holding hearings and looking at this stuff closely. There were pushes in the 2014–2015 timeframe — when I was writing about a bunch of accidents at the Centers for Disease Control and Prevention and at Dugway, as we’ve discussed — and there were even more hearings raising the question of whether there needed to be a single federal entity overseeing lab safety. And then it went nowhere. And that has played out over and over, over the years.

Part of it is that the organisations that operate labs don’t want more regulation: nobody wants more scrutiny, nobody wants more red tape. And the federal agencies that Congress and the public rely on to advise on what we need to do in these arenas all have potential conflicts of interest. Take the National Institutes of Health: it’s one of the largest funders of biomedical research in the world. It conducts its own research, and it is often funding the research at the very labs that are having the accidents that are of concern. You have the Centers for Disease Control and Prevention: it is one of the two primary regulators of the limited subset of these labs that are actually subject to any safety regulation, and the CDC’s own labs have had their own series of safety problems.

So it is something that every few years, at least in my coverage of it, you see interest in Congress and then it dies back down again. And now with COVID-19, obviously this is back in Congress and being discussed again, but the whole political climate in Washington has become so toxic that that is now adding a new layer to the whole debate.

Nobody is tracking how many biosafety level 3 labs there are

Alison Young: One of the things that just is so frustrating in this arena is that nobody is even tracking how many of these labs there are. One of the biggest surprises for me when I started covering this is that the US Government Accountability Office, which is the nonpartisan investigative arm of Congress, produced reports going back more than a decade ago that said even the US government doesn’t know how many biosafety level 3 labs there are.

Part of the issue here is that it is such a fragmented area. If you are a privately funded lab, and you’re not taking government money and you are not working with a select agent pathogen, the government may not really know that you exist as a lab. They may know piecemeal — like, you might have to have workers’ compensation, or you might have to have some OSHA things, or you might have to have a wastewater permit. But you don’t have a lab permit, and so there’s no chronicling of where all these labs are.

So one of the things we did when I was a reporter on USA Today’s national investigative team is we set out to see how many biosafety level 3 labs we could even identify. And it was incredibly difficult. We identified a couple hundred of these labs across the country, but what it took to do that was literally googling “biosafety level 3 lab” so we could find where places advertised it. Or we looked at government grant records where they mentioned that they were using a biosafety level 3 lab or a BSL-3 lab. Or we looked at LinkedIn, where people promoted the fact that they’d worked in these labs. But this was cobbling it together from an incredible number of records — and it’s something that you would think the government would know.

And that’s just in the United States. I have a Google alert that is set up for BSL-3 and BSL-4 labs, so I see the press releases that go out when various countries or various universities are announcing that they’re building a BSL-3 or a BSL-4 lab. But there is no one place that policymakers or the public can go to see where these labs are, or how many there are.

Luisa Rodriguez: How can we not be tracking those?

Alison Young: There just is no mechanism. There’s a case right now that has gotten some recent attention out in California: a biotechnology lab in Reedley, California, which was discovered when literally a code enforcement officer in this small city found that there was this lab, and they had 1,000 mice, they had -80°C freezers out there, they had all sorts of biological materials. And ultimately — I’ve been working on some reporting in this area — what the local officials have said is that because the lab was privately funded, didn’t receive any government grant money, and wasn’t obviously working with any select agent pathogens, the only way they were able to address it was to cobble together local code enforcement and other piecemeal regulations. There was no lab authority they could go to to address the biohazards of the facility.

And this issue has come up over and over, over the years, but it’s not one that policymakers have so far addressed. There has been a lot of talk, and it has been known for a long time that there are gaping holes in the oversight because of the fragmented nature of how we look at these biolabs.

There’s one other aspect of the proposed legislation that is worth pointing out: it includes a provision that asks for a biosecurity board in the US government to evaluate the effectiveness of the current Federal Select Agent Program in overseeing biorisks in this country. And it asks for proposals to, in its words, “harmonize” the various fragmented pieces — whether it’s the NIH Guidelines; the Select Agent Program; or the recommendations (but not regulations) of safety practices in something called the BMBL, basically the biosafety manual. But what’s interesting is how it is written: it sounds like harmonising while leaving in place the fragmented system of multiple agencies being responsible for this kind of work.
