[ Question ]

[Coronavirus] Is it a good idea to meet people indoors if everyone's rapid antigen test came back negative?

by Jonas Vollmer · 2 min read · 24th Mar 2021 · 10 comments


COVID-19 pandemic
Personal Blog

Does anyone know how risky it is to meet someone indoors without masks if you both take a rapid antigen Covid test and it comes back negative?

Rapid antigen tests are available at retail in some countries and tend to be pretty cheap (€6 per test, sometimes free). They seem easily worth the cost if it means one can hang out in person indoors.

My current thinking:

  • The antigen test I looked at reports a sensitivity of 92% (95% CI: 83.63%-96.28%) and specificity of 99.26% (95% CI: 95.92%-99.87%)
  • I take the point estimates at face value (I don't really see a reason not to – but perhaps that's wrong?). They imply that a negative test reduces the odds of infection by a factor of about 12 (negative-test likelihood ratio (1 − 0.92)/0.9926 ≈ 0.08) – so if the prior (e.g. adjusted prevalence from microCOVID) is 0.4%, the posterior probability of that person having Covid is about 0.03%.
  • I do another arbitrary 1.5x adjustment because they don't have symptoms, so the posterior probability goes down to 0.02%.
  • I assume (as per microCOVID) that the risk of an unmasked indoor hangout is 9% per hour, so 18% for 2h.
  • So meeting someone who tested negative indoors without a mask implies 18% × 0.02% = 36 microCOVIDs, which is very low.
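The steps above can be sketched in a few lines of Python (all inputs are the post's numbers; the prevalence and per-hour risk are microCOVID-based assumptions, and exact output differs slightly from 36 because the post rounds the posterior to 0.02% partway through):

```python
# Back-of-the-envelope Bayesian update for a negative rapid antigen test.

sensitivity = 0.92      # P(test positive | Covid), per the test's reported stats
specificity = 0.9926    # P(test negative | no Covid)

# Likelihood ratio of a *negative* result: P(test- | Covid) / P(test- | no Covid)
lr_negative = (1 - sensitivity) / specificity   # ~0.08, i.e. odds drop ~12x

prior = 0.004                                   # 0.4% adjusted prevalence (assumed)
prior_odds = prior / (1 - prior)
posterior_odds = prior_odds * lr_negative
posterior = posterior_odds / (1 + posterior_odds)   # ~0.032%

posterior_no_symptoms = posterior / 1.5         # the post's arbitrary symptom adjustment

transmission_risk = 0.09 * 2                    # 9%/hour unmasked indoors, 2 hours

microcovids = transmission_risk * posterior_no_symptoms * 1e6
print(round(microcovids))  # ~39; the post rounds the posterior to 0.02% and gets 36
```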

I would find it really useful to have an answer to this question, so I'm happy to reward good responses with $100 of personal money (up to $300 total).

 

Ways in which this could be wrong:

  • Perhaps some rapid antigen Covid tests are much more reliable than others. How much attention should I pay to the type of test?
    • People seem much more willing to take tests with a nasal swab that only goes 3cm deep rather than 10cm, as the latter tends to be painful for hours, whereas the former is not a big issue. But perhaps the more painful tests are better?
    • Are there differences between self-administered tests and tests taken at a testing center? The former are obviously much easier to take.
  • Maybe we should be worried about not being able to administer the test properly (e.g. if taking it outdoors, temperature might be too low?)
  • Maybe there are reasons to think sensitivity/specificity are overestimates (like given that only some tests make it to market, perhaps there's some optimizer's curse issue going on?)
  • Maybe the test tells you whether someone has Covid at the time of taking the test according to PCR, but they might become positive over the, say, 2h of meeting them, and maybe this issue is significant.
  • Someone suggested that tests tend to have higher sensitivity for highly infectious people, so the officially reported sensitivity might actually be too low for our purposes.



2 Answers

[Epistemic status: This is mostly hobbyist research that I did to evaluate which tests to buy for myself]

The numbers listed by the manufacturers are not very useful, sadly. These are generally provided without a standard protocol or independent evaluation, and can be assumed to be a best case scenario in a sample of symptomatic individuals. On the other hand, as you note, the sensitivity of antigen tests increases when infectiousness is high.

I am absolutely out of my depth trying to balance these two factors, but luckily an empirical study from the UK estimates, based on contact tracing data, that "The most and least sensitive LFDs [a type of rapid antigen test used in the UK] would detect 90.5% (95%CI 90.1-90.8%) and 83.7% (83.2-84.1%) of cases with PCR-positive contacts respectively." So, if a person tests negative but is still Covid-19 positive, you can assume the likelihood of infection to be 10-20% of an average Covid-19 contact.
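To make the quoted figures concrete, here is a trivial sketch of where the 10-20% range comes from (the two sensitivities are the study's point estimates; "10-20%" in the text is just a rounding of this):

```python
# Residual risk after a negative LFD, using the UK study's point estimates.
sens_best = 0.905    # most sensitive LFD: detects 90.5% of PCR-positive cases
sens_worst = 0.837   # least sensitive LFD: detects 83.7%

# A negative result misses this fraction of truly positive people, so the
# residual risk is roughly 10-16% of that of an average positive contact.
miss_best = 1 - sens_best     # ~0.095
miss_worst = 1 - sens_worst   # ~0.163
print(f"{miss_best:.1%} - {miss_worst:.1%}")  # 9.5% - 16.3%
```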

With regards to self vs. professional testing, there does not seem to be a very clear picture yet, but this German study suggests basically equivalent sensitivity.    

You should also make sure to buy tests that were independently evaluated; you can find lists of such tests here or here. The listed numbers are hard to compare across different studies and tests, but the one you mentioned seems to have good results relative to other tests.

I am honestly not sure how long the test results are valid, but 2 hours seems safe. I cannot comment on the other numbers provided by microCovid. 

PCR tests themselves aren't that sensitive, as mentioned by atlas. I've seen estimates (1, 2) of a ~20% false negative rate. Counterbalancing that, I assume people who have so little virus in their nose and throat that they evade a PCR test are at lower risk of spreading. But I would (very subjectively) record my microCOVIDs as 0.3x the raw numbers.

Thanks, this is helpful! Feel free to PM me your payment details so I can send you the $100 reward mentioned in the post.

[Another hobbyist here]

I agree with Tsunayoshi's answer.

Another thing to keep in mind is that even the best studies on rapid antigen tests usually compare against PCR tests; that is, if they agree with PCR tests in all cases, the sensitivity is reported as 100%. However, the sensitivity of PCR tests is (as far as I can tell) not 100%, and can vary a lot based on factors such as how the sample is collected and transported.

Here's an article on the issue. Key quote:

Whether a SARS-CoV-2 test detects clinical disease depends on biologic factors, pre-analytic factors, and analytic performance. Someone with a large amount of virus in their nose/throat will have a positive test with a nose/throat swab. However, someone with little to no virus in their nose or throat may have a negative test even if they have virus somewhere else (like the lungs). [...] If no virus is present at the site of collection, the collection fails to get virus in the sample, or the sample is severely degraded from storage or transport (for example baking in the sun on a car dash) then the test will be negative no matter how sensitive the test is.

Then there are studies like Kucirka et al., which a later paper summarizes via this graph of false negative rates in PCR tests:

The study concludes:

If clinical suspicion is high, infection should not be ruled out on the basis of RT-PCR alone, and the clinical and epidemiologic situation should be carefully considered.

I don't know how trustworthy the Kucirka et al study is, since the false negative rates reported are a lot worse than any I've seen elsewhere. But I think the upshot is that even "gold-standard" PCR testing is messy, and we shouldn't trust studies that estimate antigen-test sensitivity by comparison to PCR (or at least adjust for low PCR sensitivity).

A different conclusion that I think is reasonable is that RT-PCR tests are a good baseline given competent administration and possibly re-testing. I don't know enough about the mechanics of testing to evaluate whether a given study does well on this or not. 

6 comments

This tweet (in German) seems relevant.

And here's a related anecdote: This story might just be a fluke, but it suggests that people can test negative repeatedly shortly before superspreading.

  • Someone suggested that tests tend to have higher sensitivity for highly infectious people, so the officially reported prevalence might actually be too low for our purposes.

In the 2nd part of the sentence, did you mean to say "sensitivity" rather than "prevalence"?

Oops, yes, edited.

  • So meeting someone who tested negative indoors without a mask implies 18% × 0.02% = 36 microCOVIDs, which is very low.

I'd also be curious to know better estimates to this - have you updated your estimate after these two answers? Or did anyone else update their estimates?

(if you have time to reply, no worries if not)

Sensitivity is measured relative to PCR tests and tends to be reported over-optimistically. So unlike I suggested in the OP, I think the adjustment should probably be ~4x and not ~12x.
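Re-running the OP's arithmetic with that revised ~4x odds reduction (keeping the same 0.4% prior, 1.5x symptom adjustment, and 2h unmasked indoor risk, all of which are the OP's assumptions) looks like:

```python
# Revised estimate, assuming a negative test only divides the odds by ~4
# (instead of ~12), per the update in this comment.

prior = 0.004                   # 0.4% adjusted prevalence, as in the OP
prior_odds = prior / (1 - prior)
posterior_odds = prior_odds / 4
posterior = posterior_odds / (1 + posterior_odds)   # ~0.1%

posterior_no_symptoms = posterior / 1.5   # same arbitrary symptom adjustment
microcovids = 0.18 * posterior_no_symptoms * 1e6    # 2h unmasked indoors
print(round(microcovids))  # ~120, roughly 3x the OP's original estimate
```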
