Update: Most information presented here is out of date. See the 80,000 Hours page for more up-to-date information.
I have been researching the Wuhan Coronavirus for several hours today, and I have come to the tentative conclusion that the situation is worse than I initially thought.
Given my current understanding, it now seems reasonable to assign a non-negligible probability (>2%) to the proposition that the current outbreak will result in a global disaster (>50 million deaths resulting from the pathogen within 1 year). I understand this prediction will sound alarmist, but in this post I will outline some of the reasons why I have come to this conclusion.
I now believe it is warranted for effective altruists to take specific actions to prepare for a resulting pandemic. The most effective action is likely researching how to prepare so as to limit exposure to sources of the virus. Sending evidence-based warnings to at-risk communities may also be effective at limiting the spread of the pathogen.
Summary of my reasons for believing that this outbreak could result in a global disaster
- The current outbreak matches the criteria that scientists have identified as characteristic of a pandemic-induced global disaster. That is, it’s a disease that is contagious during a long incubation period, has a high infection rate, has no known treatment, has a low but significant mortality rate, and few people are immune to it. See this article for a summary of likely characteristics of a pandemic-induced global disaster.
- Based on my research, I wasn't able to identify any historically recent pathogen with these characteristics, which gives me reason to believe that using an outside view to argue against alarmism may not be warranted. For reference, the 2003 SARS outbreak, the 2009 swine flu, and the several Ebola outbreaks do not match the profile of a pandemic-induced global disaster as closely as the current outbreak does.
- Estimates of the mortality rate vary, but one media source says, "While the single figures of deaths in early January seemed reassuring, the death toll has now climbed to above 3 percent." This would put it roughly on par with the mortality rate of the 1918 flu pandemic, and over 10 times more deadly than a normal seasonal flu. It’s worth noting, however, that the 1918 flu pandemic killed mostly young adults, whereas the pattern for this pathogen appears to be the opposite (which is normal for pathogens).
- The incubation period (the period during which symptoms are not present but those infected can still infect others) could be as long as 14 days, according to many sources.
- An Imperial College London report stated, "Self-sustaining human-to-human transmission of the novel coronavirus (2019-nCov) is the only plausible explanation of the scale of the outbreak in Wuhan. We estimate that, on average, each case infected 2.6 (uncertainty range: 1.5-3.5) other people up to 18th January 2020, based on an analysis combining our past estimates of the size of the outbreak in Wuhan with computational modelling of potential epidemic trajectories. This implies that control measures need to block well over 60% of transmission to be effective in controlling the outbreak."
- Compare the above infection rate to the H1N1 virus, which some estimate to have infected 10-20% of the world population in 2009. The World Health Organization has said, "The pandemic (H1N1) 2009 influenza virus has a R0 of 1.2 to 1.6 (Fraser, 2009) which makes controlling its spread easier than viruses with higher transmissibility."
- A simple regression model indicates that the growth rate of the pathogen is predictable and extremely rapid.
- The number of cases reported by the National Health Commission of China forms the basis of my regression model (you can currently find the number of cases reported in graphical format on the Wikipedia page here). An exponential regression model fit to the data reveals that the equation 38.7 * e^(0.389 * (t+11)) strongly retrodicts the number of cases (where t is the number of days since January 26th). In this model, growth is extremely rapid, with cases doubling roughly every 1.8 days (see the sketch after this list).
- [Update: Growth for January 27th remained roughly in line with the predicted growth from the exponential regression model. The new equation is 35.5 * e^(0.401 * t), where t is the number of days since January 15th.]
- A top expert has estimated that approximately 100,000 people have already been infected, which is far more than the confirmed count of 2,808 (as of January 26th). If the true number is that high, the pathogen has likely already spread beyond the quarantine zone. The infection has also been detected in 12 countries besides China, which supports this point.
- The Metaculus community’s estimate for the number of total cases in 2020 is much higher than it was just two or three days ago. Compare this older question here, versus this new question (when it opens).
- While several organizations are developing a vaccine, Wikipedia seems to indicate that it will take months before vaccines even enter trials, and we should expect that it will take about a year before a vaccine comes out.
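For concreteness, here is a minimal sketch of what the fitted curve above implies. It simply re-evaluates the post's stated equation rather than re-fitting the underlying case data, and the R0 figure is the Imperial College estimate quoted above:

```python
import numpy as np

# Updated fit quoted above: cases(t) ≈ 35.5 * e^(0.401 * t), t = days since 15 Jan 2020
a, b = 35.5, 0.401

def predicted_cases(days_since_jan15):
    """Predicted confirmed-case count under the fitted exponential."""
    return a * np.exp(b * days_since_jan15)

doubling_time = np.log(2) / b                        # implied doubling time of the fit
print(f"doubling time ≈ {doubling_time:.1f} days")   # ≈ 1.7 days
print(f"predicted cases on 27 Jan ≈ {predicted_cases(12):.0f}")

# Sanity check on the Imperial College claim that control measures must block
# 'well over 60%' of transmission: the standard threshold is 1 - 1/R0.
r0 = 2.6
print(f"minimum fraction of transmission to block ≈ {1 - 1/r0:.0%}")  # ≈ 62%
```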
Summary of my recommendations
I think it's unlikely that EAs are in any special position to help stop the pandemic. However, we can guard ourselves against the pandemic by heeding early warnings, researching ways to limit our exposure to the virus, and using our platforms to warn those at risk.
The CDC has a page for preparing for disaster.
Currently, the pathogen appears to have a significant mortality rate but kills mainly older people, so the elderly are most at risk of dying.
Even if you contract the disease and don't die, the symptoms are likely to be severe. One source says,
ARDS (acute respiratory distress syndrome) is a common complication. Between 25 and 32 percent of cases are admitted to the intensive care unit (ICU) for mechanical ventilation and sometimes ECMO (pumping blood through an artificial lung for oxygenation).
Other complications include septic shock, acute kidney injury, and virus-induced cardiac injury. The extensive lung damage also sets the lung up for secondary bacterial pneumonia, which occurs in 10 percent of ICU admissions.
Acknowledgements: Dony Christie and Louis Francini helped gather sources and write this post.
This straightforwardly got the novel coronavirus (now "covid-19") on the radar of many EAs who were otherwise only vaguely aware of it, or thought it was another media panic, like bird flu.
The post also illustrates some of the key strengths and interests of effective altruism, like quantification, forecasting, and the ability to separate minor global events from bigger ones.
For a long time, I've believed in the importance of not being alarmist. My immediate reaction to almost anybody who warns me of impending doom is: "I doubt it". And sometimes, "Do you want to bet?"
So, writing this post was a very difficult thing for me to do. On an object level, I realized that the evidence coming out of Wuhan looked very concerning. The more I looked into it, the more I thought, "This really seems like something someone should be ringing the alarm bells about." But for a while, very few people were predicting anything big on respectable forums (Travis Fisher, on Metaculus, being an exception), so I stayed silent.
At some point, the evidence became overwhelming. It seemed very clear that this virus wasn't going to be contained, and it was going to go global. I credit Dony Christie and Louis Francini with interrupting me from my dogmatic slumber. They were able to convince me, in the vein of Eliezer Yudkowsky's Inadequate Equilibria, that the reason why no one was talking about this probably had nothing whatsoever to do with the actual evidence. It wasn't that people had a model and used that model to predict "no doom" with high confidence: it was a case of peo... (read more)
Note: The relevant Metaculus question for this forecast also currently puts ~2% odds on this level of catastrophe.
This is the boring take, but it's worth noting that conditional on this spreading widely, perhaps the most important thing to do is mitigating health impacts on yourself, not preventing transmission. And that means staying healthy in general, perhaps especially regarding cardiovascular health - a good investment regardless of the disease, but worth re-highlighting.
I'm not a doctor, but I do work in public health. Based on my understanding of the issues involved, if you want to take actions now to minimize severity later if infected, my recommendations are:
And for preventing transmission, I know it seems obvious, but you need to actually wash your hands. Also, it may seem weird, but studies indicate that brushing your teeth seems to help reduce infection rates.
And covering your mouth with a breathing mask may be helpful, as long as you're not, say, touching food with hands that haven't been washed recently and then eating. Also, even if there were no coronavirus, wash your hands before eating in general. Very few people are good about doing this, but it will help.
Thanks for this. I found this article on how to personally prevent its spread helpful: https://foreignpolicy.com/2020/01/25/wuhan-coronavirus-safety-china/
I'm willing to bet up to $100 at even odds that by the end of 2020, the confirmed death toll by the Wuhan Coronavirus (2019-nCoV) will not be over 10,000. Is anyone willing to take the bet?
Incubation period and Chinese government coverup efforts are relevant to this question, but roughly speaking, if the actual number of infections is ~35x the reported number and there's no uptick in mysterious deaths in hospitals, then the actual mortality rate is ~1/35 of the reported number, more in line with a normal flu than the 1918 Spanish flu.
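To make the arithmetic in the comment above explicit (a rough sketch; the 3% figure is the reported rate cited earlier in the thread, and the 35x multiplier is the commenter's assumption, not measured data):

```python
reported_cfr = 0.03      # ~3% deaths per confirmed case, as cited earlier in the thread
underreporting = 35      # assumption from the comment above: true infections ≈ 35x confirmed
implied_ifr = reported_cfr / underreporting
print(f"implied infection fatality rate ≈ {implied_ifr:.2%}")  # ≈ 0.09%, near seasonal-flu levels
```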
Current death rates are likely to underestimate the total mortality rate, since the disease has likely not yet run its course in most of the people who are currently infected.
I'll add information about incubation period to the post.
It should be noted that the oft-cited case-fatality ratio of 2.5% for the 1918 flu might be inaccurate, and the true CFR could be closer to 10%: https://rybicki.blog/2018/04/11/1918-influenza-pandemic-case-fatality-rate/?fbclid=IwAR3SYYuiERormJxeFZ5Mx2X_00QRP9xkdBktfmzJmc8KR-iqpbK8tGlNqtQ
EDIT: Also see this twitter thread: https://twitter.com/ferrisjabr/status/1232052631826100224
Howie and I just recorded a 1h15m conversation going through what we do and don't know about nCoV for the 80,000 Hours Podcast.
We've also compiled a bunch of links to the best resources on the topic that we're aware of which you can get on this page.
https://www.worldometers.info/coronavirus/
I find the analysis from this link very interesting. It suggests that R0 is higher than initially estimated, at 3-4 (rather than the WHO's 1.4-2.5), but that the national China mortality rate drops to 0.3% if the province of Hubei is excluded (the reported mortality rate of Wuhan alone is 5.5%). This would be consistent with the theory that the number of cases is underreported in Wuhan, due to a shortage of testing capacity and perhaps deliberate underreporting. A recent Lancet report by Professor Gabriel Leung from the University of Hong Kong https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(20)30260-9/fulltext estimates 76,000 cases in Wuhan as of Jan 25th based on an R0 of 2.68, more than 30x the reported figure, which would put the mortality rate well under 0.5%.
This suggests the pandemic could be more difficult to control than expected, but that the mortality rate is also much lower (perhaps in the region of 3x flu).
This may also mean the main damage could come through economic impact.
I am an EA living in China right now. Thanks for sharing this post on the Coronavirus. I am also very interested in these questions.
We do not know what percentage of people experience symptoms so mild that they do not seek medical attention and so do not appear in the 'Suspected' or 'Confirmed' case statistics.
Case statistics are updated here: https://ncov.dxy.cn/ncovh5/view/pneumonia
and here: https://www.worldometers.info/coronavirus/
Here’s Google Translate if you need it: https://translate.google.com/
Nevertheless, as you mentioned, attempts have been made to model the spread of the infection and to estimate the number of people carrying the virus so far.
I have created a very simple spreadsheet with three scenarios here:
https://docs.google.com/spreadsheets/d/1qSNLQC5BpA-Gah0INyolFpPvTAGFphKUwTxQZN7FWI4/edit?usp=sharing
The Red scenario = 50% of infections go undiagnosed (unrecorded).
The Yellow scenario = 70% of infections go undiagnosed (unrecorded).
The Green scenario = 85% of infections go undiagnosed (unrecorded).
Each scenario has different estimates of the ‘real’ number of infections and percentages that progress to either a serious/critical condition or to de... (read more)
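As a rough illustration of the scenario arithmetic (my reading of the spreadsheet's setup; the confirmed-case figure below is a placeholder, not a value from the sheet):

```python
confirmed_cases = 4_515                                    # placeholder confirmed-case count
scenarios = {"Red": 0.50, "Yellow": 0.70, "Green": 0.85}   # fraction of infections undiagnosed

for name, undiagnosed in scenarios.items():
    real_infections = confirmed_cases / (1 - undiagnosed)
    print(f"{name}: ~{real_infections:,.0f} estimated real infections")
```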
Thanks for the article. One thing I'm wondering about, which has implications for the large-scale pandemic case, is how much equipment for "mechanical ventilation and sometimes ECMO (pumping blood through an artificial lung for oxygenation)" society has, and what the consequences of not having access to such equipment are. Would such people die? In that case, the fatality rate would grow massively, to something like 25 to 32%.
Whether there is enough equipment would depend on how many people get sick at once, whether more than one person can use the same equipment in an interleaved fashion, how long each sick person needs the equipment, whether there are good alternatives to the equipment, and how quickly additional equipment could be built or improvised.
So the case I'd be worried about here would be a very quick spread where you need rare expensive equipment to keep the fatality rate down where it is currently.
A study published today attempts to estimate the nCoV incubation period:
As the authors note, this estimate indicates an incubation period remarkably similar to that of the Middle East respiratory syndrome.
It's now been more than two weeks since infected people were first diagnosed in countries like Thailand, but there's no outbreak in Thailand. There have so far been 14 cases in Thailand, all brought in directly from China rather than arising from person-to-person infection within Thailand.
That makes me feel more skeptical that this will become a worldwide pandemic.
FYI: a study of outcomes as of Jan 25 for all 99 2019-nCoV patients admitted to a hospital in Wuhan between Jan 1 and Jan 20.
Many caveats apply. Only includes confirmed cases, not suspected ones. People who end up at a hospital are selected for being more severely ill. 60% of the patients have not yet been discharged so haven't experienced the full progression of the disease. Etc.
https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(20)30211-7/fulltext#%20
I wonder what sort of Fermi calculation we should apply to this? My quick (quite possibly wrong) numbers are:
=> P(death of a randomly selected person from it) = ~1/300
What are your thoughts?
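The intermediate inputs aren't shown above, but the general shape of this kind of Fermi estimate is a product of a few probabilities. The sketch below uses illustrative placeholder values (the pandemic probability matches the ~1/3 credence discussed further down the thread, and the fatality rate matches the ~3% reported figure; the infection probability is purely an assumption):

```python
# Illustrative structure only - these are NOT necessarily the commenter's actual inputs.
p_pandemic = 1/3                  # P(outbreak becomes a global pandemic)
p_infected_given_pandemic = 1/3   # P(a randomly selected person is infected | pandemic) - assumption
p_death_given_infected = 0.03     # roughly the reported case fatality ratio

p_death = p_pandemic * p_infected_given_pandemic * p_death_given_infected
print(f"P(death of a randomly selected person) ≈ 1/{1/p_death:.0f}")  # ≈ 1/300
```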
Updating the Fermi calculation somewhat:
=> P(death of a randomly selected person from it) = ~1/67
I'm not entirely sure what to think of the numbers; I cannot deny the logic, but it's pretty grim, and I hope I'm missing some critical details, my intuitions are wrong, or unknown unknowns make things more favorable.
Hopefully future updates and information will resolve some of the uncertainties here and make the numbers less grim. One large uncertainty is how the virus will evolve over time.
Hmm, interesting. This goes strongly against my intuitions. In case of interest, I'd be happy to give you 5:1 odds that this Fermi estimate is at least an order of magnitude too severe (for a small stake of up to £500 on my end, £100 on yours). Resolved in your favour if 1 year from now the fatalities are >1/670 (or 11.6M based on current world population); in my favour if <1/670.
(Happy to discuss/modify/clarify terms of above.)
Edit: We have since amended the terms to 10:1 (50GBP of Justin's to 500GBP of mine).
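A quick check of the resolution threshold (the world population figure is an assumption here; the rest follows from the numbers in the bet):

```python
world_population = 7.8e9          # assumed current world population
justin_estimate = 1 / 67          # Justin's updated Fermi estimate above
threshold = justin_estimate / 10  # "at least an order of magnitude too severe" -> 1/670
print(f"resolution threshold ≈ {world_population * threshold / 1e6:.1f} million deaths")  # ≈ 11.6M
```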
Hmm... I will take you up on a bet at those odds and with those resolution criteria. Let's make it 50 GBP of mine vs 250 GBP of yours. Agreed?
I hope you win the bet!
(note: I generally think it is good for the group epistemic process for people to take bets on their beliefs but am not entirely certain about that.)
Agreed, thank you Justin. (I also hope I win the bet, and not for the money - while it is good to consider the possibility of the most severe plausible outcomes rigorously and soberly, it would be terrible if it came about in reality). Bet resolves 28 January 2021. (Though if it's within an order of magnitude of the win criterion and there is uncertainty re: fatalities, I'm happy to reserve final decision for 2 further years until a rigorous analysis is done - e.g. see the swine flu epidemiology studies, which updated fatalities upwards significantly several years after the outbreak.)
To anyone else reading. I'm happy to provide up to a £250 GBP stake against up to £50 of yours, if you want to take the same side as Justin.
The bet is on.
Strong kudos for betting. Your estimates seem quite off to me but I really admire you putting them to the test. I hope, for the sake of the world, that you are wrong.
Though it's interesting to note that Justin's Fermi estimate is not far off how one of Johns Hopkins' CHS scenarios played out (coronavirus, animal origin, 65m deaths worldwide).
http://www.centerforhealthsecurity.org/event201/scenario.html
Note: this was NOT a prediction (and had some key differences including higher mortality associated with their hypothetical virus, and significant international containment failure beyond that seen to date with nCov)
http://www.centerforhealthsecurity.org/newsroom/center-news/2020-01-24-Statement-of-Clarification-Event201.html
Hmm. You're betting based on whether the fatalities exceed the mean of Justin's implied prior, but the prior is really heavy-tailed, so it's not actually clear that your bet is positive EV for him. (e.g., "1:1 odds that you're off by an order of magnitude" would be a terrible bet for Justin, because he has 2/3 credence that there will be no pandemic at all).
Justin's distribution for P(a particular person gets it | it goes world-scale pandemic) should also be heavy-tailed, since the spread of infections is a preferential attachment process. If (roughly, I think) the median of this distribution is 1/10 of the mean, then this bet is negative EV for Justin despite seeming generous.
In the future you could avoid this trickiness by writing a contract whose payoff is proportional to the number of deaths, rather than binary :)
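A small simulation can illustrate the point. The lognormal below is purely an assumed stand-in for a heavy-tailed belief, not anyone's actual prior; it just shows that the probability of exceeding the mean of such a distribution can be far below 50%, which is what can make a binary bet thresholded near the mean negative EV for the heavy-tailed believer even at apparently generous odds:

```python
import numpy as np

rng = np.random.default_rng(0)

p_pandemic = 1/3                                # credence that a pandemic happens at all
# Assumed heavy-tailed belief over deaths conditional on a pandemic (illustrative only)
deaths_given_pandemic = rng.lognormal(mean=np.log(2e6), sigma=2.0, size=1_000_000)

mean_deaths = p_pandemic * deaths_given_pandemic.mean()        # unconditional expected deaths
p_exceed_mean = p_pandemic * (deaths_given_pandemic > mean_deaths).mean()

print(f"implied expected deaths ≈ {mean_deaths/1e6:.1f} million")
print(f"P(actual deaths exceed that expectation) ≈ {p_exceed_mean:.2f}")  # well below 0.5
```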
I respect that you are putting money behind your estimates and get the idea behind it, but I would recommend you reconsider whether you want to do this (publicly) in this context, and maybe consider removing these comments. Not only because it looks quite bad from the outside, but also because I'm not sure it's appropriate on a forum about how to do good, especially if the virus should happen to kill a lot of people over the next year (which would also mean that even more people will have lost someone to the virus). I personally found this quite morbid, and I have a lot more context on EA culture than a random person reading this; e.g. I can guess that the primary motivation is not "making money" or "the feeling of winning and being right" - which would be quite inappropriate in this context - but that might not be clear to others with less context.
(Maybe I'm also the only one having this reaction in which case it's probably not so problematic)
edit: I can understand if people just disagree with me because they think there's no harm done by such bets. But I'd be curious to hear from the people who downvoted whether, in addition to that, you think that comments like mine are harmful because they are bad for epistemic habits or something - I'd be grateful to hear if someone thinks comments like these shouldn't be made!
I have downvoted this, here are my reasons:
Pretty straightforwardly, I think having correct beliefs about situations like this is exceptionally important, and maybe the central tenet this community is oriented around. Having a culture of betting on those beliefs is one of the primary ways in which we incentivize people to have accurate beliefs in situations like this.
I think doing so publicly is a major public good, and is helping many others think more sanely about this situation. I think the PR risk that comes with this is completely dwarfed by that consideration. I would be deeply saddened to see people avoid taking these bets publicly, since I benefit a lot from seeing people's beliefs put to the test this way, and I am confident many others do too.
Obviously, providing your personal perspective is fine, but I don't think I want to see more comments like this, and as such I downvoted it. I think a forum that had many comments like this would be a forum I would not want to participate in, and I expect it to directly discourage others from contributing in ways I think are really important and productive (for example, it seems to have caused Sean below to seriously con... (read more)
I emphatically object to this position (and agree with Chi's). As best as I can tell, Chi's comment is more accurate and better argued than this critique, and so the relative karma between the two dismays me.
I think it is fairly obvious that 'betting on how many people are going to die' looks ghoulish to commonsense morality. I think the articulation of why this would be objectionable is only slightly less obvious: the party on the 'worse side' of the bet seems to be deliberately situating themselves to be rewarded as a consequence of the misery others suffer; there would also be suspicion about whether the person might try to contribute to the bad situation seeking a pay-off; and perhaps a sense that one belittles the moral gravity of the situation by using it for prop betting.
Thus I'm confident that if we ran some survey confronting the 'person on the street' with the idea of people making this sort of bet, they would not think "wow, isn't it great they're willing to put their own money behind their convictions", but something much more adverse, along the lines of "holding a sweepstake on how many die".
(I can't find an ... (read more)
I am confused. Both of these are environments in which people participate in something very similar to betting. In the first case they are competing pretty directly for internet points, and in the second they are competing for monetary prices.
Those two institutions strike me as great examples of the benefit of having a culture of betting like this, and also strike me as similarly likely to create offense in others.
We seem to agree on the value of those platforms, and both their public perception and their cultural effects seem highly analogous to the private betting case to me. You even explicitly say that you expect similar reactions to questions like the above being brought up on those platforms.
I agree with you that if there were only the occasional one-off bet on the forum being critiqued here, the epistemic cost would be minor. But I am confident that in a community whose relationship to betting were more analogous to how Chi's relationship to betting appears to be, we would have never actually built th... (read more)
This is directly counter to my experience of substantive and important EA conversation. All the topics I'm interested in are essentially morbid topics when viewed in passing by a 'person on the street'. Here are examples of such questions:
or kill <10,000 people with enough time for us to calibrate to the difficulty of the alignment problem, or will it be more sudden than that?
Like, sometimes I even just bet on ongoing death rates. Someone might say to me "The factory farming problem is very small of course" and I'll reply "I will take a bet with you, if you're so confident. You say what you think it is, ... (read more)
All of your examples seem much better than the index case I am arguing against. Commonsense morality attaches much less distaste to cases where those 'in peril' are not crisply identified (e.g. "how many will die in some pandemic in the future" is better than "how many will die in this particular outbreak", which is better than "will Alice, currently ill, live or die?"). It should also find bets on historical events (essentially) fine, as whatever good or ill implicit in these has already occurred.
Of course, I agree that your examples would be construed as to some degree morbid. But my recommendation wasn't "refrain from betting on any question where we can show the topic is to some degree morbid" (after all, betting on the GDP of a given country could be construed this way, given its large downstream impacts on welfare). It was to refrain in those cases where it appears very distasteful and for which there's no sufficient justification. As it seems I'm not expressing this balancing consideration well, I'll belabour it.
#
Say, God forbid, one of my friend's children has a life-limiting disease. On its fac... (read more)
At least from a common-sense morality perspective, this doesn't sit right with me. I do feel that it would be wrong for two people to get together to bet about some horrible tragedy -- "How many people will die in this genocide?" "Will troubled person X kill themselves this year?" etc. -- purely because they thought it'd be fun to win a bet and make some money off a friend. I definitely wouldn't feel comfort... (read more)
Responding to this point separately: I am very confused by this statement. A large fraction of the topics we are discussing within the EA community are pretty directly about the death of thousands, often millions or billions, of other people. From biorisk (as discussed here), to global health and development, to the risk of major international conflict, a lot of topics we think about involve people forming models that will quite directly require forecasting the potential impacts of various life-or-death decisions.
I expect bets about a large number of Global Catastrophic Risks to be of great importance, and to similarly be perceived as "ghoulish" as you describe here. Maybe you are describing a distinction that is more complicated than I am currently comprehending, but I at least would expect Chi and Greg to object to bets of the type "what is the expected number of people dying in self-driving car... (read more)
There might also be a confusion about what the purpose and impact of bets in our community is. While the number of bets being made is relatively small, the effect of having a broader betting culture is quite major, at least in my experience of interacting with the community.
More precisely, we have a pretty concrete norm that if someone makes a prediction or a public forecast, then it is usually valid (with some exceptions) to offer a bet with equal or better odds than the forecasted probability to the person making the forecast, and expect them to take you up on the bet. If the person does not take you up on the bet, this usually comes with some loss of status and reputation, and is usually (correctly, I would argue) interpreted as evidence that the forecast was not meant sincerely, or the person is trying to avoid public accountability in some other way. From what I can tell, this is exactly what happened here.
The effects of this norm (at least as I have perceived it) are large and strongly positive. From what I can tell, it is one of the norms that ensures the consistency of the models that our public intellectuals express, and when I interact with communities that do not have t... (read more)
While my read of your post is "there is the possibility that the aim could be interpreted this way", which I regard as fair, I feel I should state explicitly that 'fun and money' was not my aim (and, I strongly expect, not Justin's either), as I have not yet done so.
I think it's important to be as well-calibrated as reasonably possible on events of global significance. In particular, I've been seeing a lot of what appear to me to be poorly calibrated, alarmist statements, claims and musings on nCOV on social media, including from EAs, GCR researchers, Harvard epidemiologists, etc. I think these poorly calibrated/examined claims can result in substantial material harms to people, in terms of stoking up unnecessary public panic, confusing accurate assessment of the situation, and creating 'boy who cried wolf' effects for future events. I've spent a lot of time on social media trying to get people to tone down their more extreme statements re: nCOV.
(edit: I do not mean this to refer to Justin's fermi estimate, which was on the more severe end but had clearly reasoned and transparent thinking behind it; more a broad comment on concerns re: ... (read more)
Following Sean here I'll also describe my motivation for taking the bet.
After Sean suggested the bet, I felt as if I had to take him up on it for the group epistemic benefit; my hand was forced. Firstly, I wanted to get people to take nCoV seriously and to think thoroughly about it (for the present case and for modelling possible future pandemics) - from an inside-view model perspective, the numbers I was getting were quite worrisome. I felt that if I didn't take him up on the bet, people wouldn't take the issue as seriously, nor take explicitly modelling things themselves as seriously either. I was trying to socially counter what sometimes feels like a learned helplessness people have with respect to analyzing things or solving problems. Also, the EA community is especially clear-thinking, and I think a place like the EA Forum is a good medium for problem solving around things like nCoV.
Secondly, I generally think that holding people in some sense accountable for their belief statements is a good thing (up to some caveats); it improves the collective epistemic process. In general I prefer exchanging detailed models in discussion rather than vague intuitions mediated by ... (read more)
To clarify a bit, I'm not in general against people betting on morally serious issues. I think it's possible that this particular bet is also well-justified, since there's a chance some people reading the post and thread might actually be trying to make decisions about how to devote time/resources to the issue. Making the bet might also cause other people to feel more "on their toes" in the future, when making potentially ungrounded public predictions, if they now feel like there's a greater chance someone might challenge them. So there are potential upsides, which could outweigh the downsides raised.
At the same time, though, I do find certain kinds of bets discomforting and expect a pretty large portion of people (esp. people without much EA exposure) to feel discomforted too. I think that the cases where I'm most likely to feel uncomfortable would be ones where:
- The bet is about an ongoing, pretty concrete tragedy with non-hypothetical victims. One person "profits" if the victims become more numerous and suffer more.
- The people making the bet aren't, even pretty indirectly, in a position to influence the management of the tragedy or the dedication of resources to it. It does
Do you think the bet would be less objectionable if Justin was able to increase the number of deaths?
I do think the "purely" matters a good bit here. While I would go as far as to argue that even purely financial motivations are fine (and should be leveraged for the public good when possible), I think in as much as I understand your perspective, it becomes a lot less bad if people are only partially motivated by making money (or gaining status within their community).
As a concrete example, I think large fractions of academia are motivated by wanting a sense of legacy and prestige (this includes large fractions of epidemiology, which is highly relevant to this situation). Those motivations also feel not fully great to me, and I would feel worried about an academic system that tries to operate purely on those motivations. However, I would similarly expect an academic system that does not recognize those motivations at all, bans all expressions of those sentiments, and does not build systems that leverage them, to also fail quite disastrously.
I think in order to produce large-scale coordination, it is important to enable the leveraging of a large variety of motivations, while also keeping them in check by ensuring at least a minimum level of more aligned motivations (or some other external system that ensures partially aligned motivations still result in good outcomes).
I strongly disagree with this comment - I think that motivations matter and that betting with an appropriate respect for the people who have died is completely possible - but I am glad you stated your position explicitly. Comments like this make the Forum better.
I would similarly be curious to understand the level of downvoting of my comment offering to remove my comments in light of concerns raised and encouragement to consider doing so. This is by far the most downvoted comment I've ever had. This may just be an artefact of how my call for objections to removing my comments has manifested (I was anticipating posts stating an objection like Ben's and Habryka's, and for those to be upvoted if popular, but people may have simply expressed objection by downvoting the original offer). In that case that's fine.
Another possible explanation is an objection to me even making the offer in the first place. My steelman for this is that even the offer of self-censorship of certain practices in certain situations could be seen as coming at a very heavy cost to group epistemics. However from an individual-posting-to-forum perspective, this feels like an uncomfortable thing to be punished for. Posting possibly-controversial posts to a public forum has some unilateralist's curse elements to it: risk is distributed to the overall forum, and the person who posts the possibly-controversial thing is likely to be someone who deems the... (read more)
I also strongly object. I think public betting is one of the most valuable aspects of our culture, and would be deeply saddened to see these comments disappear (and more broadly as an outside observer, seeing them disappear would make me deeply concerned about the epistemic health of our community, since that norm is one of the things that actually keeps members of our community accountable for their professed beliefs)
My take is that this has at this stage been resolved in favour of "editing for tone but keeping the bet posts". I have done the editing for tone. I am happy with this outcome, and I hope most others are too.
My own personal view is that public betting on beliefs is good - it's why I did it (both this time and in the past), and my preference is to continue doing so. However, my take is that the discussion highlighted that in certain circumstances around betting (such as predictions on events like an ongoing mass fatality event) it is worth being particularly careful about tone.
I strongly object to saying we're not allowed to bet on the most important questions - questions of life or death. That's like deciding to take the best person off the team defending the president. Don't handicap yourself when it matters most. This is the tool that stops us from just talking hot air and actually records which people are actually able to make correct predictions. These are some of the most important bets on the forum.
(Kind of just a nitpick)
I think I strongly agree with you on the value of being open to using betting in cases like these (at least in private, probably in public). And if you mean something like "Just in case anyone were to interpret Chi a certain way, I'd like to say that I strongly object to...", then I just fully agree with your comment.
But I think it's worth pointing out that no one said "we're not allowed to" do these bets - Chi's comment was just their personal view and recommendation, and had various hedges. At most it was saying "we shouldn't", which feels quite different from "we're not allowed to".
(Compare thinking that what someone is saying is racist and they really shouldn't have said it, vs actually taking away their platforms or preventing their speech - a much higher bar is needed for the latter.)
Personally, I don't see the bet itself as something that shouldn't have happened. I acknowledge that others could have the perspective Chi had, and can see why they would. But I didn't feel that way myself, and I personally think that downside is outweighed by the upside of it being good for the community's epistemics - and this is not just for Justin and Sean, but also for people reading the comments, so that they can come to more informed views based on the views the bettors take and how strongly they hold them. (Therefore, there's value in it being public, I think - I also therefore would personally suggest the comments shouldn't be deleted, but it's up to Sean.)
But I did feel really weird reading "Pleasure doing business Justin!". I didn't really feel uncomfortable with the rest of the upbeat tone Sean notes, but perhaps that should've been toned down too. That tone isn't necessary for the benefits of the bet - it could be civil and polite but also neutral or sombre - and could create reputational issues for EA. (Plus it's probably just good to have more respectful/taking-things-seriously norms in cases like these, without having to always calculate the consequences of such norm... (read more)
Fwiw, the "pleasure doing business" line was the only part of your tone that struck me as off when I read the thread.
I did some research on hand hygiene and wrote a quick summary on Facebook and LessWrong if anyone is interested. Not sure it's really appropriate for a top-level post on the EA Forum but I do think it's pretty useful to know. Most people (including me a few days ago) are very bad at washing their hands.
For a week or so I have been fearing this potentially deadly disease spreading to most people on Earth (space stations and Antarctic bases excepted), since the doubling time has been about half a week, and simple calculations show that even with a 1-week doubling time, half the Earth's population would get it by June. My fears were confirmed by reading of the Johns Hopkins Event 201 simulation last year, in which a virus with a 1-week doubling time spread throughout the world and killed tens of millions of people:
https://www.abc.net.au/news/2020-02-... (read more)
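A rough version of the "simple calculations" mentioned above (the starting case count is an assumption for illustration; the point is just how few doublings separate thousands of cases from billions):

```python
import math

current_cases = 5_000          # assumed starting point, order-of-magnitude only
target = 7.8e9 / 2             # half the Earth's population

doublings = math.log2(target / current_cases)          # ≈ 20 doublings needed
print(f"doublings needed ≈ {doublings:.1f}")
print(f"at a 1-week doubling time: ~{doublings:.0f} weeks (about 5 months)")
print(f"at a half-week doubling time: ~{doublings/2:.0f} weeks (about 2.5 months)")
```

Of course, unchecked exponential growth is an upper-bound assumption; susceptible depletion and control measures slow it.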
Some slightly positive evidence: By the 24th, 19 cases had been reported outside of China, with onset of symptoms usually before that. Given the most recent estimate of a mean incubation period of 5 days, it seems surprising that only 1 of the 19 cases has infected another person that we know of so far (a man traveling from Wuhan to Vietnam infected his son, who shared a hotel room with his father for 3 days). Since monitoring of the people the infected came into contact with is intensive, finding newly infected people should be fairly quick.
Seems that effective contain... (read more)
The possibility of a long incubation period (and especially a long-ish pre-symptomatic infectiousness period) is especially worrying to me, as my impression is that the absence of significant pre-symptomatic transmission was a key reason SARS didn't take off more than it did.
That said, I'm not sure it's clear yet that there is a long pre-symptomatic period. This article suggests we're not really sure about this yet. I'm expecting to get more information very soon, though.
Update: "A WHO panel of 16 independent experts twice last week declined to declare an international emergency over the outbreak.
"While more cases have been emerging outside China in people who have travelled from there recently, the WHO said only one of the overseas cases involved human-to-human transmission."
https://www.theguardian.com/science/live/2020/jan/28/coronavirus-first-death-in-beijing-as-us-issues-new-china-travel-warning-live-updates?page=with:block-5e2ff1a58f0811db2faec898#block-5e2ff1a58f0811db2faec898
How confident are you that it affects mainly older people or those with preexisting health conditions? Are the stats solid now? I vaguely recall that SARS and MERS (possibly the relevant reference class) were age-agnostic.
When comparing the novel coronavirus to the seasonal flu, it seems like the main differences are:
- the seasonal flu is typically around half as infectious (R0 typically 1.4 to 1.6)
- some strains of seasonal flu have a vaccine available; the coronavirus doesn't yet (although quick progress has apparently been made on the first steps)
But we believe that both seasonal flu and this coronavirus are similarly deadly, largely for the same segment of the population (older/at risk). Have I summarised this correctly?
I don't think this is a good summary for an important reason: I think the Wuhan Coronavirus is a few orders of magnitude more deadly than a normal seasonal flu. The mortality estimates for the Wuhan Coronavirus are in the single digit percentages, whereas this source tells me that the seasonal flu mortality rate is about 0.014%. [ETA: Sorry, it's closer to 0.1%, see Greg Colbourn's comment].
A better comparison would be to look at the death rate for those infected: ~0.1% for seasonal flu.
No, the case fatality rate isn't actually 3%; that's the rate based on identified cases, and it's always higher than the true rate.
The estimate of the swine flu fatality rate was ~0.5% in July 2009, with 100,000 cases reported. It ended up dropping by over an order of magnitude.
The opposite trend occurred for SARS (in the same class as 2019-nCoV), which originally had around a 2-5% deaths/cases rate but ended up with >10% once all cases ran their full course.
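To make the two opposing biases concrete, here is a toy illustration (all numbers are made up; they are not estimates for 2019-nCoV):

```python
confirmed = 4_000          # confirmed cases to date
deaths_so_far = 120        # deaths to date -> naive CFR = 3%
eventual_deaths = 300      # suppose more of the already-confirmed cases eventually die
true_infections = 40_000   # suppose only 10% of infections are ever confirmed

naive_cfr = deaths_so_far / confirmed        # 3.0%: the headline deaths/identified-cases figure
resolved_cfr = eventual_deaths / confirmed   # 7.5%: rises as open cases resolve (the SARS pattern)
ifr = eventual_deaths / true_infections      # 0.75%: falls once unidentified infections count (the swine flu pattern)

print(f"naive CFR {naive_cfr:.1%}, resolved CFR {resolved_cfr:.1%}, true IFR {ifr:.2%}")
```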
Note that there is now a Metaculus prize for questions and comments related to the coronavirus outbreak. Here you can see the existing questions in this series.
Some good background here: https://www.reddit.com/r/China_Flu/comments/exe552/coronavirus_faq_misconceptions_information_from_a (a)
I'm checking all stats nearly every hour at https://www.coronavirus-symptoms.info
Would love for someone to poke this & assess its epistemics: Coronavirus Contains "HIV Insertions", Stoking Fears Over Artificially Created Bioweapon (a)
I'm more curious about the trustworthiness of the scary graphs than about the claims that it may have been bioengineered.
Any thoughts on why the estimates here are so much higher than Metaculus? Here they seem to range between 10 and 100 million, whilst the current Metaculus median is 100k.
Maybe I've missed something.
How are you including age in this regression? It seems to me that ARDS resulting in ICU admission is a candidate for confounding with age.