willbradshaw's Comments

willbradshaw's Shortform

[CN: Death and suffering. Crossposted from Facebook.]

As the flow of coronavirus death stories in the UK has gradually increased over the past month, I've been trying to make some positive use of the identifiable victim effect: looking at the faces, feeling how sorry I am for those people and their families, and trying to generalise that empathy to the rest of the world.

So many people are already suffering and dying because of this virus. So many more will suffer and die around the world before this is over. The burden of this disease will be vast. The burden of all the things we're doing to reduce that first burden may be vaster still.

I will never know or even vaguely imagine most of the people this crisis will touch. Most of them will be far away from me, in developing countries with fragile healthcare systems and poor reporting. Most will be old, or sick, but even among the young and healthy the number of the dead and (possibly) disabled will be large. And they won't be gentle deaths, either, especially in the absence of modern medical care; there are worse ways to die, but there are also far better ones.

To all the victims of this disaster that I will never meet: I am so, so sorry this happened to you. I am so sorry we, collectively, were too late when you needed us, that we are still so exposed to the boundless viciousness of nature.

I am planning to spend most of my time over the next few months working on things other than the current emergency, because there are other, worse, future emergencies I want to help avert. And even now, in the midst of this crisis, there are still things in the world I suspect cause even more death and suffering than the coronavirus will. But that doesn't make what's happening now any less monstrous.

Why not give 90%?

I'd agree, but want to say slightly more about the appropriate attitude as well as the right action - something along the lines of non-destructive and non-aggrandising regret.

Out of interest, do you think this attitude for consequentialist reasons (e.g. such an attitude will lead to greater effort devoted towards self-improvement / pruning of not-actually-needed luxuries) or non-consequentialist ones (it's just inherently blameworthy to really want a sports car when children in Africa are starving)?

needing to sleep (even if she needs to sleep more than others) is not a blameworthy trait

It's not really clear to me why a need for sleep is not blameworthy while a psychological attachment to luxuries is. One need is universal, while the other is particular, but I'm not sure that matters per se? And even that distinction breaks down if you posit that Agape needs more sleep than other people.

I think you could make the claim that in reality there is a difference in your ability to affect your future self's attitude to luxuries (e.g. by incrementally weaning yourself off them, cultivating mindfulness, etc.), such that regret is more useful in one case than the other, but if we assume ex hypothesi that that isn't the case (Agape's desire for a sports car is deep-seated and unshakeable) then I'm not sure whence the difference in blameworthiness comes.

Should recent events make us more or less concerned about biorisk?

I think you're probably right that society is likely to respond by increasing our ability to respond to natural pandemics in various ways. There are a lot of great people who are now way more interested in pandemics than they were before.

(Come to think of it, putting some thought now into how to mobilise those forces to avert the next pandemic is probably warranted, since I think there's a pretty good chance all that energy dissipates without much to show for it within a few years of this pandemic ending.)

When it comes to biorisk as a whole, the picture is less clear (though my guess is still probably positive?). There does seem to be some danger that people neglect considerations around engineered pandemics (DURC, info hazards, etc.) in their rush to tackle natural pandemics. I think a lot of work done on the latter is still useful for preventing the former, but they don't always run in the same direction, and since engineered pandemics seem to be the greatest concern from a longtermist perspective, this could be a significant concern.

Should recent events make us more or less concerned about biorisk?

Is this a cunning scheme to ask private questions on the Forum, or is this actually going to go public at some point? :P

What are the key ongoing debates in EA?

Thinking about this more, I suspect a lot of people would agree that some more general statement, like "What important cause areas is EA missing out on?" is a key ongoing debate, while being sceptical about most specific claimants to that status (because if most people weren't sceptical, EA wouldn't be missing out on that cause area).

What are the key ongoing debates in EA?

I simultaneously have some sympathy for this view and think that people responding to this question by pushing their pet cause areas aren't engaging well with the question as I understand it.

For example, I think that anti-ageing research is probably significantly underrated by EAs in general and would happily push for it in a question like "what cause areas are underrated by EAs", but I would not (and have not) referenced it here as a "key ongoing debate in EA", because I recognise that many people who aren't already convinced wouldn't consider it such.

So one criterion I might use would be whether disputants on both sides would consider the debate to be key.

I also agree with point (2) of Khorton's response to this.

Toby Ord’s ‘The Precipice’ is published!

I'd be fairly surprised if the answer wasn't "we dropped the footnotes" since this is almost always the answer. If that is not the answer I'd also be curious about how it was managed.

What are the key ongoing debates in EA?

I suspect this may be evidence in itself that this is not currently a key ongoing debate in EA.

EAF/FRI are now the Center on Long-Term Risk (CLR)

Sadly my informal goal of having an EA-related F[?]I organisation for every letter of the alphabet has taken a step back. :-(

EAF/FRI are now the Center on Long-Term Risk (CLR)

This seems like a better fit for CEEALAR than CLR?
