
David Mathers🔸

4204 karma · Joined

Posts: 9


Comments: 432

What you really want to look at (which I haven't done properly) is the literature on what determines pain intensity, and, relatedly, what makes pain feel bad rather than good, and what makes pain morally bad. That will tell you something about how friendly current theory is to "more neurons = more intense experiences", even if the papers in that literature don't specifically discuss whether it is true.

Strong *upvoted* the comment, yes. But there is no strong agreement vote.

Wish there were a way to vote strong agree, and not just agree, on this comment.

Ok, maybe I was too fast to take the definition I remember from undergrad 20 years ago as the only one in use! 

Yeah, that's a fair point; maybe I haven't seen it because no one outside EA has considered how to do the weighting at all. But my sense is that, at the very least, many theories are unfriendly to weighting by neuron count (though probably not all).

I can't be sure that they aren't out there somewhere, as philosophy of consciousness, let alone cognitive science, is a surprisingly big field, and it is not what I specialised in directly. But I have never seen it proposed in a paper (though I haven't deliberately searched).

I am far from an unbiased party, since I briefly worked on the moral weight project as a (paid) intern for a couple of months. But for what it's worth, as a philosophy of consciousness PhD, it's not just that I, personally, from an inside point of view, think weighting by neuron count is a bad idea; it's that I can't think of any philosopher or scientist who maintains that "more neurons make for more intense experiences", or any philosophical or scientific theory of consciousness that clearly supports this. The only place I've ever encountered the view is among EAs, usually without a formal background in philosophy of mind or cognitive science, defending focusing near-termist EA money on humans. (Neuron count might correlate with other things we care about apart from experience intensity, of course, and I'm not a pure hedonist.)


For one thing, unless you are a mind-body dualist (and the majority of philosophers are not), it doesn't make sense to think of pain as some sort of stuff or substance, like water or air, that the brain can produce more or less of. And I think that sort of picture lies behind the intuitive appeal of "more neurons = more intense experiences".

That's not what "speciesism" means. Speciesism isn't the view that an individual human matters more than animals; it's the view that humans matter more because they are human, and not because of some objectively important capacity. Singer, who popularized the term "speciesism" (though he didn't invent it), has never denied that a (typical, non-infant) human should be saved over a single animal.

Bees feel like an easy case for thinking RP might be wildly wrong in a way that doesn't generalise to all animal interventions, since bees might not be conscious at all, whereas it's much less likely that pigs or even chickens aren't. (I'm actually a bit more sympathetic than most people to the possibility that pigs aren't conscious, but I still think it's >50% likely that they are conscious enough to count as moral patients.)

I think this post is overall great, even though I favour animal over global health stuff right now. But doing things just for the optics feels really sleazy and naively utilitarian to me.
