Beckstead, Nick (2013) On the overwhelming importance of shaping the far future, doctoral dissertation, Rutgers University Department of Philosophy. Justifies its importance.
Moral patienthood should not be confused with moral agency.[1] For example, we might think that a baby lacks moral agency - it lacks the ability to judge right from wrong, and to act on the basis of reasons - but that it is still a moral patient, in the sense that those with moral agency should care about their well-being.
First, which entities can have well-being? A majority of scientists now agree that many non-human animals, including mammals, birds, and fish, are conscious and capable of feeling pain,[2] but this claim is more contentious in philosophy.[3] This question is vital for assessing the value of interventions aimed at improving farm and/or wild animal welfare. A smaller but growing field of study considers whether artificial intelligences might be conscious in morally relevant ways.[4]
Second, whose well-being do we care about? Some have argued that future beings have less value, even though they will be just as conscious as today’s beings. This reduction could be assessed in the form of a discount rate on future value, so that experiences occurring one year from now are worth, say, 3% less than experiences occurring at present. Alternatively, it could be assessed by valuing individuals who do not yet exist less than current beings, for reasons related to the non-identity problem[5] (see also population ethics). It is contentious whether these approaches are correct. Moreover, in light of the astronomical number of individuals who could potentially exist in the future, assigning some value to future people implies that virtually all value—at least for welfarist theories—will reside in the far future[6] (see also longtermism).
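As a numerical illustration (not part of the entry itself, and assuming the standard compound-discounting formula 1/(1 + r)^t), a short sketch shows why the choice of whether to discount dominates assessments of the far future: a 3% annual rate barely affects next year's well-being but reduces well-being a millennium away to almost nothing.

```python
# Illustrative sketch: how a constant annual discount rate shrinks
# the present value of future well-being. The function name and the
# 3% rate are assumptions for illustration only.
def discount_factor(rate: float, years: float) -> float:
    """Present value of one unit of well-being occurring `years` from now,
    under standard compound discounting 1 / (1 + rate) ** years."""
    return (1.0 + rate) ** -years

# Well-being one year out counts for about 3% less...
one_year = discount_factor(0.03, 1)        # ~0.971
# ...but well-being a thousand years out is discounted by a factor
# on the order of 10**-13, i.e. it counts for essentially nothing.
millennium = discount_factor(0.03, 1000)   # ~1.5e-13
print(one_year, millennium)
```

On this model, any positive discount rate makes the astronomical numbers of possible future individuals almost irrelevant, whereas a zero rate lets them dominate - which is why the paragraph above treats the question as pivotal.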
Allen, Colin & Michael Trestman (2016) Animal consciousness, in Edward Zalta (ed.), Stanford Encyclopedia of Philosophy. Discusses similar questions from a philosophical perspective.
Bostrom, Nick (2009) Astronomical waste: the opportunity cost of delayed technological development, Utilitas 15(3), pp. 308-314.
Low, Philip et al. (2012) The Cambridge declaration on consciousness, Francis Crick Memorial Conference, July 7. A declaration by a group of leading scientists that animals are capable of consciousness.
Roberts, M. A. (2019) The nonidentity problem, in Edward Zalta (ed.), Stanford Encyclopedia of Philosophy.
Wikipedia (2003) Artificial consciousness, Wikipedia, March 13 (updated 24 April 2021).
Wikipedia (2004) Distinction between moral agency and moral patienthood, in 'Moral agency', Wikipedia, September 25 (updated 14 November 2020).
[1] Wikipedia (2004) Distinction between moral agency and moral patienthood, in 'Moral agency', Wikipedia, September 25 (updated 14 November 2020).
[2] Low, Philip et al. (2012) The Cambridge declaration on consciousness, Francis Crick Memorial Conference, July 7.
[3] Allen, Colin & Michael Trestman (2016) Animal consciousness, in Edward Zalta (ed.), Stanford Encyclopedia of Philosophy.
[4] Wikipedia (2003) Artificial consciousness, Wikipedia, March 13 (updated 24 April 2021).
[5] Roberts, M. A. (2019) The nonidentity problem, in Edward Zalta (ed.), Stanford Encyclopedia of Philosophy.
[6] Bostrom, Nick (2009) Astronomical waste: the opportunity cost of delayed technological development, Utilitas 15(3), pp. 308-314.
Moral patienthood is the condition of deserving moral consideration. A moral patient is an entity that possesses moral patienthood. While it is normally agreed that typical humans are moral patients, there is debate about the patienthood of many other types of beings, including human embryos, non-human animals, future people, and digital sentients.