This is a quick write-up of a 'person-affecting' & preference-utilitarian idea that came to my mind (I'm normally a thoroughly total and hedonistic utilitarian). I don't think this is worth pursuing, since x-risk-related priorities seem overwhelmingly more important, but I thought the idea was sufficiently interesting to share.
TLDR: Maybe in future we will want to "revive" past people by simulating a lot of digital people such that many of those digital people will be more or less identical to many past people? The intuition is that this way we can partially redeem the tragically suboptimal lives that specific individuals today and in the past had to endure.
A friend of mine once mentioned that he’s leaving as many of his thoughts as possible in recoverable digital form, so that it would be more feasible to reconstruct his mind digitally at some point in the future. This would allow a very close digital version of him to live a fulfilled utopian life even after his biological body and mind have disappeared.
My reaction back then was to doubt that his written words would allow a particularly close copy, which would make the whole idea more or less irrelevant: humanity would not realistically be able to recover people, so there would be less reason to spend large amounts of resources on this in the future. But the idea of allowing past people, especially those who had to experience a lot of tragedy, suffering, and unfulfilled potential, to live another time under conditions very conducive to a fulfilled life stuck with me.
Feasibility of recovering current and past people without perfect brain scans
Recently I reconsidered the feasibility of recovering specific past people digitally because of three thoughts that randomly crossed my mind:
A) I hadn’t considered before that a utopian version of humanity might be able to spend astronomical amounts of compute on generating a lot of close-by copies of past humans, such that at least one of the copies will likely be a very close version of the actual person.
B) I hadn’t considered how much information about our minds is shared among all humans, such that for each person there is already a reasonable baseline of information available.
C) I hadn’t considered that recovering the DNA of a person probably would supply a decent chunk of information about the mind of that person.
Many people today leave fairly big digital footprints via things like chats with friends, pictures, personal notes, recorded behaviour on platforms such as Google and Facebook, and additionally their genetic code. My completely made-up guess is that such a person today could reasonably be considered recovered if somewhere between one thousand and one million plausible digital copies are created for them in the future? A lot of details about the person will never be instantiated, but the conditions of an ethical redemption for a given past person might still be met? One criterion might be that the past person would likely have said that they recognize themself in one of the digital minds after a very extensive assisted Turing test.
Some open questions
- Has this topic been discussed somewhere?
- How reliably can you recover past and currently living people based on the available information, such that one can honestly say that a given person has been recovered to live a life they would find very fulfilling?
- I imagine there might be some function that estimates a rough interaction of i) the amount of information available for a given person, and ii) the number of copies that would make successful recovery very likely.
- Related to the previous question, how much compute would it cost to recover people from the past?
- Maybe some fraction of the people from the past hundred years could be recovered fairly “cheaply” based on written notes and diaries, reports from surviving relatives, pictures, genetic samples, and biographical knowledge. In contrast, people whose genes we can’t recover and who didn’t leave much autobiographical text might be very expensive to grant another life.
- How much less good would the simulated past people’s lives be compared to the kinds of lives and minds that will be possible in the future?
- If the difference in quality of experiences would be big, then simulating past lives would come with immense opportunity costs.
- E.g. I can imagine that many past people were shaped into personalities that are not conducive to living a happy life. But I also imagine that those people would then choose to change their minds into forms that are conducive to their flourishing.
- Is “righting past wrongs” a reasonable ethical position?
- I’m fairly sympathetic to the idea that personal identity is not a helpful, reality-carving concept and often leads to confusion.
- But I also have a strong intuition that I would support any person whose life has been cut short due to unjust circumstances (e.g. murder or disease), or whose life was largely unfulfilling and stricken with suffering, being offered a second chance.
- Are there some cheap options available today to enable recovering more people?
- E.g. one might preserve DNA samples of people who might die soon or who have died in the past, one might invest more in preserving personal documents of the deceased in some form, one might invest in a service to file the memories of people.
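The interaction between available information and number of copies mentioned above can be illustrated with a purely hypothetical toy model (all assumptions here are mine, not from the post): treat the unrecovered information about a person as a number of missing bits, and ask how many independently sampled copies you would need before at least one matches closely enough.

```python
import math

def copies_needed(missing_bits: float, target_prob: float = 0.99) -> float:
    """Toy model: if each sampled copy independently matches the original
    person with probability 2**-missing_bits, return the number of copies
    needed so that at least one match occurs with probability target_prob."""
    p_match = 2.0 ** -missing_bits
    # P(at least one match among k copies) = 1 - (1 - p_match)**k, solved for k
    return math.log(1.0 - target_prob) / math.log(1.0 - p_match)

print(round(copies_needed(10)))  # ~10 missing bits: a few thousand copies
print(round(copies_needed(20)))  # ~20 missing bits: millions of copies
```

Under this (very crude) model the cost of recovery grows exponentially in the amount of information that was lost, which is one way to make the post's "cheap" vs. "expensive" distinction between well-documented and poorly documented people concrete.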
Thanks to Jasper and Tilman for feedback; I will try a little harder to preserve my memories of both of your wonderful souls, just in case.
I'm generally skeptical of digital people, but I think you can almost certainly say that meaningfully recreating a person without access to brain scans far beyond existing technology would be impossible. Even if you accept the philosophical assumptions necessary to believe that a recreated digital person is morally continuous with the person it's based on, the only way to recreate a digital person without access to that kind of scan is by reversing entropy and recovering the lost information. No matter how many possible versions of the person you make based on incomplete information, you could never have enough compute to find one that matches, or the information to identify which one is correct. There are 8×10^67 ways to order a deck of 52 cards. How many viable networks do you think there are of 82 billion neurons?
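The card-ordering figure is easy to sanity-check; the neuron comparison below is only a toy illustration of scale (my assumption, not a neuroscience estimate):

```python
import math

# Orderings of a standard 52-card deck: 52!, roughly 8.07 * 10^67
deck_orderings = math.factorial(52)
print(f"52! = {deck_orderings:.3e}")

# Toy comparison: if each pair of ~82 billion neurons were simply either
# connected or not, there would be 2^(n^2) wiring patterns; even the
# logarithm of that count dwarfs 67.
n = 82e9
log10_patterns = n * n * math.log10(2)
print(f"log10 of wiring patterns ~ {log10_patterns:.2e}")
```

Even this crude counting (which ignores synapse weights and everything biological) gives a number whose number of digits is itself around 10^21, which is the comment's point about the search space.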
At a philosophical level, I don't really find it very convincing that even a perfect recovery/replica would be righting any wrongs experienced by the subject in the past, but I can't definitively explain why: I just don't think replicas are 'the same lives' as the original, or really meaningfully connected to them in any moral way. For example, suppose I cloned you absolutely perfectly now, and then said, 'I'm going to torture you for the rest of your life, but don't worry, your clone will be experiencing equal and opposite pleasures.' Would you think this is good (or evens out) for you as the single subject being tortured, and would it correct for the injustice being done to you as a subject experiencing the torture? All that is being done is making a new person and giving them a different experience to the other one.
Thanks for the pushback, it clarified my thinking further.
I think this thought experiment introduces complexities that the scenario in the post avoids, e.g. having to weigh suffering vs. happiness. In the original scenario the torture/suboptimal life already would have happened to me, and now the question is whether it's morally better to have a future filled with tons of happy, fulfilled lives vs. one where one of those lives is lived by somebody who is basically me. And my intuition is that I'd feel much better knowing that what "I" am, my hopes, dreams, basic drives, etc. will be fulfilled at some point in the future, despite having first been instantiated in a world where those hopes etc. were tragically crushed.
So my intuition here probably comes more from a preference utilitarian perspective, where I want the preferences of specific minds to be fulfilled, and this would be somewhat possible by having a future close version of yourself with almost identical preferences/hopes/desires/affections etc.
Good discussion. My intuition is that if you have a close enough copy that shares the same memories as you, it would feel like it was you (i.e. be you). So say you resurrected people and made it so that they felt like a continuation of their previous selves. Perhaps if (in their original life) they got cancer and died young, they would instead remember being miraculously cured, or something. Even if there were multiple copies, they would all essentially be you (subjectively feel like you), just branched from the original (i.e. share a common history).
If there are no shared memories, then effectively it wouldn't be much different from standard Open Individualism - i.e. you are already everyone, but just not directly experientially aware of the link. The fulfilling of preferences seems somewhat incomplete unless the original people know about it. You'd need the simulator to somehow let them know, before they die, that they will live again, or something (this is starting to sound religious :)).
Also, perhaps an easier route for all this is cryonics :)
I'm also very sympathetic to a preference utilitarian perspective, much more so than just suffering vs. happiness. But to me the preference satisfaction comes from the realised state of the world actually being as desired, and not from specifically experiencing that satisfaction. For example, people will willingly die in the name of furthering a cause they want to see realised, knowing full well they will not experience it. One would consider it something of a compensation for their sacrifice if their goals are realised after, or especially because of, their death.
Similarly, I think it would help to right past wrongs if, in the future, the past person's desired state of the world comes to pass. But I still don't see how it is any better for that person, or somehow corrected further, if some replica of their self experiences it.
One might imagine that the overall state of the world is more positive because there is this replica that is really ecstatic about their preferences being realised and being able to experience it, but specifically in terms of righting the wrong I don't think it has added anything. They are not the same subject as the one who experienced the wrong - so it does not correct for their specific experience - and the payout is in any case in the realised state of the world and not in that past subject having to experience it.
I think where my intuitions diverge is that I expect many people to have a lot of self-directed preferences, which I regard as ethically on the same footing as non-self-directed preferences. It seems you're mostly considering states of the world like ensuring the survival and flourishing of their loved ones, justice happening for crimes against humanity, or an evil government being overthrown and replaced by a democracy. But I'd guess this class of preferences should not be so distinct from people wanting the future state of the world to include themselves being happy, with a loving partner and family, friends, and a community that holds them in high regard. And that's why I feel like a past person would feel at least a little redeemed if they knew that at some future time they would see themselves living the fulfilled life that their past selves wished they could've enjoyed.
Ah I see, yes that seems to make a meaningful difference regarding the need to have the self experience it then. Although I would still question if having the replica achieves this. If we go to the clone example, if I clone you now with all your thoughts and desires and you remain unsatisfied, but I tell you that your clone is - contemporaneous with your continued existence - living a life in which all your desires are satisfied, would you find that satisfying? For me at least that would not be satisfying or reassuring at all. I don't see a principled way in which stretching the replication process over time so that you no longer exist when the copy is created suddenly changes this. The preference would seem to be that the person's subjective experience is different in the ways that they hope for, but all that is being done is creating an additional and alternative subjective experience that is like theirs, which experiences the good things instead.
Yeah, I think it's a good point that stretching the replication process over time seems kind of arbitrary, and that making the existence of the replica and yourself contemporaneous reduces the intuition that it is "you" who gets to live the life you wished for.
At the same time, my personal intuitions (which are often different from other reasonable people's :D) are actually not reduced much by the thought of a replicated copy of myself living at the same time. E.g. if I now think about a 1:1 copy of me living a fulfilled life with his wife and children in a "parallel universe", I feel more deeply happy about this than thinking about the same scenario for friends or strangers.
Ha well, I think you might find a fair few people share your intuition, especially in some strands of EA that intersect with transhumanism.
I don't personally share the intuition, but I think if I did then it would also make sense to me that I would expect the replica's satisfaction would be correspondingly reduced to the extent they know some other self that they are identified with is or was not satisfied. But I appreciate at this point we're just getting to conflicting intuitions!
A clone wouldn’t have the same consciousness, so that’s a bad deal. But for whatever reason, people have a sense of a personal identity across time. I am fully willing to make intertemporal trade-offs. It seems more just to make up for past injustices.
Whether or not you could in theory create a replica of a person which has the same consciousness isn’t necessarily clear. If you’re entirely a physicalist and believe in the computational theory of mind, what reason is there for you not to believe you could recreate a person’s consciousness? Just exactly replicate all their brain processes.
'If you’re entirely a physicalist and believe in the computational theory of mind, what reason is there for you not to believe you could recreate a person’s consciousness? Just exactly replicate all their brain processes.' This is confusing two different kinds of identity, "qualitative" and "numerical":
Qualitative identity = I can have two different (qualitatively) identical apples, if one is a perfect duplicate of the other.
Numerical identity = X is numerically identical to Y if they're the same object; for example, 'the morning star' and 'the evening star' are numerically identical, since these are both old names for the planet Venus.
What physicalism implies is that if you build someone who has all the same physical properties as me then they will be qualitatively identical to me, full stop, because physicalism just is the view that all properties of things are fixed by their physical properties. But that doesn't automatically mean they'd be numerically identical to me, any more than if I create a perfect duplicate of an apple, they're both the same apple. Common sense says 'no, they are not the same apple, because I started with only one apple and now I have two, and if there are two apples, they are (numerically) distinct from each other'. You could of course have a theory that 'same person' is special, in that any perfect duplicate of me just is me. But I don't think that is very plausible: build a perfect duplicate of me while I am alive, and it seems like you have two (qualitatively) identical people, not just one person who is somehow in two places at once.
I think some people are confused about this because they've heard philosophers have "psychological" theories of personal identity, where if the informational contents of your brain get wiped and moved to another new brain, then you are the person with the new brain. But actually, the theories that philosophers take seriously which imply this don't say that if two people have exactly the same mental properties, they must be the same person. What they say is that if there's a future person whose psychological state depends on your current state in the right way, then that future person is you*, and they then combine this with the idea that if info is deliberately transferred from your brain to another brain, this is a connection of the right sort for the person with the new brain to count as you.
*Actually, it's a little more complicated than that: you need to add a clause saying 'and no other person at the same point in the future has mental states that depend on yours in the right way'. Can't have 2 future people who are identical to you but not each other. That's the key insight behind Derek Parfit's famous argument that there are situations as selfishly good as survival for you but where you cease to exist: this happens when there are multiple duplicates of you whose mental states are each connected to yours in the right way.
See this post and prior discussion: https://forum.effectivealtruism.org/posts/3jgpAjRoP6FbeESxg/curing-past-sufferings-and-preventing-s-risks-via-indexical
Maybe you can duplicate people and preserve their identities without also putting them in their original horrible situations, since that could cause immense suffering and be horrible on most other views.
The book series "Hyperion Cantos" by Dan Simmons explores what this might be like for the digital people being recreated.
Some of the interesting ideas that the series explores around this topic:
I'm also reminded of Ted Chiang's Exhalation and Marc Stiegler's The Gentle Seduction.
I'm not quite sure how it would relate to Exhalation the story itself but I do think it is sort of related to "Anxiety is the Dizziness of Freedom" in that same collection. I have not read The Gentle Seduction but it looks intriguing!
Re Exhalation, some of the comments at the end, but I don't want to give spoilers :) Haven't reached Anxiety is the Dizziness of Freedom yet but looking forward to it.
I was sobbing reading The Gentle Seduction - you've been warned! I found it here, incidentally.
I've thought about questions like this to some extent. For my moral philosophy, I think it would be morally better to recreate a once-existing person.
I do not think that personal thoughts would be sufficient to reverse-engineer a person's brain. Preserving DNA would probably be much better, but still insufficient. Really good brain scans might do the trick. Really, really not sure.
It may just happen that you get reincarnated even without trying. If time is infinite and you have a theory that allows the recreation of people, you would expect to be born again. When I die, I might just wake up in a new body in a new world. (https://philpapers.org/archive/HUEEIE.pdf)
It seems like if simulated people can exist and live blissful lives, and there will be trillions and trillions of them, then I should be living as a simulation. The fact that my life is good but not entirely blissful is perhaps evidence against utilitarianism, or against the possibility of simulating minds. This depends on your view on observer selection effects (https://www.lesswrong.com/tag/observation-selection-effect).
Anyway, very interesting thoughts. This stuff is cool but hard to think about.
Thanks for the thoughts and pointers. I hadn't considered this anthropics connection, interesting thought.
Thanks. I wrote about it here: https://parrhesia.substack.com/p/utilitarianism-casts-doubt-on-the. I don't really hold to these ideas very strongly. Just something to consider.