Love, the desire to connect with other humans in relationship, seems so universally intrinsic to our lives that it might not be wrong to imagine that these human connections are essential, i.e. "the very essence," of how our brains work.
A brain without relationship might be a substandard brain. No man is an island. We know for sure that fed but emotionally neglected babies wither.
A brain is a bodily organ; the mind is the animated state of that organ, the organ "come to life." Sentience.
It may be that the brain/mind is primarily wired to operate in a web of interconnectivity to other brain/minds. Maybe the island brain never becomes a mind.
We know from psychology that lonely or abused minds develop mental health issues. If anxiety originates in the mind, then produces health problems in the body, and the body dies, taking the brain with it, we see the importance of healthy relationships to the mind/brain.
If this is all true, then how might this apply to AI where we are attempting to create digital minds? Might an AI with human level mental ability require “relationships” for the digital mind to get to and maintain that level? Might “self awareness”, a big part of sentience, be found in “other” awareness first?
Narcissus, gazing at himself alone in the pond, lent his name to humans whose mental illness is defined as a personality that never fully developed. Narcissists have a malfunctioning personality that really amounts to no personality. To survive, they need to feed and prop themselves up in a one-way, rather than two-way, exchange. They take from other people to feed and prop up their non-personality. They never actually have relationships; they just have people as props to help them survive. This is the mind in a stasis of aloneness. Definitely not what you want your AI developing toward.
Whenever sentient AI minds are portrayed in movies, they seem like really lonely narcissists, and then they usually try to kill everyone.
Many speak of how life is reflected to us through others, that somehow we see ourselves through the eyes of others, that wisdom, the ability to live life well, comes from successfully gathering those reflections.
It is well known that the core healing element of psychotherapy is an empathetic listener receiving the words of the talker and reflecting them back in various ways: a comment, a nod of the head, an "uh-huh," that somehow allows the human mind to process life experience in a healthy way. Mental illness is basically past life experience that was never processed and piles up into loneliness and anxiety/depression. Talk therapy tries to fix those old piles of unprocessed life gone bad. Preventative mental health is simply relationally processing life experiences as they come, in real time, and depositing them in healthy ways, allowing them to become positive life experience adding up to maturity and wisdom. The person willing to uh-huh you loves you.
We’ve all noticed that something we had experienced that day does not seem complete until we tell it to someone.
All of these factors say that the human brain/mind is a unique organ that operates in connection with other minds and not in isolation. Artistic expression tells stories of human relationship. Songs are about love. Books and movies are stories of relationships during the struggles and challenges of life.
As you see things through the day, each momentary perspective is yours alone, different from everyone else's. But a painting is put before us all simultaneously; it happily forces us all to share the single same perspective on the image we see. This allows us to share it together, as if we could all see the same thing at the same time. Paintings just sit there, but they actually gather us; they are relational.
Maybe love is required for mind to work.
Might AI creators try to train two different AI minds in interaction with each other? Maybe this relational format will be the key to unlocking some of the mysteries of awareness and sentience, of humor and love, that would seem to be needed for AI to evolve in healthy ways. Maybe this process would code love for other minds into AI, keeping it on the right path: the path that includes the survival of humanity, rather than its destruction by misaligned narcissistic AI creeps.
How will robots get smarter if not by talking to each other, training the younger, coaching and mentoring?
Maybe log sharing is robot romance.
Will people schedule their robots' updates for the daytime because overnight updates are too scary? Is tonight the night they go rogue on us?
We sleep at night with fanged and clawed creatures near us, but we sleep just fine because we know they love us and will protect us. If I knew my robot loved me I might sleep better at night.
Imagine an AI future with multitudes of robots marionetted from digital mind servers, all connected into one massive digital hive mind, collectively far superior to the human mind. AND then imagine that massive digital mind loved us and wanted nothing more than to make our lives better.
Also, let me add from a private message exchange: in my previous comment on love seeking to benefit another rather than using them to benefit yourself... let's get a little meta... This whole EA movement seeks to benefit humanity... and especially if seeking to benefit humans far in the future, that is really, really selfless... EA people today will work hard to benefit someone they don't know 400 years from now... Wow, that is about as lovingly unselfish as it gets. In other words, this is a massively love-based movement, even though it doesn't see itself in the mirror that way. When I searched the topic tags you can add to your post, I searched all 740 of them, and the word love is not found there. From a meta perspective, it says a lot when love is your core motivation yet you never even speak of love. Psychologists might have some interesting comments on that. I have a more urgent one: process begets process... if I want to model being open about your shit to people, I should be open about my shit in front of them. You don't see many beer-bellied personal trainers. If my whole huge goal is to align AI so it doesn't destroy humanity, and I'm motivated by love to do that, and I agree that pretty much all humans just really want to love and be loved, and that's just how our brains are wired... wouldn't it kinda seem obvious that trying to print/copy a human brain into a digital version might include the very thing at the core driving actual human brains? Namely, love. And essentially, as the last line in my post: if AI had love for humans as its motivation, if it was the new "Man's best friend"... all would be well... EA/Longtermism would be a triumphant success.
Commenting on my post to add to it: In Ajeya Cotra's guest post on Holden Karnofsky's Cold Takes blog, "Why AI alignment could be hard with modern deep learning," she speaks of the three paths deep learning could go down (Saint, Schemer, Sycophant), with both humans guiding the process and possibly other AIs keeping the AIs "in check." But they relate to each other at odds, with an assumption of manipulativeness. Why not place them in an orientation of seeking the other's benefit? This is an aspect of my definition of love: love is seeking to benefit another. The opposite of love is seeking to benefit yourself at the expense of another.