This subject is one of the most controversial among neuroscientists and AI developers. In my view, mind uploading will at some stage be possible, as will digital consciousness. However, I do not believe it will be possible to make an absolutely exact copy of a mind, for two reasons. First, recording thoughts, emotions and memories is not an error-free process: there is environmental noise at the level of individual synapses and in the brain as a whole. Second, since information at the neuron level is passed on as electric current, it generates electromagnetic waves, and such waves are subject to the quantum-mechanical uncertainty principle, which deals in probabilities rather than certainties. A copy of the mind may therefore be very close to the original, but never an exact one. Here is my take on it:
A partial solution might be a kind of synchronization between the original 'wet' brain and its digital copy.
Another, perhaps the most realistic, is to gradually implant more and more electrodes (see the latest developments by Neuralink) until a person in effect has a dual brain: the wet brain and an overlaid network of billions of electrodes that acts as a kind of shadow of the real brain. Over time the digital brain becomes the original and the 'wet' brain becomes the shadow, until at some stage there is a cut-off and the digital mind functions on its own, maintaining a number of synchronized backups.
Finally, once we have developed a full model of the human connectome, we could try to populate that generic model with a specific representation of the mind of a particular person.
But this is my view. Others are more optimistic, believing that making a perfect copy of a mind is possible, as in the article below, which opens with a quote from Cave Johnson:
“The point is, if we can store music on a compact disc, why can’t we store a man’s intelligence and personality on one? So, I have the engineers figuring that one out now.”
Artificial intelligence is hard. Why reinvent the wheel, when you’ve got plenty of humans walking around? Who will miss one, right? Alternatively, you might be one of those humans looking for easy immortality. Either way, once you finish scanning the brain, you end up with a file that you run in a physics simulator, and presto, you have a computer that remembers being a human. If you do it carefully enough, the original brain won’t even notice it happening.
This computer has a number of advantages over a meat human. The simulation can be run many thousands of times faster than objective speed, if you’ve got enough computing power. It can be backed up with trivial ease. You can run multiple copies at the same time, and have them do different things, make exotic personality composites, and tinker around with the inner workings of the brain in ways that are either difficult or impossible to do with a meat brain. Additionally, there’s the fact that it’s impossible to kill as long as its data is backed up somewhere and there exists a computer on which to run it – you can just restart the simulation wherever you left off and the mind won’t even recognize it.
Critics of the concept are quick to point out that it presupposes an understanding of neurology (not just human neurology, but even the neurology of a common insect) far, far beyond what currently exists; and that without such knowledge, even the most powerful computer cannot do this. Proponents of the idea assure us that this knowledge is coming. Proponents who hope to live to see and actually benefit from it assure us that it’s coming really, really soon.
As with The Singularity, the idea of brain uploading has inevitably taken on a quasi-religious aspect for many in recent years, since it does promise immortality of a sort (as long as your backups and the hardware to run them on are safe), and even transcendence of the body.
The advantages bestowed by brain uploading are a bit overwhelming if you’re trying to incorporate them into a story. It kind of kills the tension when the protagonist can restore from backup whenever the Big Bad kills them. Authors have devised a number of cop-outs, which you can recognize by asking these questions:
What is the underlying mechanism of the upload? Is the computer simulating every atom in every neuron, or is the upload applying memories and personality characteristics to a default template?
Is uploading destructive? Depending on which process you use, it may be possible to do it nondestructively, but many authors deem it convenient to have it destroy the original, to avoid the confusion of having two copies of the same character running around.
Can you augment intelligence? Or does the brain’s pattern need to be copied exactly to still function like a mind, leaving no room for radical enhancements?
Can the upload be copied? If the answer is “no”, the work might be on the soft end of Mohs Scale of Sci-Fi Hardness, although it’s also possible to make it a little harder by running the AIs on a quantum computer and saying something about the “No-Cloning Theorem”. Or simply declare the recording to be analog.
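For readers unfamiliar with the physics being invoked here, the No-Cloning Theorem is a standard (and real) result of quantum mechanics, not an invention of any particular fictional work. A compact statement and the one-line reason behind it:

```latex
% No-cloning theorem: no single unitary operation U can copy an
% arbitrary unknown quantum state |psi> onto a blank register |e>:
%
%   \nexists\, U \ \text{unitary}:\quad
%   U\bigl(\lvert\psi\rangle \otimes \lvert e\rangle\bigr)
%     = \lvert\psi\rangle \otimes \lvert\psi\rangle
%   \quad \forall\,\lvert\psi\rangle
%
% Sketch: if U cloned two states |psi> and |phi>, unitarity would give
%   \langle\psi\vert\phi\rangle = \langle\psi\vert\phi\rangle^{2},
% forcing the inner product to be 0 or 1. So only mutually orthogonal
% (i.e. already perfectly distinguishable) states can be copied --
% never an arbitrary, unknown one.
```

So a story that stores minds as quantum states gets a physically legitimate excuse for uncopyable uploads, at the cost of also ruling out casual backups.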
There’s also a pile of legal, moral, and theological questions that might be addressed in the story:
Is the AI considered to be the same person as its human predecessor or a digital twin? Is it a person at all? If an upload is a person, how different do copies of that upload have to be before they’re separate persons?
Is one copy responsible for the debts and/or crimes incurred or committed by another copy? Is the original responsible, assuming nondestructive uploading?
Assuming nondestructive, undetectable uploading, is uploading without consent of the original a crime? What if the original objects, but the upload doesn’t want to be deleted? What about uploading dead people who specified they didn’t want to be uploaded after death? And how do the original and the copy feel about no longer being unique?
Assuming destructive uploading, the original is dead. How does the copy feel about that?
What do you do with the backups of an upload who kills themselves?
Would the soul be copied over? Is there a soul to be copied at all? While some people might see the debunking of mind-body separation as just another case of science marching on, a great many would find the idea that even their mind is quantifiable rather frightening. Worse, some would see those who go through with the upload as less than human, and campaign to ban the procedure for violating human dignity or some other such reason.
Assuming the existence of the soul (or even just assuming the original believes he has one), how does he feel about the prospect that he may not be simply destroyed, but go on to an afterlife (pleasant or unpleasant) while a newly created double takes his place? After all, “he” stands a 50/50 chance of winding up as the original or the copy. For that matter, is the newborn copy innocent of sin despite his memories of committing them?
Even theorists who don’t believe in the soul, per se, often believe in consciousness as a real phenomenon. Would a simulation of a brain experience consciousness any more than a simulation of lungs can be said to actually respirate oxygen? How could an outside observer tell? The fact that the observer probably can’t tell arguably makes this consideration more important, not less, since uploadees would be gambling their very selves on the trustworthiness of this tech.
Though fictional depictions of virtual worlds rarely address the fact, programs move by copying themselves. Any time “virtual you” moves from Server A to Server B, you’re leaving behind a duplicate of yourself, unless it’s automatically deleted. Might the constant duplication and murder of people as the basis of all transportation be unethical, or at least problematic?
If a scanned mind is an analog recording, the constant and casual re-copying necessary to “travel” electronically would be impossible without corrupting the data. You could copy yourself into a durable and long-lasting robot body relatively safely, but you could never safely leave it except by physically transplanting the robot’s brain. And of course, physical electronic components do wear out.
How accurate would the copy be, especially in the early days of the technology? If the flaws are significant but not immediately obvious, how many people might undergo the procedure before the problems are noticed? And if you know about the flaws ahead of time, how much of your personality or consciousness are you willing to throw away or see changed beyond your control for a type of immortality?
Even if the tech is usually reliable, do obviously botched copies have any legal rights as people?
If you have concerns about the trustworthiness of the process, what if everyone you know is doing it? Conversely, if you’re a true believer in the process, what if society condemns it?
Can the computer provide a good enough simulation of human sensory input to keep you from going mad? Even a brief period spent in a sensory deprivation tank can have terrible effects on the mind, so one can imagine what complete absence of a physical body might do.
A man converted into software has all the vulnerabilities of software. He can very likely be hacked, duplicated, or edited against his will. For better or for worse, the human mind is currently relatively impregnable. Do you really want to be rendered no more unique than an image in a Google search, and more malleable than putty in the hands of others? Do you want to wake up one day to find that you’re an illegal copy of yourself, being treated as a toy by a hacker? Would you necessarily own the copyright to yourself? If such a copyright even existed at all (since many consider copyright unenforceable and undesirable in the digital age), would the agency that uploaded you own it? How can the law provide any protection to a citizen who can be duplicated (and his duplicate used and abused however the criminal wishes) as easily as copying a computer file? And every time such a copy is produced, “you” stand a 50/50 chance of being that unlucky tortured twin. If a virtual world makes a synthetic heaven possible, it likewise makes synthetic hells possible, and the latter may be far easier to produce (either accidentally or deliberately).
In a world where uniqueness exists, at best, as a legal courtesy, mightn’t human life come to be seen as fundamentally less valuable? What rights can a completely replaceable person have?
Widespread Brain Uploading tends to lead to The Singularity or something very much like it. Or it may be a result of said Singularity.
Compare with the Virtual Ghost, where the uploaded brain can control a projection of themselves to interact with the real world. Contrast Neural Implanting, where computer files are uploaded to the brain instead of the other way around, though both tropes are occasionally used together.