Sure... sorta. But there is a problem. You see, without continuous input from the environment in the form of highly correlated input... we know what happens to human consciousness. It rapidly dissolves and disappears. Consciousness is intimately and (thus far) inextricably bound up in its embodied existence. Facial paralysis has profound effects on people's ability to feel emotions 'internally', eventually resulting both in an inability to even recall ever feeling those emotions and in an inability to recognize them in others. Should a consciousness be created divorced from a feedback loop with the same environment we share (or at least one similar in most respects; a simulation might work OK, I don't think we know), the odds that we could even recognize it as conscious are very low. Maybe a general measure of the system's tendency to reduce entropy in some region either within or near itself?
We are feedback loops, and when the loop is broken, we stop being us.
Most people would accept being able to see and interact with the world as a prerequisite to consciousness. I understand this as an I/O problem: there must be continuous high-bandwidth input (visual, audio, tactile) as well as very detailed and dexterous output.
I wonder to what degree virtual realities can play the I/O role. I think for the next half century virtual realities will mostly operate at a lower level of detail than meatspace. Can a mind function well while stuck in a lower-detail world?
Alternatively, cybernetics could serve that role. You bring up a real concern, but it's a solvable problem.
> Most people would accept being able to see and interact with the world as a prerequisite to consciousness.
People lacking sight, hearing, mobility, and similar would disagree.
By all means, we want the ability to interact with the world, bidirectionally. But that's not a prerequisite for consciousness. And if the only thing we manage, in the short term, is preservation and continued function, that would be a massive improvement, far larger and more important than the subsequent incremental steps towards I/O.
> People lacking sight, hearing, mobility, and similar would disagree.
I think "to see" is meant as a placeholder for any kind of sensory input. I doubt that people who lack any form of sensory input with no way to communicate would disagree even if they could -- while they are in that state. As somebody who experienced being knocked out after an accident, I'd say consciousness is a fragile thing (but it grasps at any straw to re-establish itself).
> I doubt that people who lack any form of sensory input with no way to communicate would disagree even if they could -- while they are in that state.
If you're in a sensory deprivation tank, but you can still think, are you no longer conscious? If you were in a perfect one, or you had something that somehow cut off the connection between the brain and the body, but you can still think, are you no longer conscious? The ability to interact with the world certainly makes life nicer, but consciousness does not depend on that.
> If you were in a perfect one, or you had something that somehow cut off the connection between the brain and the body, but you can still think, are you no longer conscious?
I think, if such an unlikely scenario were possible, that disturbing insanity would result, and eventually it would degrade into something that does not resemble human consciousness.
The reality check provided by sensory input stabilizes and fosters consciousness.
Perhaps it is possible to design a conscious machine that would be more robust against sensory deprivation, but humans certainly don't have that characteristic. A good example of this is the very damaging effect of solitary confinement.
The situation is different, since the body is still intact. You and those people are not talking about copying the whole body, though, but about copying just the brain -- or rather some information extracted from the lump of cells that constitutes the organ "brain".
Consciousness does depend on it. You cannot mature a recognisably human consciousness without input from the world and from other humans.
And after you mature a human consciousness, you need to keep it busy with external stimuli, otherwise it becomes badly damaged - as anyone who has spent a long time in solitary will tell you.
If you cut off all stimuli for long enough, you’ll have something left in your tank, but it won’t be sane enough to be recognisably human.
You're repeating the IT-inspired delusion of the brain being the CPU + memory and the body being the peripheral hardware. So why not just replace the signals from the peripheral hardware with some mock data, as we do in unit tests, you ask.
For your VR world to work, you'd have to raise the consciousness in the VR world.
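For readers unfamiliar with the unit-testing practice being alluded to, here is a minimal sketch of the "mock the peripherals" idea. All names here (`Brain`, `read_camera`) are hypothetical, chosen only to mirror the CPU/peripheral metaphor:

```python
import unittest.mock as mock

def read_camera():
    """Stand-in for a real sensor; in 'production' this would block on hardware."""
    raise RuntimeError("no camera attached")

class Brain:
    def __init__(self, sensor=read_camera):
        # Dependency injection: the 'peripheral' is pluggable.
        self.sensor = sensor

    def perceive(self):
        # The 'CPU' consuming input from its 'peripheral hardware'.
        return f"saw: {self.sensor()}"

# With no real hardware attached, Brain().perceive() would raise RuntimeError.
# In a test we substitute mock data for the peripheral signal:
fake_sensor = mock.Mock(return_value="a red square")
print(Brain(sensor=fake_sensor).perceive())  # prints: saw: a red square
```

The parent comment's point is precisely that this substitution, trivial for software peripherals, is not obviously available for a consciousness whose "inputs" shaped it in the first place.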
I agree that I am expanding upon the metaphor that the brain is a computer and an individual is a program. It's no more a delusion than any other metaphor, all of which describe the world only inaccurately. Any metaphor can cause poor understanding when taken too far, although I agree this specific one is often used to jump to poor conclusions.
I'm not convinced that it is impossible to transplant consciousness from meatspace to VR. Am I missing some prerequisite reading to form this conclusion?
That certainly is how we evolved, but somehow we evolved a consciousness that isn't purely a function of its inputs; people who are "locked in" remain conscious.
People who are 'locked in' are not cut off from perceptive stimulus. They're simply cut off from being able to have their nervous system's output result in manipulation of the world. I imagine this does have a significant effect upon their conscious experience, but it's not something that would be easy to study. Even studying how less significant damage to the body results in 'personal' changes can be difficult. Studies of how spinal cord damage resulting in paraplegia leads to depression and a reduced emotional range, beyond what should be expected from the pain or trauma alone, strike some people as 'insulting'. Our adamant refusal to recognize that the 'brain' is nothing more than a convenient formalism used to talk about the entire nervous system and body, and that every part is necessary to give rise to what we recognize as human consciousness, runs very deep. Dualism (in the sense of a body/brain split) is wrong, but apparently far too alluring for many.
We are feedback loops, and when the loop is broken, we stop being us.