So he says that he doesn’t think consciousness is computational, but in the rest of the article pretty much everything he says about it is in terms of processing information.
“In fact, the brain is always making predictions about what’s out there in the world or in the body. And using sensory signals to update those predictions. What we consciously experience is not a readout of the sensory data in a kind of outside-in direction. It’s the predictions themselves. It’s the brain’s best guess of what’s going on.”
So, processing information, then.
“I argue that this entails that the brain is or has a predictive model of its own body, because prediction is very good for regulation.”
Sounds like information processing to me.
“A conscious experience, typically for us humans, brings together a large amount of information about the world, from many different modalities at once—sight, sound, touch, taste, smell—in a single unified scene that immediately makes apparent what the organism should do next. That’s the primary function of consciousness—to guide the motivated behavior of the organism that maximizes its chances of staying alive.”
Do I have to say it again? Every single function he ascribes to consciousness consists of receiving and processing information, and making decisions.
We know the answer to this. Any Turing complete system is capable in principle of any information processing task. Whatever else brains are, we know they are information processing systems. Perceptions go in, and decisions come out. All of our conscious experience is of information. Feelings, sensations, emotions, decisions, they’re all information. What else is there?
Edit: On the weather. When I imagine a rain storm, nothing actually gets wet. So is that activity of imagination more like the weather, or more like a computer simulation of the weather?
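Edit 2: To make the point concrete, the prediction-and-update story he describes drops straight out of a few lines of code. This is only a toy sketch of the general idea, not Seth's actual model; the perceive function and the numbers in it are invented for illustration.

```python
# Toy sketch of "experience as the brain's running prediction, updated by sensory signals".
# Illustrative only; not Anil Seth's model.

def perceive(signals, learning_rate=0.3):
    prediction = 0.0                 # the current best guess about the world
    experienced = []
    for s in signals:
        error = s - prediction                # mismatch between guess and incoming signal
        prediction += learning_rate * error   # update the guess, not the raw data
        experienced.append(prediction)        # what is "experienced" is the guess itself
    return experienced

print(perceive([1.0, 1.0, 0.0, 1.0]))
```

The "experience" in that loop is the running guess rather than the raw signal, which is exactly the inside-out picture he describes, and every step of it is an operation on information.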
This is why I am extremely skeptical of "anti-materialists" or whatever you want to call people like Seth: if consciousness is not mere computation, what is your theory? Do you even have a theory? Why did I read a several-hundred-word interview with you about this topic and come away without any understanding of your alternative theory?
I am open to alternative theories on this topic but none seem to be clear or grounded enough to describe in a few clear paragraphs. To me, that is damning.
In my opinion, if consciousness turns out to be anti-material, meaning that after years and years we cannot replicate it or find any model to explain it, then we have to assume that it is contained outside of the "system", which could only mean that the world is built around consciousness and everything else is a dream or simulation.
However, I don't actually believe this; it's just an idea. I think consciousness is probably much simpler than we think: if you look at evolution, it developed very quickly compared to other things.
I think for consciousness you need a sophisticated mental model of the world, but also a sophisticated mental model of other intentional agents and their mental processes. You need to be able to reason about the knowledge, beliefs and likely actions of others.
When this is generalised to enable modelling and reasoning about our own knowledge, beliefs and intentions, that’s consciousness. We literally become aware of ourselves, our intentions and our thought processes, recursively, in ways we can reason about. So consciousness is a process, an activity. It’s a thing we sometimes do.
I actually don’t think most living things, even animals, are conscious. Mammals and some other higher animals possibly.
Simple organisms have simple sense/response nervous systems. Their reactions are mostly automatic, but can learn basic patterns of stimuli.
> I think for consciousness you need a sophisticated mental model of the world, but also a sophisticated mental model of other intentional agents and their mental processes.
A random find[1] that I found interesting. Some quotes:
A growing set of experiments therefore appears to establish a key prediction of [Attention Schema Theory]: without consciousness of an item, attention on the item is still possible, but the control of attention with respect to that item almost entirely breaks down. The relationship is not “consciousness is attention”; instead, it is “consciousness is necessary for the control of attention.”
AST also predicts that people construct models of other people’s attention [...]. Ample evidence confirms that this is so.
Activity in at least some subregions of the [temporoparietal junction] has also been found in association with one’s own attention. Moreover, TPJ activity is associated with the interaction between attention and reported consciousness. A recent study argued that this activity is consistent with error correction of a predictive model of attention.
You could say the exact same thing about the "soul".
I just do not understand why we can't take seriously the idea that consciousness doesn't exist. The word practically has no meaning.
"We know consciousness exists because we are conscious" is as circular reasoning as it gets.
It is so strange to me that we can't even be bothered to explore if this question is the 21st century version of how many angels can dance on the head of a pin.
There have to be so many things that we currently believe to be true that are utter nonsense. When we believe something exists but we can't even define what the word means, that is probably a good place to look for nonsense.
I think there's a legit line of inquiry in trying to develop a "top down" picture of reality rather than "bottom up". E.g. instead of taking the physical laws and objects as fundamental and ourselves as something that can be constructed within them, what if we instead recognize that whatever argument you make is in fact made to convince myself and others, and that therefore the existence of myself and other conscious entities is an irrefutable axiom of the argumentation-based reasoning we use?
Given just conversational conscious entities and the words they pass around as the only a priori existent things, can we use these as tools to construct a picture of the familiar physical sort of reality? In other words, failing to construct consciousness from the bottom up, can we construct our picture of the physical from the top down?
I think it is possible to build up a rational case for an objective universe starting with just perception. Firstly if the only thing that exists is conscious awareness, where does the informational content of the world you perceive come from? It doesn't come from your awareness, because you are not aware of it until you perceive it. You can say it comes from the subconscious, but the subconscious is not part of your conscious awareness. It's external to it, in the same way that your hand is external to your conscious awareness. There has to be an origin for perceptions that is external to conscious awareness of those perceptions.
From there we observe that these perceptions are of a consistent and persistent form, so it’s rational to conclude that they have an origin in a consistent and persistent source. From there, and taking into account our ability to test our perceptions through action, we can build up knowledge about the world of our experiences.
As for trusting logic and rationality, does it give consistent and useful results? Test it and see if it continues to work reliably over time. If applying logic provides random, contradictory or unreliable results that’s a problem, but maybe you can correct that by modifying how you reason about things and trying again. That’s learning. So I think we do have the cognitive tools we need to build up a robust account of reality starting from base perception.
We didn’t start off human society and civilisation with scientific laws as our founding axioms. We inferred them from sense data, including the process of physically testing our ideas in the world.
It's such a cursed term: the thing some people mean by it does exist, like the ability to sense things around you, communicate on a human level, and have an operant short-term memory of at least 5 minutes.
When I see someone use "consciousness", I assume they mean something grander akin to a soul: a special, immutable self that they feel they possess, which I think is just embellishing an illusion the mind creates for itself. Some people do mean it that way, but others don't.
It should be scrapped from our vocabulary since it just scrambles communication.
This is a fundamentally different situation, because it's very possible that consciousness is the only thing that we can or will ever experience directly. I agree that things like UFOs cannot be explained in that way, because those experiences are products of our sense-making, but consciousness is the act of sense-making itself which is hard to deny.
I don’t buy this at all. We have experiences. It’s the one brute fact we are sure of, but it’s enough to build from that a coherent materialist model of the world that is consistent and testable. Everything we think and know is built up from the foundation of our conscious experiences though.
I’m no solipsist, I’m a thoroughgoing materialist, but if we don’t accept the evidential nature of our conscious experiences as being real phenomena, we have nothing.
You could just reference the philosophical literature. Qualia is the technical term for sensations of color, sound, smell, taste, tactile sensations and any other bodily sensation. It's very strange for someone to talk about consciousness not existing. You don't experience color, sound or pain? What about inner dialog, imagination or dreams?
The soul has nothing to do with this in the modern philosophical debate.
Tend to agree. Consciousness is a social fact, evidenced by its plurality of grounding, single cultural origin, and memetic propagation. The idea itself is barely 500 years old. If it is a fundamental aspect of the human experience, why wouldn't it have origins stretching back to prehistory?
We have cultural artefacts of people reporting first person experiences going back as long as we have writing, and longer from oral traditions. These are in the form of narratives, poetry, songs, etc. What is the story of Narcissus about if not the immediate visceral stimulation of visual experience?
We can also infer it. I find it hard to see why cultures would produce representative art if they had no experience of seeing it. In a world without first person experience it’s hard for me to imagine what function representational art would have.
At the same time, we have philologists who agree that Descartes was the first to use "conscientia" in a way that doesn't match the historical use of conscientia and instead matches our use of the word "consciousness", and that John Locke was the first to use the word "consciousness" in the same way we use it today.
Additionally we can point to Jaynes' The Origin of Consciousness in the Breakdown of the Bicameral Mind as a theory that doesn't assume a historical consciousness (or at least not in the same way).
So it seems like the argument is that consciousness didn’t exist because we didn’t have a word for it and the guy wrote a book and built a career with that as a premise.
Honestly, can I even be bothered? Oh, alright then, just for fun.
Yes I know I simplify above, but honestly not by all that much. There’s a history of the development of styles of literature. Sure. But that doesn’t mean the world, or people, changed because ways of describing or creating narratives about them changed.
At the root of this seems to be the idea that we can’t have a thing such as consciousness without the idea of the thing, and the other examples given in evidence are baseball and money. This is flat out wrong.
Nobody sits down with a blank piece of paper and invents a new ball game from scratch. All the ball games we have were developed by playing with balls. You’ve probably seen this happen yourself: a bunch of kids get together and start messing with a ball, they appropriate objects in their environment into the game, such as sticks and posts, and they make up rules as they go. Over time things happen in the game they didn’t expect and they invent new rules to cover those circumstances. Eventually they have a great game they love, they give it a name, and then, having finished inventing the game, they write down the rules.
It’s the same with money. It started off as tokens representing goods, then they established the lowest value object as the basic rate of exchange, such as a chicken or a bag of grain, then they worked out exchange rates between different tokens, such as 5 chickens equals one sheep, then they marked the tokens with symbols, etc, etc. Eventually you get money. Nobody looked at a market and deduced "you know what, we need to invent money". The concept emerges from the practice.
My wife has aphantasia: she cannot form visual mental images. She also doesn’t subvocalise; she doesn’t hear an inner voice when she reads, and can’t imagine what that’s like. Yet she reports the same first person experience the rest of us have. So the idea that language creates consciousness seems to fail right there. We also have reports from people who didn’t learn language until late in childhood of first person conscious experience from before they learned language.
So I’m sorry, it’s bunk. There never was any actual evidence for it, the chain of reasoning inserts conclusions that don’t follow from its arguments, and there’s clear evidence that contradicts its assumptions.
There's also things that don't exist until people have words for them, where the very act of speaking it brings about its existence. It seems equally likely that consciousness wouldn't have been self-evident without someone being primed to perceive it as a category.
In my view, the early emergence of consciousness says that it’s more complicated, not less. My reasoning is that it must mean the framework for consciousness is embedded in the basic building blocks of life. If that’s the case, then our understanding of how those basic building blocks work still has a long way to go. To make a physics analogy, it’s like we have a good understanding of mechanics but haven’t yet learned what electromagnetism is.
I have not read the article, but I would love to have a real debate with a materialist. All anti-material models will skew slightly mystical, so if you have zero appetite for that I won’t bother. But I’d like to try to change your mind if you’re interested.
I’ll start: Consciousness has nothing to do with any sort of processing. Processing is “dead” in the sense of a desktop calculator. Consciousness is the “living” part of your experience - that which experiences. Any process is the content of experience, not the subject.
I’ll stop here for now.
EDIT: currently this comment is at a 0 score. If you disagree with me, please leave a reply and I will try to get to you, but don't blindly downvote me cause I'd like to talk to more people, that's all. I'm not some religious zealot, you can check my post history etc.
You are simply defining consciousness and information processing in terms of alive and dead. If those are your axiomatic assumptions, you’ve already established the outcome before evaluating the issues.
Consciousness may be more than processing information, but it is definitely processing information. Perceptions go in, and decisions come out. Along the way we also form memories.
When we experience something, qualia if you like, it matters to us. That process stimulates us, interests us and often it prompts us to action. Those are all information processes. Stimulation is a signal. Interest and attention is metadata about the experience. Being prompted to action is making a decision. These are all intrinsically processes on and about information.
I agree with everything you're saying, but I think we just are referring to different things when we say consciousness. Take any qualia, and there are 2 sides to it: The object of it, and the subject of it. A red light is seen. In the seeing of it, there is the red light that is seen, and the seer who sees it.
What you are talking about is a black-box model of consciousness that only relies on external measurement. That is the correct scientific approach, but it comes nowhere close to consciousness. Consider that maybe you are doing the same thing you think I'm doing. You have defined experience purely in terms of inputs and outputs. The experience is the input; the outputs are perception, volition, will, action, etc. But whose are the perception, volition, will, etc.? There is something that exists that gives essence to all these.
If this doesn't ring true at all, we can stop here. Objective language can only get so far when talking about the subject, unfortunately. Thanks for taking the time to reply.
I don’t just believe it exists, I know it exists, just like you know it exists, because you are conscious, because you are. Are you aware of your own existence? How can you know anything if you don’t know the knower?
These words may not make any sense to you right now, and that’s the limitation of objective language: we cannot point at the subjective without metaphor. But if this does ring a bell, lmk and we can talk more.
The observation of our own consciousness, that is, its self-recursive nature while we’re in a conscious state, is what makes consciousness special. With other observations we receive a stimulus, we perceive it, and we either act on it, memorise it or forget it.
While we are doing so, we are aware that we had that perception; moreover, we are also aware of our awareness of the perception.
So the observable property of consciousness is awareness of ‘conscious awareness’ itself. This is not a paradox: we know how recursive functions work from computer science and mathematics, so the concept of the recursivity of a process is logically coherent.
This makes our own consciousness the most observed property in human experience.
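To show the comparison with recursive functions isn’t hand-waving, here is a minimal sketch of a process that carries a bounded model of its own activity. The Agent class and its reflect method are purely illustrative names of my own, not a claim about any actual mechanism in the brain.

```python
# Minimal sketch: a process that reports on its own reporting, to a bounded depth.
# Purely illustrative; not a model of the brain.

class Agent:
    def __init__(self, depth=2):
        self.depth = depth      # how many levels of "awareness of awareness" to model
        self.percepts = []

    def perceive(self, stimulus):
        self.percepts.append(stimulus)
        return self.reflect(self.depth)

    def reflect(self, level):
        # Level 0 is the bare perception; level n is a report about the level n-1 report.
        if level == 0:
            return f"perceived {self.percepts[-1]}"
        return f"aware that I ({self.reflect(level - 1)})"

agent = Agent(depth=2)
print(agent.perceive("red light"))
# -> aware that I (aware that I (perceived red light))
```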
There are some experiments that suggest that human actions are not the result of conscious thought. Maybe you've heard about Michael Gazzaniga's experiments on split brain subjects. The most radical interpretation of the results is that our conscious thought merely constructs a narrative about actions it has no direct control of.
In other words, our brains act like a bunch of interconnected neural networks and one of them, the language processing part, tries to make sense of what all the others are doing, serving the only purpose of communicating with others.
I wasn’t aware of that research, thanks, but I do agree most cognition is subconscious. When talking rapidly, the words seem to just spill out from nowhere. Ideas simply spring to mind. We have no direct awareness of the cognitive processes that generate them.
We can reason consciously step by step, but it’s an incredibly slow process. It seems likely that consciousness has something to do with attention management and broadcast communication across brain regions and functions.
Which is part of the reason why I think LLMs are a big deal. LLMs talk like the inner voice, and their failure modes are strikingly similar to the failure modes of "gut instinct" / first reactions and "stream of thoughts". I don't think this is a coincidence - I think we may have stumbled on the main trick that makes biological brains work. I don't mean language here, but rather the use of absurdly high-dimensional latent space for representing relatedness.
Now, if the above is anywhere close to the truth, then taking it together with the research you mention suggests that LLMs aren't simulating or parroting abstract thinking and understanding - they're actually doing it, the same way we do. They just lack an evaluator/censor layer of conscious experience.
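To illustrate what I mean by relatedness living in a high-dimensional latent space, here's a toy sketch. The vectors below are random stand-ins rather than real learned embeddings, so it only shows the geometric idea (related things point in similar directions), not how any particular LLM is built.

```python
# Toy sketch of "relatedness as proximity in a high-dimensional space".
# Random stand-in vectors; real models learn these directions from data.
import numpy as np

rng = np.random.default_rng(0)
dim = 4096                                   # an "absurdly high-dimensional" space
base = rng.normal(size=dim)                  # a shared underlying concept direction

cat   = base + 0.1 * rng.normal(size=dim)    # close to the base concept
tiger = base + 0.3 * rng.normal(size=dim)    # related, a bit further out
brick = rng.normal(size=dim)                 # an unrelated direction

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(cat, tiger))   # high: the shared direction dominates
print(cosine(cat, brick))   # near zero: unrelated directions are nearly orthogonal
```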
You've already fallen into several pits of failure, perhaps without realising it.
Firstly, "alive" and "dead" are very squishy words with no clear definition. You can mould them into whatever you please, but that's not a firm foundation for any kind of debate.
Second, you can't just state your position as fact. You say that "Consciousness has nothing to do with any sort of processing." That's not an argument! You're making an assumption, and treating it as an obvious thing we all agree on. We don't. In fact, it's the main point of disagreement. This is like a theologian assuming the Bible is true because God wrote it, but Atheists don't believe God exists.
"Any process is the content of experience, not the subject." -- this is too vague to respond to. It could mean anything.
Other than hand-wavey, vague things, you've only said one clear thing, which is to assume the outcome that you would like.
Materialists aren't just making assumptions. Tests have been done. Real, scientific experiments. We can (trivially!) alter consciousness through chemical, surgical, or electromagnetic means. Simple physical processes seem to underlie consciousness, and altering those processes alters consciousness.
Every teenager that has taken LSD or mushrooms can attest to this fact. Anyone that has taken a nasty hit to the head can experience this first hand.
I admit my language is squishy. Sometimes it helps point to what I'm trying to point to. And I know where you are coming from, as someone who was a materialist for much of my life. We are simply talking about different things.
You are talking about the contents of consciousness - the things that get altered by physical processes. We don't need tests to establish that. When the sun rises, the contents of my consciousness change from black to orange, thus we know that physical processes can change the contents of my awareness.
I am talking about consciousness itself. When the sun rises, the sky is the same, even if it now seems orange.
> When the sun rises, the sky is the same, even if it now seems orange.
I don't understand what point you're trying to make. When the sun rises, the colour of the sky changes in ways measurable with physical devices and noticeable to animal and human eyes. It is not "the same" in any sense. Different photons traverse it, its temperature changes, the ionisation of its gas molecules changes, etc...
Yeah, but "it" is the same sky. The presence or absence of particles does not alter the sky itself. More accurately, space is independent of its contents. Mind you, this is only a metaphor. Consciousness is not physical space. Just as physical space is the "space" of physical objects, consciousness is the "space" of perception.
You share almost nothing in common with who you were when you were 5 years old. Even if all the cells in your body have been replaced, and all your thoughts and your way of thinking itself has changed, you are still the same person. It's the same consciousness. That continuity is you. You are that consciousness. Do you see it?
That’s the Ship of Theseus argument. I think there’s a reasonable argument that in fact we don’t have continuity of consciousness. Yes, we are consistent, persistent beings up to a point, but only in the way that the Ship of Theseus continues to be the same ship.
I think of consciousness as being an activity. It’s something we sometimes do. We don’t do it in deep sleep, or when anaesthetised. But when we wake up is it the same consciousness? We remember being conscious previously, and that memory accords with what we experience ‘now’, but is that the same thing?
I think it depends how you look at it. I drove my car yesterday. If I get in it today and start the engine, is it the same ‘running of the engine’ as yesterday?
> I think there’s a reasonable argument that in fact we don’t have continuity of consciousness.
One argument that just came to me now: we already know we don't have continuity of vision - we go blind during saccades. We just don't notice, because the brain is happy to extrapolate from the last visual inputs, or otherwise completely make up what we see, going as far as screwing with our perception of time to fake continuity of experience.
The same could be true for consciousness itself. The visual system is proof that the brain can and does aggressively smooth out discontinuous experiences.
> I think it depends how you look at it. I drove my car yesterday. If I get in it today and start the engine, is it the same ‘running of the engine’ as yesterday?
Exactly. It's a different "running of the engine" for the purposes of your typical conversation and thinking. It's a different "running" in terms of battery charge maintenance. It's the same "running" for general car maintenance. So, different/different/same. If you're on a trip, your engine dies, and you immediately restart it, it's same/different/same. If you drive your car after having it sit a month in a garage, it's different/different/different.
When we sleep or get knocked out, we are aware that we've been unconscious for a period of time. But there could be other cases where we effectively lose consciousness for shorter periods of time - seconds perhaps - and the brain produces illusion of continuity, in the same way we get blind during saccades and the brain masks it out from our awareness by synthesizing data and screwing with the clock.
> When we sleep or get knocked out, we are aware that we've been unconscious for a period of time.
Not always, at least not in the ways that matter.
When I was a teenager, there was one occasion when I went to bed at night, pulled the bedsheets over my head, and with no subjective pause or discernible change in my internal experience of my mental state, everything became bright, I pulled the bedsheets away from my face, and it was morning.
Take one more step back. When you are in deep sleep, your mind is absent, and so are all sensations, but what is it that registers these breaks-in-mind-continuity? You are that host, that reality, that being that never sleeps.
In the sense that you are asking, I can't answer this question, because I don't know. Perhaps ants and up have it? idk, there is no way to know. There is no way to even know if other people definitely HAVE consciousness; I take it on faith. I can only KNOW for sure that I have consciousness. How do I know that? Because I am conscious of it. Is this circular? Yes. In fact, this is the ONLY absolute reality that I can positively assert, without reference to any other fact or axiom. The fact that I can think, or that I have a body, or that it's day outside, all of these first need me to be conscious. So, yes, it's circular, and it's true, and you know it's true that you are conscious. And you don't need to process any information to know that you are conscious. It's prior to all that.
> Consciousness is the “living” part of your experience - that which experiences.
That's circular, what is experience, what is "that which"? I propose a simple definition - consciousness is what informs actions, and optimises actions to benefit the agent.
That's a concrete definition, without subjective elements.
It's all in there. Consciousness decides actions, it's embedded in every action. Actions change the future timeline, there is consequence in actions, and they provide feedback. It feels like something because it matters how you react. You could die if you don't react correctly, for example.
I have some appetite for it. Either consciousness is mere computation or it is not, and either branch should be explored. I think assuming that either branch is correct beyond a reasonable doubt is wrong.
Is there any material (!) you can link me to to explore further?
Look, there is more material about this stuff than perhaps any other subject explored by man, but it frequently comes in ecclesiastic garb, and that may or may not be to your taste. I can link both spiritual and secular starting points. The spiritual ones are far more delicious. What's your stand on religion in general? I'll link sources appropriately.
OK, but when a materialist says "it's just computation", that's not a theory either. It may be the first step of a theory, but it's only one step. Where's the rest of the theory?
Nobody has a real theory of consciousness, still less an experimentally-verifiable theory. So it seems a bit unfair to exclude anti- (or at least non-) materialists for not having a theory.
Now, you could argue that materialists are at least on the road to a theory, and the anti-materialists are not. On the other hand, if the anti-materialists are right, then the materialists are not on a road to any correct theory, ever. (Being further advanced down the wrong road is not a virtue, except to the degree that you learn more quickly that you need to turn around...)
> it seems a bit unfair to exclude anti- (or at least non-) materialists for not having a theory.
I do not exclude them on the basis of not having (what you consider to be) a theory. They have a theory: consciousness has a non-physical dependency. The materialists theorize that consciousness is merely physical.
The issue I have is that non-materialists seem to be bad at conveying this theory. Just claim belief in the non-physical upfront. The chess vs weather comparison is a terrible analogy. They seem to lack the courage of their conviction and in turn dress up a simple concept.
All of them are proven computational? Mind laying out your evidence for that statement? Because it sounds like what you have is a presupposition, not evidence.
I note that, in your reply to GMoromisato, you asked for evidence. I ask you to meet the standard you expect of others.
I wouldn’t say they are proven computational, but I would say they can be explained in terms of processes on information. Sense data, body awareness, perceptions, noticing things, evaluating situations and options all seem to me to relate to information and processes on it. We use these experiences to form memories and make decisions. Both these are processes on information. Those are all things that consciousness actually does. Is there any function of consciousness we’re aware of that doesn’t relate to information?
That leaves the experience itself. It’s an experience of informational content, but is there more going on than transformations of information? What could that be? Maybe a physical substance, or a physical structure, in which case we should be able to physically synthesise consciousness stuff, or assemble a consciousness structure. What else is there?
Re. your second paragraph: Like others in this thread, you're assuming the answer. Or rather, you're assuming the presupposition. And that's fine, if you want to do that. But be aware that you're doing it, and that it's unproven. (And maybe be aware how much you're doing it.)
> Is there any function of consciousness we’re aware of that doesn’t relate to information?
Maybe awareness of your own emotional state? (Though I'm sure that there are views where that also is information processing, and nothing more.)
Pain (and related feelings) cannot be definitively reduced to computation. I’m open (and even predisposed) to believing it is just computation, but it’s not intuitively obvious.
We don’t even know how to test whether an arbitrary system feels pain.
Whatever else is going on inside our brains, there's definitely information processing happening and we also know that we have a first person experience of it. Our perceptions are informational, they are experiences of data. Colour, feelings, pain, desires. These are all informational and we use them to make decisions. So whatever consciousness is, what it does is process information.
If consciousness itself is not a process on information, then what is it? Another possibility is that it is a substance. Chess is a set of relationships between symbols, but the wetness of a rainstorm is physical water. So is consciousness an actual substance? Where does it go when we don't have it? Is it a chemical that forms at certain times, and is re-combined into another chemical when we lose consciousness? In which case we could synthesise a beaker full of consciousness fluid, or whatever it is.
Dualists say it's a non-material 'substance', but I don't think that's a coherent concept. It must interact with matter, or it couldn't receive sensations and cause effects when we make decisions. But how can it interact with matter and be non-material? If it does interact with matter, then it can be detected and manipulated physically. So why can't we find it? We've been looking at brain chemistry, signals and processes for a while and not found a hint of such a thing. Any other ideas?
I meant that pain cannot currently be reduced to computation. I’m open to believing it’s just computation, but it’s not obvious that it is, unless you assume that everything the brain does is. But then you’re the one assuming.
Like he argues that there's a big difference between software running on hardware and our brains, in that the neurons are dynamic, bathed in chemicals etc while software doesn't interact with the hardware in any substantial way (ie doesn't physically change it).
This is true, but to me it seems much more like optimizations rather than a truly fundamental difference. Surely you could model a neuron in software with non-trivial internal state and which also responds differently based on "ambient parameters" (ala bathed in chemicals).
It would be much harder to train, at least using the methods we do now, but I fail to see a fundamental difference.
>Surely you could model a neuron in software with non-trivial internal state and which also responds differently based on "ambient parameters" (ala bathed in chemicals).
Oh, you could, and it's been done. But it's much harder to train, and seemingly for no real performance benefit other than being "like the brain". Backpropagation is just really efficient.
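For what it's worth, here is roughly what such a modulated software neuron could look like. The update rule and the "ambient" parameters are invented for illustration; treat it as a sketch of persistent state plus chemical-style modulation, not a biophysically faithful model.

```python
# Sketch of a software neuron with persistent internal state and "ambient" modulation.
# The dynamics below are made up for illustration; not a biophysical model.

class ModulatedNeuron:
    def __init__(self):
        self.potential = 0.0      # internal state that persists between inputs
        self.threshold = 1.0

    def step(self, inputs, ambient):
        # 'ambient' stands in for diffuse chemical context (hypothetical
        # "dopamine"/"adrenaline" levels) that scales gain and shifts the threshold.
        gain = 1.0 + ambient.get("dopamine", 0.0)
        self.threshold = 1.0 - 0.5 * ambient.get("adrenaline", 0.0)
        self.potential = 0.9 * self.potential + gain * sum(inputs)
        fired = self.potential > self.threshold
        if fired:
            self.potential = 0.0  # reset after firing
        return fired

n = ModulatedNeuron()
print(n.step([0.3, 0.4], {}))                    # below threshold: no spike
print(n.step([0.3, 0.4], {"adrenaline": 0.6}))   # same input, lower threshold plus
                                                 # accumulated state: it spikes
```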
Bahaha! I always suspected hackernews was all bots and this comment thread confirms it. Just because you don't have a subjective experience doesn't mean us real humans don't! my chatgpt powered pal!
Sorry I didn't know about /s.
I don't judge one way or another. Subject hallucinating the material world, or object hallucinating a subjective experience. I love you all.
His book referenced in the article is excellent. I listened to the audiobook.
I think you might actually be satisfied by his treatment of everything you just brought up in the book, but it's too difficult for me to attempt in an internet comment.
There are some interesting lines of discussion around near death experiences (NDEs) where the individual at hand supposedly did not have brain activity but was conscious.
I can't speak to the veracity of claims, but that and the surrounding area do have some people who seemed to be pretty woo-resistant go through this transformative experience, then spend time and energy trying to convince people of its existence. It is intriguing on that level alone and I've certainly thought some about it.
I know people have these experiences, the question is what do they mean. We know humans can hallucinate in various neurological states. It seems to me reasonable to suppose they are hallucinatory, unless we see evidence otherwise.
That is just a statement of what they hope to study and what they think they might find. It’s not an actual study and they have no evidence for the speculative goals.
Conversely I linked to a case of an actual EEG recording of a dying patient and what it shows.
I linked to the main homepage. My goal isn't to debate, but to foster conversation. Building a case in either direction is going to take more than a single patient study discussed on a healthline page or even a metastudy of studies with large cohorts (somehow). It's a hard topic. Hence the encouragement towards exploration. It's a very fun topic.
If you're interested in some further rabbitholing, that department has a very interesting and storied history if you have the interest and take your time to do a bit of homework on it...
Perceptions don’t go in. Perceptions are created by the nerves in the sensory organs and corresponding brain regions in response to stimuli. One can think of the brain as an information generator, making sense of the environmental noise that bombards the body. Seth claims that perception is a sort of predictive hallucination based on the kind of bodies we have. Moreover, we don’t perceive the world as it is. He’s an indirect or embodied realist.
In the brain there's a region surrounded by a layer of filter-neurons: they choose what signals to admit, using the rule of thumb that a repeating signal is worth attention. When they block all signals, the subject is said to be unconscious. When they tune in to a particular signal, the subject is doing something consciously. In this regard consciousnesses can be defined as simple attention.
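As a toy illustration of that rule of thumb (a repeating signal earns attention), a gate like the one below would do it. It's purely illustrative and not a claim about how any actual brain region or filter layer works.

```python
# Toy gate: admit a signal only once it has repeated within a recent window.
# Illustrates "repetition is worth attention"; not a neuroscience model.
from collections import Counter

def gate(signals, window=5, min_repeats=2):
    admitted, recent = [], []
    for s in signals:
        recent.append(s)
        recent = recent[-window:]              # keep only the recent past
        if Counter(recent)[s] >= min_repeats:  # repeated recently: "attend" to it
            admitted.append(s)
    return admitted

print(gate(["a", "b", "a", "c", "a", "d"]))    # -> ['a', 'a']
```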
No, because that doesn’t get you to color, sound, pain, etc since the signal is just electrical. Something more has to turn attention into experience. Also because simple attention can happen without us being consciously aware, such as day dreaming while driving down a highway.
I would state it as: The content of your conscious experience is the result of information processing in the brain. Information processing alone does not create consciousness, but the output of this information processing defines what we experience.
An analogy - The information processing in your computer defines what is displayed on your monitor, but the information processing alone does not physically power on your monitor and control the individual pixels.
I don't think it is accurate to think of it as the "output" of information processing, rather I think it is the actual act of diverse, yet integrated information processing.
Simply put, different parts of the brain do different things; there is some level of isolation of function, e.g. Broca's area for speech, the visual cortex for sight, etc.
Conscious experience clearly relies on several disparate parts of the brain (e.g. sight, sound, touch and smell can form a cohesive experience).
The integration of these parts is the act of consciousness, or put in another way, when the brain is highly integrated in a waking state of perception, it is conscious.
That's what the "P Zombie" thought experiment misses, imo. We are not some Rube Goldberg machine of pulleys and levers, as the analogy would lead us to believe. We are also not a series of pulleys and levers with some qualia painted on top.
We are, instead, the global process that arises when a biological agent's neural system is highly integrated with the goal of providing the organism a sense of self and direction.
It sounds pretty grandiose, but it is still fundamentally empty, much like a system of pulleys and levers. Where do your thoughts come from? What do you actually do? Not much.
> I don't think it is accurate to think of it as the "output" of information processing
Don't put too much emphasis on my use of the word "output". I simply argue that information processing alone is not what creates/activates/whatever consciousness. I recall an episode of Star Trek in which the Enterprise's computer becomes conscious merely because it is a sufficiently complex system, but that is akin to magic to me. I reject the idea that a computer will magically become conscious simply because it's running some sufficiently complex AI model.
> The integration of these parts is the act of consciousness, or put in another way, when the brain is highly integrated in a waking state of perception, it is conscious.
This might explain the role of consciousness and why evolution favored it, but it simply doesn't explain fundamentally how consciousness emerges from neural activity in the brain.
Consciousness does not emerge; that's a trick your brain plays on you.
Really your conscious experience is quite empty/vacuous.
Let's take vision, for example. (this is based on reading a handful of papers without a background in neurophysiology, so mountain of salt etc)
Your vision isn't some continuous, perfect experience. Meaning, if you look at an old smartphone, you can really see the pixels, and you can see the jagged breakdown of the UI if you kinda squint and look close. This reveals the magic trick that it's really just a grid of pixels.
Same with your conscious experience!
Your vision, if you pay attention to the sensory content rather than what it represents, is grainy. It has constituent "pixels".
What are those? This is where I am guessing, because sure we have retinas, which would be an obvious choice, but there are also retinotopic maps in the back of your head in the visual cortex (https://elifesciences.org/articles/40224).
These create essentially a visual homunculus of your first person POV. It is entirely made up of cells, that get more dense as they get closer to mapping the fovea.
Back to my earlier point about vision being grainy. The point there is that vision has artifacts of its medium. It isn't some perfect magical substance that gives a continuous, smooth qualitative experience; rather it is grainy, and likely a physical configuration of neurons.
What neurons?
Well, for vision, I think a conscious person's first person POV is identical to the retinotopic map.
When you look around, those neurons are changing.
If someone asked where, in physical space, my first person POV, this made-up "qualia", really is, I think you would have to point to this retinotopic map in the back of my head. There is nothing additional needed. Neurons can seem to be a 3-D video-game-like experience if configured in this convoluted way in a biological organism's nervous system.
TLDR: Those grainy movements of your vision? Those are neurons dancing. Your first person, visual perspective is entirely physical.
EDIT:
If they further inquired: well, okay, those neurons are identical to the first person POV that this biological organism has. But why is it appearing to them? Why aren't they a P-Zombie?
Well this neural homunculus (retinotopic map) is in the visual cortex which is connected to the parietal cortex (involved in spatial awareness and attention) and the temporal cortex (involved in object recognition and memory).
So whatever is happening, is a coordination of this visual homunculus with all the other processes corresponding to our other features/senses, with the goal of having a cohesive experience for an agent, you.
So these neurons are part of a larger global process, and thus you can sit and look at this screen, but also hear yourself reading this sentence in your head as your vocal cords make minute movements, and then stop and think of something. This kind of ties back to my first comment, but the point I'm making here is that if you phenomenologically investigate one of the most salient soups of qualia, vision, it becomes pretty clear that it is entirely physical. So the onus falls back onto you: what extra stuff is there?
I still don't see where color comes from in this global process you've described. Grainy or not, we still experience color. And color isn't something that rides in on photons, and then hops onto electrons which are sent to the visual cortex. Instead, the process of visual perception has to color in the world somehow. Mental paint is one way of putting it.
> Really your conscious experience is quite empty/vacuous.
I don't know about that. Maybe you need to elaborate more, but the fact that we are aware of our own consciousness and can practice introspection and speak of concepts like qualia suggests to me that consciousness is not completely empty/vacuous.
> Back to my earlier point about vision being grainy. The point there is that vision has artifacts of its medium. It isn't some perfect magical substance that gives a continuous, smooth qualitative experience; rather it is grainy, and likely a physical configuration of neurons.
I fully agree with this statement. I'm well aware that our perception of reality is highly distorted/skewed and sensory information is heavily processed in the brain before it enters our sensory consciousness.
> ... it becomes pretty clear that it is entirely physical.
It's very plausible that the organization of neurons in the brain relates to our conscious experience. However, there are so many problems with the "physical" interpretation of consciousness. The entire brain is "physical" and very active yet only parts are conscious and others subconscious.
A fairly recent study challenges the idea of split brain consciousness, suggesting there may be something separate allowing the two hemispheres of the brain to communicate despite being separated by a corpus callosotomy.
A very interesting case study: twins joined at the head, their physical brains strongly connected. They can control each other's limbs, "see through each other's eyes", and even know each other's thoughts. Yet despite that, each girl still has a mind of her own. Two separate consciousnesses; where is the physical barrier?
Saying consciousness is "entirely physical" is a rather meaningless statement when we don't even know what "physical" is. Nobody really knows what is the true objective nature of reality.
There was some interview where a smart dude basically said we became enamored with the scientific method and created these strict, quantitative laws/rules/measures. Now we are trying to squeeze reality into these derivations, which is backwards.
Are there fundamental phenomena that undergird consciousness that we are entirely unaware of? Yeah, probably.
> Whatever else brains are, we know they are information processing systems. Perceptions go in, and decisions come out. All of our conscious experience is of information. Feelings, sensations, emotions, decisions, they’re all information. What else is there?
Whatever else your brain is, I know that it is a processing system for what I know about it. What I know about your perceptions goes in, what I know about your decisions comes out. All of your conscious experience is of my information. Your feelings, your emotions, your sensations, your decisions, they're all things that I can know about and understand in relation to the things I can know about. What else is there?
So, there is no point in you being so outraged at the article you have read: your "actual" outrage, if there is such a thing, is in fact superfluous to my information that you appear to be outraged by it, which is functionally equivalent. And therefore we can dispense with the illusion that you actually exist.
After all, when I imagine a you, nothing actually flaps its mouth, so is your activity more like the weather, or more like a simulation of the weather?
That’s fine. Yes, you read my post, which consists of information, and formed a mental model of me and my opinions. That mental model is fundamentally informational.
You're conflating me and your model of me though. The model isn’t me, it doesn’t have thoughts or feelings because… and this is the crucial part… the model of me in your head doesn’t process information the way a conscious being does, whereas I do. Therefore I have experiences, whereas it does not.
You don't see where you missed it so I don't see why I should admit where I did.
I couldn't possibly have conflated you with my model of you; you don't exist and my model of you does. There is nothing beyond the model according to you, so it's all nicely tied with a bow, no?
Of course there is a reality beyond the model. Our sense data comes from somewhere, it’s a model of something. I’m not sure quite what you’re talking about.
I'm talking about the deep contradiction that you hold between “there is a reality beyond the model” (Anil Seth’s position, the mind is more like the weather) and your other stated position, that the mind is reducible to its information-processing role in generating your “decisions”—which is to say, if we take it to its proper conclusion, that consciousness describes a certain way things behave in the world: some specific subset of this behavior of “take in information and process it and make a decision” is conscious behavior, and that's all the mind is, a certain sort of behavior.
Now I have put it to you that in fact my “mental model” of you in my head has the exact same behavior as you, just with respect to other mental things in my head: and therefore we can dispense with the illusion that you really exist. Indeed if we are looking at behavior the way you want to, then Hamlet is conscious. That is, when someone plays Hamlet, we see Hamlet process his information which comes from an external world of medieval alt-universe Denmark, and we see him make his decisions that ultimately lead to his death: Hamlet is a valid black-box to describe with these information processing tools; he receives the information from his father's ghost that his uncle is a murderer (that information had to come from somewhere!) and he decides to collect more information under the guise of despairing madness and pursue revenge. Information comes in, leading to decisions, those decisions lead to more information, which leads to more decisions. And in the “Sorry I had to” comment all I am proposing is that you have the same level of reality which Hamlet has.
The historical response of behaviorism was at first to retreat to saying that you also had to describe what behavior would happen in other possible cases, so if I shoot Hamlet then the actor will likely break character, stuff like that... This misses the fact that properly speaking I am not part of Hamlet's external reality in medieval Denmark and someone from that reality could shoot Hamlet and he would react appropriately. But for various other reasons folks abandoned behaviorism for a more sophisticated form of all of this called functionalism, where we abandon the simple notion of your original comment that it's all about information processing and external decisions at the “macro” scale, and try to save the theory by dropping to the “micro” scale. If you are interested in more of these ideas as a beginner The Teaching Company (“Wondrium” now?) had a series on Philosophy of Mind, preview it at https://youtu.be/_6sU3BkS4-A and if you like it maybe purchase the course—I can't vouch for the whole thing having not listened to it, but I have watched several talks by the lecturer and he always has a fun, common sense, jokey-but-serious style.
Sensations are not just information. A number is information. There's something that it's like to experience a sensation. There's nothing that it's like to be a number.
Of course, there’s nothing it’s like to be data. Un-recalled memories are data, they only become experiences again when we recall them into our active minds.
I don’t think consciousness is data, I think it’s a highly sophisticated process on data. It’s an activity, it’s something we do. Anil Seth described this process very well in the interview. It’s the third and last quote from him in my top level comment. Also I described my full account of consciousness in a reply to tekni5.
More seriously, I know how much philosophers love their “what is it like to be …” phrasing but there are a host of embedded assumptions which are seldom even acknowledged, let alone examined or defended.
I have a perception. Light stimulates my optic nerve, that sends a signal to my brain, and I experience seeing a red apple. The signal is information. There’s nothing inherently red about it; if you looked in the nerve it wouldn’t turn red. It’s nerve impulse data being transmitted to my brain.
Let’s suppose seeing red is not a process on information; it’s something else. What? Some other kind of process? In order for my experience of seeing red to not be a process on information, that data would need to be transformed into something that is not data and participate in this other activity. What is that?
After whatever that is happens, I have seen red. I have a memory of it, and can make a decision based in that experience. So now we’re back to storing and processing information again. Whatever happened in between, it took information in and sent information out.
What is the "I" that is "experiencing" seeing an apple? This all seems to just succumb to dualism again. We're not seeing our seeing. We're seeing.
> Let’s suppose seeing red is not a process on information; it’s something else. What? Some other kind of process? In order for my experience of seeing red to not be a process on information, that data would need to be transformed into something that is not data and participate in this other activity. What is that?
If consciousness and the brain was only computational your argument would make sense, but many if not all neuroscientists would baulk at the claim that the brain is a computer or is fundamentally computational. It's far from deterministic and the "processing" is not as set in stone as a computer. There is no code or programming; there is no clear functional relationship like some neuro-lambda calculus. The "processing" of information (if that is how you want to describe neurological activity, most of which we barely understand) is not just more complex than computation --- it's weird and not straight forward at all. We can simplify it, and make some general guesses and theories, by adopting a simplified computational model, but it remains a model.
> If consciousness and the brain was only computational your argument would make sense, but many if not all neuroscientists would baulk at the claim that the brain is a computer or is fundamentally computational.
That's okay. They're allowed to be wrong. They're even allowed to be complete idiots.
The fact remains that the brain IS a computer, and we know this by the simple observation that it computes. The question is whether it is anything else.
You are though. The brain does a lot of hierarchical prediction with sense data. When new information comes in, it makes predictions and then adjusts the sense data likewise.
That's why when you shift your eyes quickly, you see blurred images pass by. In reality, you should be seeing complete black because the brain doesn't actually process visual information that shifts so quickly.
But your brain "knows" it should see... well, something. And so it fills in that blurred pass-through as compensation. Completely made up data. But not ungrounded data: data that seems like it should fit according to its prediction.
You are not seeing reality as is. You are seeing a nicely packaged, fluid version of reality (even fabricated at parts) that is being distorted without any input from the conscious self.
Computation is more than silicon chips, more than digital circuits, or the Von Neumann architecture. It’s not inhibited by the code/data distinction. It’s a general model of information processing encompassing any conceivable physical implementation.
Fundamentally information is a state description of the attributes, interactions and geometrical relationships in physical systems. Regions of magnetism on a tape, holes punched in a card, patterns of electrical charge in transistors, the distribution of electro chemicals in nerve cells and synapses. It’s all just information. All physical processes are processes on information. From the perspective of information theory, it doesn’t matter.
If his theory boils down to an acceptance of qualia as evidence of non-physical (extra-physical, super-natural, dualist, whatever) basis for consciousness, then a) I am open to it but b) he has done a terrible job conveying this idea clearly.
He seems to be saying that it is a physical process, but that it is not computational. My critique was that he does so while describing it entirely in terms of processing information.
It's like neither of those things. Consciousness isn't even a real thing. It's an awkwardly ill-defined idea we made up that has no basis in science, like gender, or reality. It's like asking if Mickey Mouse is more like the sea or a watermelon.
Consciousness is simply the awareness of being a person, and of having experiences. Do you not have that? I mean you don’t always have it, sometimes you are asleep or maybe sedated, but right now (whenever that is for you) as you are reading this?
Actually it's not, the definition varies and is difficult to interpret. But also, how do you know you're asleep or awake? How do you know you're even a real person and not a simulation? Can a simulation be conscious? Etc.
> There’s this assumption in and around the AI community that, as AI gets smarter, at some point, maybe the point of general AI—the point at which AI becomes as intelligent as a human being—suddenly the lights come on for the system, and it’s aware. It doesn’t just do things, it feels things.... There may be some forms of intelligence that require consciousness in humans and other animals. But fundamentally they’re different things. This assumption that consciousness just comes along for the ride is mistaken.
Yes.
And yet the pseudo-scientific game theory analysis of Yudkowsky etc never even contemplates this (as well as so many other possibilities outside their toy analysis). It remains astonishing to me that anyone takes them seriously.