I’ve long been fond of panpsychism, but I think it’s less a hypothesis to be “proven” and more just a different way of framing the questions behind what consciousness is and how it can be defined. Under panpsychism consciousness isn’t a binary property that some things have and other things don’t; it’s a continuum from zero to one (and if you count humans as “1” on the consciousness scale it also makes sense to consider values above that - there’s no reason to assume that humans are the “most conscious possible” state of being).
So when you’re reading about panpsychism and it says something like “individual electrons are conscious”, bear in mind that they’re proposing considering electrons to be, like, 10^-10 “consciousness units” worth of conscious. It’s not like they’re actually aware of themselves in some meaningful way like humans are. That’s a common “giggle factor” problem for panpsychism. And it’s also not saying that any arbitrary larger-scale structure is “more conscious” than humans; the way that the components of a large-scale structure interact is super important. A rock is not as “conscious” as a human brain even if they have the same number of particles interacting within them.
I think the real issue is with the fact that consciousness is not particularly well defined. Something can be more or less conscious than something else but what precisely does that mean? Has there ever been a means of measuring or detecting consciousness in anything?
That’s my biggest frustration with this debate. At this point I’m convinced that consciousness is only a construct. Not a tangible entity, process, or concept, just a useful way to describe behavior. If someone describes the universe as conscious that’s neat and all, but it doesn’t really mean anything yet. And another person could say it isn’t and neither would be right or wrong, because what the hell is consciousness? Like you said, how are we supposed to measure this when we don’t know what it is? Many people think we haven’t discovered what consciousness is; I believe we haven’t decided what it is.
Depends on who you ask, I think. Emergentism makes more sense to me because if you take consciousness as humans experience it, make it derivative of material structure (neurological activity), and treat the appearance of some kind of uniformity as a synthesis of different parts of that neurological system, then the only way consciousness can exist in that framing is in organisms that possess a nervous system.
This does inevitably lead to the problem of where to draw the line on the complexity necessary to qualify as consciousness, and I’m not gonna pretend like I have the answer to that, but at least it becomes more of a scientific question rather than a purely philosophical one, I think.
I prefer to consider it in terms of “dimensions of awareness”. Humans have evolved hundreds, possibly thousands, of interlinked dimensions of awareness for just about everything from colors to body language. Simple automated systems with sensors have their own dimensions of awareness, from vision to heat to pressure. Whatever it is that they track and respond to. AI, however, is finally hitting the point where these dimensions of awareness are being stacked and linked together (GPT5 can see, hear, read, and respond) and it’s only a matter of time and agency (aka executive functioning) before we see true AI consciousness.
I had a similar thought recently, actually, that consciousness is more than the brain. Is GPT-4 conscious? Eh, I don’t believe anyone knows what that means, but is it comparable to human consciousness? I don’t think so, but how could it be? It senses words, so it knows words, so it speaks words.
I hear it said all the time that LLMs don’t really understand what they’re talking about, but they seem to understand as well as they can given the dimensions they are aware of, to use your terminology. I mean, how can I describe anything myself without sensory details? It sounds like. It looks like. It feels like. It behaves like. We got all that knowledge by sensing, then inferring. There’s no special sauce that creates understanding from nothing.
I don’t have any links, but imo the experiences of people who were born without a sense, and especially those who were later able to gain it back, strongly support this idea that something can only be conceptualized in the terms that it was sensed in.
Well, hypothetically, if someone defined the “consciousness” of every particle mathematically, and then figured out the laws that would allow us to compute (or at least approximate) the “consciousness” of a composite system (such as a brain), then we’d have a genuine scientific theory.
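To make that hypothetical concrete, here’s a toy sketch. Everything in it is invented for illustration (the base per-unit value, the interaction term): it’s not a real theory, just the shape such a theory might take, where a composite system’s value depends on how richly its parts interact rather than on particle count alone.

```python
# Purely illustrative toy model, NOT a real theory of consciousness.
# Assumption: each unit contributes a tiny base value (echoing the
# 10^-10 "consciousness units" idea upthread), and the composite total
# is scaled by how densely the units interact, so a rock and a brain
# with the same number of particles come out very differently.

def composite_consciousness(n_units, base=1e-10, interaction_density=0.0):
    """Hypothetical measure for a composite system.

    interaction_density in [0, 1]: 0 means the parts barely interact
    (a rock), 1 means a maximally integrated, brain-like system.
    """
    return n_units * base * (1 + interaction_density * n_units)

rock = composite_consciousness(10**6)                                # inert aggregate
brainish = composite_consciousness(10**6, interaction_density=0.9)   # richly integrated
```

Under this (made-up) measure, the brain-like system scores vastly higher than the rock despite identical particle counts, which is the point the rock-vs-brain comparison above was making.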