[FRIAM] Is consciousness measurable?

glen gepropella at gmail.com
Thu Oct 20 11:12:17 EDT 2022


Fantastic layout! I have a few quibbles, of course. 8^D

(2) and (3) seem like (informal) consequences of *simulation*. Most of the accounts of subjectivity I've found ... [cough] ... resonant are those that include the animal *modeling* its world and running an error-correcting process on that model against the world. E.g. predictive processing, Rosen's "anticipation", Solms' Markov blanket, etc. (2) both risks and perhaps explains premature registration (where we prematurely parse the world to fit our model of it). But (!) I think it's too aggressive to assert that all animals do that. Pond scum, for example?
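As a toy sketch of that model-plus-error-correction loop (every name and number here is invented for illustration, not taken from any of those frameworks):

```python
# Minimal sketch of an error-correcting world model.
# The animal's model predicts what the world will do, and prediction
# error is used to nudge the model toward the world's regularity.

def predict(model, stimulus):
    """The agent's model guesses the observation for a stimulus."""
    return model["bias"] + model["gain"] * stimulus

def update(model, stimulus, observation, rate=0.1):
    """Shrink prediction error against the world (LMS-style update)."""
    error = observation - predict(model, stimulus)
    model["bias"] += rate * error
    model["gain"] += rate * error * stimulus
    return error

model = {"bias": 0.0, "gain": 0.0}
world = lambda s: 2.0 * s + 1.0  # the "real" regularity being tracked

for step in range(2000):
    s = (step % 10) / 10.0
    update(model, s, world(s))

# The model converges toward the world: gain near 2.0, bias near 1.0.
print(round(model["gain"], 2), round(model["bias"], 2))
```

The point of the sketch is only that "subjectivity" on this account lives in the loop: the model is never handed the world's rule, it just keeps correcting its parse of the world against what the world does next.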

(3) is particularly problematic. I like the idea of building up to naming from *pointing* or referral, an idea I first learned about from Luc Steels. But that implies that pointing is more primitive than naming. So is it naming we're after? Or pointing? Pointing allows for a less rigid symbol grounding, semiotic triads, and Peirce's 10-fold typology of signs.

I agree that LLMs are not grounded in the same way animals are grounded. But they *are* grounded by association, a postmodernist grounding. *If* there's a stable/essential pattern within any Big Data (like all the books, scientific papers, images, and open-source code), we're developing algorithms to find that pattern. And those essential patterns become the ground ... call it "normative grounding"? The (as yet questionable) hypothesis of some postmodernists was that this is the only grounding we ever really have/had. LLMs are putting the money where the postmodernist mouth is.
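What "grounding by association" amounts to, in miniature: a word's meaning is nothing but its pattern of co-occurrence in the corpus. The corpus and similarity measure below are invented for the example, and this is of course a cartoon of what an LLM actually does:

```python
from collections import Counter
from math import sqrt

corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the cat ate the fish",
    "the dog ate the bone",
    "stocks fell on wall street",
    "markets fell as stocks slid",
]

def vector(word):
    """Represent a word purely by the words it co-occurs with."""
    v = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        if word in tokens:
            v.update(t for t in tokens if t != word)
    return v

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a)
    na = sqrt(sum(x * x for x in a.values()))
    nb = sqrt(sum(x * x for x in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "cat" ends up closer to "dog" than to "stocks" without either word
# ever touching a cat, a dog, or a stock -- grounding by association.
print(cosine(vector("cat"), vector("dog")))
print(cosine(vector("cat"), vector("stocks")))
```

Nothing in that vector ever points outside the corpus; if there *is* a stable pattern in the Big Data, the pattern itself is the ground.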



On 10/19/22 11:36, Jochen Fromm wrote:
> Emotions are related to a body. I don't think they are absolutely necessary, but I think some kind of body is indeed necessary to develop a form of consciousness. A body which can move around in two different but interconnected worlds, for instance the physical (or a virtual) world and the world of language.
> 
> The recognition of one's own name can then act like a seed or spark for the conscious perception of one's own body. This requires a name and a body and some kind of association between them. In chemistry, a seed crystal is a small piece of crystal material from which a large crystal is grown in a lab. I think the body acts as a seed crystal for the perception of a self in consciousness, and it also solves the symbol grounding problem.
> 
> Therefore a checklist would be
> 1. does it have a body?
> 2. does it - like all animals - perceive the world as a set of interacting bodies?
> 3. does it have a name?
> 4. is it able to understand objects and actions by connecting them to their names (and accounts of events by associating them with a story)?
> 
> Then it should be able to make a connection between its own body and its own name. Large language models are fascinating, but they are not grounded in reality (the symbol grounding problem) and they don't have a body.
> 
> Small kids and animals are not fully conscious because they don't understand language. To measure the degree of consciousness, one could ask
> 1. Can it recognize its own name (even pets can do this to a certain degree)?
> 2. Can it understand names in general?
> 3. Can it read, write, and speak its own name?
> 4. Can it understand its own personality and its own personal history?
> 
> -J.

-- 
ꙮ Mɥǝu ǝlǝdɥɐuʇs ɟᴉƃɥʇ' ʇɥǝ ƃɹɐss snɟɟǝɹs˙ ꙮ


