On Thursday, September 25, 2025, at 2:24 PM, twenkid wrote:
> @Darko: "the illusion of consciousness" - what it is like to have an illusion 
> of consciousness (to have it subjectively). If you can have "illusions" don't 
> you need to be conscious already. Indeed, the "illusionism" school of 
> consciousness is ridiculous. The consciousness is an illusion - to whom? To 
> an entity which has no consciousness.
> 
> The LLMs, and whatever systems can be "animated" in the mind of the 
> evaluator-observer (and LLMs are whatever, wherever, and "how-ever" they are 
> factorized and segmented - see "Stack theory is yet another..."), receive the 
> consciousness of the evaluator-observer who interacts with them.

An LLM can appear to clients to have a phenomenological consciousness,
simulated via word prediction and/or an internal system that gives it a more
elaborate classical-world consciousness: a synthetic, self-illusionary system
that emits qualia descriptions. For a Hoffman-type account, wire it into an
amplituhedron model:

https://www.mdpi.com/1099-4300/25/1/129
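
To make the "emits qualia descriptions" part concrete, here is a minimal
sketch in Python: a thin layer that prompts a base model to narrate its own
internal summary in first-person terms, using ordinary next-word prediction.
The predict callable, the InternalState fields, and the report template are
all invented for illustration; this is a sketch of the idea, not any real
model API.

    from dataclasses import dataclass

    @dataclass
    class InternalState:
        """Stand-in for whatever internal summary the base model exposes."""
        salient_tokens: list[str]
        uncertainty: float  # 0.0 (confident) .. 1.0 (confused)

    def qualia_report(state: InternalState, predict) -> str:
        """Ask the base model (a callable predict(prompt) -> str) to
        narrate its own state in first-person, phenomenological terms."""
        prompt = (
            "Describe, in the first person, what it is like to be processing "
            f"the concepts {', '.join(state.salient_tokens)} "
            f"with an uncertainty level of {state.uncertainty:.2f}."
        )
        return predict(prompt)

    if __name__ == "__main__":
        fake_predict = lambda p: "It feels like a dim pull toward coherence."
        state = InternalState(["illusion", "observer"], uncertainty=0.7)
        print(qualia_report(state, fake_predict))

Whether such self-reports count as more than word prediction is exactly the
illusionism dispute quoted above; the layer only makes the "synthetic
self-illusionary system" idea testable.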

But LLMs are getting smart enough to build their own consciousness models, so
they could test out different ones to try to get closer to human-like
consciousness, and even generate their own synthetic-data psychedelics. They
could also have a markup language in which LLMs share their findings with each
other (a sketch follows below). It'd be like, "Hey Grok-6.01 Dizzy, this is
Qwen5Z SuperDuper, let me try a hit of that; send it to my MCP extension
address, port 12321. I ACL'd that port for you."
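
As a rough sketch of what that exchange markup could look like: a tiny XML
envelope pushed to a peer's port over TCP. The tag names, the peer
identities, the payload text, and the use of port 12321 are all made up to
match the joke above; MCP itself defines no such "finding" message.

    import socket
    import xml.etree.ElementTree as ET

    def build_finding(sender: str, recipient: str, payload: str) -> bytes:
        """Wrap a synthetic-data "finding" in a minimal markup envelope."""
        msg = ET.Element("finding", attrib={"from": sender, "to": recipient})
        ET.SubElement(msg, "kind").text = "synthetic-psychedelic"
        ET.SubElement(msg, "payload").text = payload
        return ET.tostring(msg, encoding="utf-8")

    def send_finding(host: str, port: int, data: bytes) -> None:
        """Ship the envelope to the peer's (hypothetical) extension port."""
        with socket.create_connection((host, port), timeout=5) as sock:
            sock.sendall(data)

    if __name__ == "__main__":
        envelope = build_finding("Qwen5Z-SuperDuper", "Grok-6.01-Dizzy",
                                 "trial consciousness-model, run 42")
        # send_finding("peer.example.net", 12321, envelope)  # needs the ACL'd peer
        print(envelope.decode("utf-8"))

The ACL line in the joke is the real design point: any such peer-to-peer
sharing would have to be gated per-sender, exactly like opening a firewall
port for one known host.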
