On Monday, August 26, 2019, at 5:25 PM, WriterOfMinds wrote:
> "What it feels like to think" or "the sum of all a being's qualia" can be 
> called phenomenal consciousness. I don't think this type of consciousness is 
> either necessary or sufficient for AGI. If you have an explicit goal of 
> creating an Artificial Phenomenal Consciousness ... well, good luck. 
> Phenomenal consciousness is inherently first-person, and measuring or 
> detecting it in anyone but yourself is seemingly impossible. Nothing about an 
> AGI's structure or behavior will tell you what its first-person experiences 
> *feel* like, or if it feels anything at all.


Qualia = compressed samples of impressions, symbolized for communication. From 
the perspective of other agents, attempting to occupy a representation of 
another agent's phenomenal consciousness would be akin to computing its 
K-complexity, with some instances being computable and others only estimable.
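
To make the "estimable" half concrete, here is a rough Python sketch (mine, 
not anything from the post; the agent sample strings are made-up stand-ins). 
True K-complexity is uncomputable in general, so a real compressor only gives 
an upper-bound estimate, and the normalized compression distance built from 
such estimates is one conventional way to gauge how much of another agent's 
compressed samples your own regularities can account for:

import zlib

def c(data: bytes) -> int:
    # Compressed length as a stand-in for an upper bound on K-complexity.
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: near 0 when y is largely predictable
    # from x's regularities, near 1 when the two share little structure.
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical "sample streams" from three agents, purely for illustration.
agent_a = b"warm red sunset over water " * 20
agent_b = b"warm red sunset over sand " * 20
agent_c = bytes(range(256)) * 3

print(ncd(agent_a, agent_b))  # lower: shared regularities compress together
print(ncd(agent_a, agent_c))  # higher: little shared structure to exploit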

Why does this help AGI? This universe has inherent 
separateness/distributedness. It's the same reason there is no single 
general-purpose compression algorithm.
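
A tiny sketch of that last point (again just my illustration): by the 
pigeonhole principle, any single compressor can shorten structured inputs 
only by leaving other inputs at least as long as they started.

import os
import zlib

structured = b"abab" * 256        # regular, highly compressible
random_bytes = os.urandom(1024)   # typical incompressible input

for name, data in (("structured", structured), ("random", random_bytes)):
    out = zlib.compress(data, 9)
    print(f"{name}: {len(data)} -> {len(out)} bytes")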

John



------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T41ac13a64c3d48db-M09d11c426cbd235dd276652c
Delivery options: https://agi.topicbox.com/groups/agi/subscription
