Cool, but ... I maintain that none of this is about consciousness.  Knowledge 
representation, abstraction and compression via symbolism, communication of 
structure, common protocols and standards for describing physical phenomena ... 
these are all intelligence tasks. Specifically, the communication-related stuff 
would be part of social and linguistic intelligence. If you want some labels 
that convey the "inter-agent" aspect without confusing everyone, I think those 
would do.

The thing you call "occupying representation" ... a conscious agent can do it, 
but an unconscious agent can too. The ability to decompress information and 
construct models from symbolic communication does not require or imply that the 
agent has its own qualia or first-person experiences.

And I do agree that, for the practical/utilitarian purpose of Getting Things 
Done, this is useful and is all you need for cooperative agents. Like I said 
when I first posted on this thread, phenomenal consciousness is neither 
necessary nor sufficient for an intelligent system.

I think your comment about "Gloobledeglock" actually illustrates my point. 
Communication breaks down here because you haven't tied Gloobledeglock to a 
causative external event. If you said something like, "I feel Gloobledeglock 
whenever I get rained on," then I could surmise (with no guarantee of 
correctness) that you feel whatever it is I feel when I get rained on. 
Observable events, in the world external to both our minds, are things we can 
hold in common and use as the basis of communication protocols. We can't hold 
qualia in common, or transfer them (even partially).

> AGI researchers are so occluded by first-person.

Umm, since when? I certainly don't think an AGI system has to be an isolated 
singleton that only deals with first-person information. I think the kerfuffle 
in this thread is about you appearing to claim that Universal Communication 
Protocols and the ability to "occupy representation" are something they are 
not. We're not trying to give an exaggerated importance to phenomenal 
consciousness ... quite the opposite, in fact. We're just saying that the 
systems you describe don't have it.

Signing off now. Good luck with your work.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T41ac13a64c3d48db-M0b99c375007957bfa978963b