On Wednesday, August 28, 2019, at 6:49 PM, WriterOfMinds wrote:
> Great, seems like we've reached agreement on something.
> When we communicate with words like "red," we're really communicating about 
> the frequency of light. I would argue that we are not communicating our 
> qualia to each other. If we could communicate qualia, we would not have this 
> issue of being unable to know whether your green is my red. Qualia are 
> personal and incommunicable *by definition,* and it's good to have that 
> specific word and not pollute it with broader meanings.

We can't fully communicate our qualia, only a representation of them, and even we 
ourselves lose the exact reconstruction. That's the inter-agent part of it. 
How do you know any qualia ever existed? Because they are communicated: they are 
fitted into words/symbols, IMO like a pointer in the programming sense. This is 
all utilitarian, not philosophical.
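The "pointer in the programming sense" analogy could be sketched like this (my own toy construction, not anything from the thread; the agent names and states are invented for illustration): two agents share only a symbol, and each dereferences it to a private internal state the other never sees.

```python
# Toy sketch of "word as pointer": the symbol "red" is shared between
# agents, but it dereferences to a private internal state in each one.

class Agent:
    def __init__(self, name):
        self.name = name
        self.internal = {}  # private, qualia-like states; never transmitted

    def experience(self, symbol, private_state):
        # The raw state stays inside the agent; only the symbol is shareable.
        self.internal[symbol] = private_state

    def dereference(self, symbol):
        # Each agent resolves the shared symbol to its OWN state.
        return self.internal.get(symbol)

alice = Agent("Alice")
bob = Agent("Bob")
alice.experience("red", private_state="warm, vivid, alarming")
bob.experience("red", private_state="what Alice calls green, perhaps")

# Only the symbol "red" ever crosses between agents:
message = "red"
print(alice.dereference(message))  # Alice's private referent
print(bob.dereference(message))    # Bob's private referent, not Alice's
```

Nothing in the exchange lets either agent check whether the two referents match, which is exactly the "your green is my red" problem.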

On Wednesday, August 28, 2019, at 6:49 PM, WriterOfMinds wrote:
> In the mouse example, I was assuming that I had fully modeled the 
> electro-mechanical phenomena in *this specific* mouse. I still don't think 
> that would give me its qualia.

There is only a best guess within the context of the observer...

On Wednesday, August 28, 2019, at 6:49 PM, WriterOfMinds wrote:
> I would be happy to refer to a machine with an incommunicable first-person 
> subjective experience stream as "conscious." But you've admitted that you're 
> not trying to talk about incommunicable first-person subjective experiences, 
> you're trying to talk about communication. I'm not concerned with whether the 
> "consciousness" is mechanical or biological, natural or artificial; I'm 
> concerned with whether it's actually "consciousness."

A sample, lossily compressed internally, symbolized. We basically lose the 
original. You can't transmit the whole quale; it's gone. Yes, the utilitarian 
aspect of it is that it is all about communication in a system of agents. 
Not everything is first-person. AGI researchers are so occluded by the 
first-person view. Human general intelligence is not one person but a system of 
people... a baby dies in isolation.

Another piece of this is occupying a representation. A phenomenally conscious 
observer may take the structure that is transmitted in its symbolic form and 
attempt to reconstruct the original lossy representation based on its own 
experience.
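That reconstruction step could be sketched as follows (again my own toy example, with invented strings; a sketch of the lossy-compression idea, not anyone's actual model): the sender's experience is compressed to a symbol, and the receiver rebuilds an approximation from its own memory of that symbol, not from the sender's original.

```python
# Toy sketch: experience -> lossy symbol -> reconstruction from the
# RECEIVER's own prior experience. The sender's original is unrecoverable.

def compress(experience):
    # Lossy: keep only a coarse symbolic label, discard the detail.
    return experience.split()[0]

def reconstruct(symbol, own_memory):
    # The receiver fills in detail from its own association with the symbol,
    # falling back to the bare symbol if it has none.
    return own_memory.get(symbol, symbol)

sender_experience = "sunset deep orange fading to violet over water"
symbol = compress(sender_experience)  # only this crosses the channel

receiver_memory = {"sunset": "sunset pale pink behind city rooftops"}
rebuilt = reconstruct(symbol, receiver_memory)

print(symbol)   # "sunset"
print(rebuilt)  # the receiver's version, not the sender's original
```

The point of the sketch is that `rebuilt` is systematically related to the original through the shared symbol, yet never equal to it.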

I'm not really aiming for human phenomenal consciousness here, but something 
more panpsychist: objects inherently contain structure that can be extracted 
into discrete representations and fitted systematically to similar structure 
in other objects.

...

I want to tell you a secret but it's incommunicable. Guess what. It's already 
been communicated.

Can I ask you a question? Thanks, no need to answer.

I felt a unique incommunicable sensation. I call it Gloobledeglock. Have you 
ever felt Gloobledeglocked?

John

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T41ac13a64c3d48db-Mfcb6e0f90becb8dba4791d4a
Delivery options: https://agi.topicbox.com/groups/agi/subscription
