On Wednesday, August 28, 2019, at 5:09 PM, WriterOfMinds wrote:
> People can only communicate their conscious experiences by analogy. When you 
> say "I'm in pain," you're not actually describing your experience; you're 
> encouraging me to remember how I felt the last time *I* was in pain, and to 
> assume you feel the same way. We have no way of really knowing whether the 
> assumption is correct.
> 

That's a protocol. The agents sync up. We are using an established language, though it changes over time. The word "pain" is a transmitted compression symbol: it is already understood not to mean exactly the same thing for everyone, but the majority of people besides oneself have a similar experience. Some people get pleasure from pain due to different wiring or neurochemistry or whatever. There might be a societal tendency for them not to breed.
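
Here is a rough sketch of that many-to-one compression; the agent functions, the threshold, and the numbers are all invented for illustration:

# Toy sketch: many different internal states all compress to the one public
# word "pain"; a differently wired agent maps the same input elsewhere.

def typical_agent(nociceptor_level):
    # Majority wiring: high nociceptor activity -> the shared symbol "pain".
    return "pain" if nociceptor_level > 0.5 else "fine"

def atypical_agent(nociceptor_level):
    # Different wiring/neurochemistry: same input, different report.
    return "pleasure" if nociceptor_level > 0.5 else "fine"

for level in (0.6, 0.8, 0.99):          # very different internal states...
    print(level, typical_agent(level))  # ...all compress to "pain"

print(0.8, atypical_agent(0.8))         # prints "pleasure"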


On Wednesday, August 28, 2019, at 5:09 PM, WriterOfMinds wrote:
> We can both name a certain frequency of light "red" and agree on which 
> objects are "red." But I can't tell you what my visual experience of red is 
> like, and you can't tell me what yours is like. Maybe my red looks like your 
> green -- the visual experience of red doesn't seem to inhere in the 
> frequency's numerical value, in fact color is nothing like number at all, so 
> nothing says my red isn't your green. "Qualia" refers to that indescribable 
> aspect of the experience. If your "qualia" can be communicated with symbols, 
> or described in terms of other things, then we're not talking about the same 
> concept -- and using the same word for it is just confusing.

Think multi-agent. Say my red is your green and your green is my red. We are members of a species sampling the environment. Would it even matter to evolution whether we all saw it the same way internally? You don't know my qualia for red. But you do understand me communicating the experience using generally understood words and symbols, and that is what matters from the multi-agent computational standpoint. We are multiple sensors emitting compressed samples via symbol transmission, hoping the external world understands, but the initial sample is lossily compressed and fitted into a symbol to traverse a distance. We may never know that your green is my red.
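
A minimal sketch of why the inversion is invisible at the symbol level; the agents, the stimuli, and the quale labels are all made up for this example:

# Toy sketch: two agents whose internal "experience" codes are permuted
# relative to each other still compress a stimulus to the same public symbol,
# so the symbol stream alone can never reveal the inversion.

STIMULI = {"apple": 700, "grass": 530}   # rough wavelengths in nm

def label(wavelength_nm):
    # Shared public protocol: lossy compression of a wavelength into a word.
    return "red" if wavelength_nm > 600 else "green"

def agent_a_experience(wavelength_nm):
    # Agent A's private internal code (arbitrary).
    return "QUALE_X" if wavelength_nm > 600 else "QUALE_Y"

def agent_b_experience(wavelength_nm):
    # Agent B's private internal code, permuted relative to A's.
    return "QUALE_Y" if wavelength_nm > 600 else "QUALE_X"

for name, wl in STIMULI.items():
    # Internal states differ, emitted symbols agree.
    print(name, label(wl), agent_a_experience(wl), agent_b_experience(wl))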


On Wednesday, August 28, 2019, at 5:09 PM, WriterOfMinds wrote:
> Going back to your computer-and-mouse example: if I admit your panpsychist 
> perspective and assume that a computer mouse has qualia, those qualia are not 
> identified with the electro-mechanical events inside the mouse.  I could have 
> full knowledge of those (fully compute or model them) without sharing the 
> mouse's experience.

You can compute the mouse's electro-mechanics at a functional level, but between two mice there are vast differences in actual electron flow and in microscopic mechanical details. You are still only estimating what is actually going on, the Kolmogorov complexity or the qualia. One mouse could have self-correcting errors inside, yet the click signal it presents to external entities is the same...
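
A toy illustration of that, with the class names and the debounce logic invented for the example; two very different internals present the identical external signal:

# Toy sketch: two "mice" with different internals emit an identical external
# signal, so an observer modeling only the output cannot tell them apart.

class SimpleMouse:
    def press(self):
        return "CLICK"                    # direct path, no correction

class NoisyMouse:
    def __init__(self):
        self.raw_reads = [0, 0, 1, 1, 1]  # bouncy switch samples

    def press(self):
        # Internal debounce / self-correction that never leaves the device.
        ones = sum(self.raw_reads)
        return "CLICK" if ones >= len(self.raw_reads) // 2 + 1 else None

print(SimpleMouse().press(), NoisyMouse().press())   # CLICK CLICK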

Please note that terminology gets usurped by technology once it is implemented. Should we not call intelligence intelligence? Usually it gets "artificial" prepended, but IMO that's the wrong move; it is intelligence, or better, machine intelligence. Should we not call an artificial eye an eye? What's so special about the word "consciousness" that everyone gets all squirmy about it?

John
