I see a red rose. You see a red rose. Is your experience of redness
the same as mine?
1. Yes, they are identical.
2. They are different as long as the neural organization of our brains
differs even slightly, but you are potentially capable of experiencing my
redness with help from a neurosurgeon who could reshape your brain to
match mine.
3. They are different as long as some 'code' of our brains differs even
slightly, but you (and any machine) are potentially capable of
experiencing my redness if you somehow run the same 'code'.
4. They are different and absolutely private: you (and anybody else,
be it a human or a machine) don't and can't experience my redness.
5. The question doesn't make sense, because ... (please elaborate)
What is your opinion?
My (naive) answer is (3). Our experiences are identical (would the
correct term be 'ontologically identical'?) as long as they have the
same symbolic representation and the symbols have the same grounding
in the physical world. The part about grounding is just an uneducated
guess; I don't understand the subject and have only an intuitive
feeling that semantics (what a computation is about) is important and
somehow determined by the physical world out there.
Let me explain with an example. Suppose that you:
1. simulate my brain in a computer program, so we can say that this
program represents my brain in your symbols.
2. simulate a red rose
3. feed "rose data" into my simulated brain.
I think (more believe than think) that this simulated brain won't see
my redness; in fact, it won't see anything at all, because it isn't
grounded in the physical world.
But if you:
1. make a robot that simulates my brain in my symbols, i.e. that behaves
(relative to the physical world) in the same ways as I do
2. show a rose to the robot
I think that the robot will experience the same redness as I do.
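The contrast between the two cases might be caricatured in a toy sketch (entirely my own construction, not a real brain simulation): the same 'code' is run twice, once on an input invented inside a simulation and once on an input standing in for a physical measurement. The outputs coincide, which is exactly why the disagreement is about grounding rather than about behavior.

```python
def perceive(wavelength_nm: float) -> str:
    """Identical 'symbolic' processing in both perceivers.

    The 620-750 nm range is the usual rough band for red light; the
    function itself is a deliberately crude stand-in for a brain.
    """
    return "red" if 620 <= wavelength_nm <= 750 else "not red"

# Same code, two differently grounded inputs (both values hypothetical):
simulated_rose = 650.0   # a number invented inside a simulation
measured_rose = 650.0    # a number read off a physical sensor

print(perceive(simulated_rose))  # -> red
print(perceive(measured_rose))   # -> red
```

Both calls print the same word, so nothing in the code distinguishes the cases; on the view sketched in the post, the difference would lie entirely in what the input symbols are *about*.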
I would be glad if somebody suggested something to read about 'symbol
grounding', semantics, etc. I have a lot of confusion here: I've
always thought that logic is a formal language for the 'syntactic'
manipulation of 'strings' that acquire meaning only in our minds.
I agree with your intuition that the semantics of thought and
consciousness must be grounded in interaction with the world and is
relative to that world. It does leave a puzzle about how private
internal thoughts that seem to have no reference to the external world,
e.g. pure mathematics, get grounded. I guess it is via indirect chains
of reference.
You received this message because you are subscribed to the Google Groups
"Everything List" group.