> You've completely missed the point again. Perhaps you could try
> reading Chalmers' paper if you haven't already done so:
>
> http://consc.net/papers/qualia.html
>
> Unfortunately some people just don't seem to understand it.
I have read it, and it's a good way of understanding the issue if you are going to use the standard models of consciousness, but I have a model that I like better. Have you read my executive summary?

I'm looking at subjectivity as the inverted, involuted topology of what we can observe (through our particular private, proprietary subjectivity). It is not a process which arises at some level or other. It's just that every phenomenon can only identify with what is very similar to itself. The sense that it can make of everything else is objectified - inside out. So making a brain out of something other than a brain depends entirely upon how different that something is from what it can identify with. That may prove to be achievable at a non-biological level, but there is no particular reason to imagine that it should be.

Human consciousness is as dependent upon human biology as water is dependent on H2O. They are the same thing. Silicon microprocessors are not the same thing, and programs that run on microprocessors aren't the same thing either. If they were, you could just write a program that simulates ever more powerful processors and greater quantities of memory. Let's start with that, because it will be a lot easier: let's write a program that simulates itself running faster than it can run.

Craig
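
P.S. To make that last challenge concrete, here's a toy sketch in Python (my own illustration, nothing from Chalmers' paper; the two-instruction "processor" and the loop sizes are invented for the example). An interpreted processor pays fetch/decode/execute bookkeeping on every step, so the simulated machine always runs slower than the host that runs it - which is why a program simulating itself running faster than it can run is a non-starter.

    import time

    def host_loop(n):
        # The host machine performs n increments directly.
        x = 0
        for _ in range(n):
            x += 1
        return x

    def simulated_loop(n):
        # A toy simulated processor with one register and two instructions,
        # interpreted step by step: the same n increments, but every step
        # goes through fetch/decode/execute bookkeeping.
        program = [("INC", None), ("JLT", n)]  # increment; jump back while reg < n
        reg, pc = 0, 0
        while pc < len(program):
            op, arg = program[pc]
            if op == "INC":
                reg += 1
                pc += 1
            elif op == "JLT":
                pc = 0 if reg < arg else pc + 1
        return reg

    if __name__ == "__main__":
        n = 1000000
        t0 = time.perf_counter(); host_loop(n); t1 = time.perf_counter()
        t2 = time.perf_counter(); simulated_loop(n); t3 = time.perf_counter()
        print("host:      %.3fs" % (t1 - t0))
        print("simulated: %.3fs (interpretation overhead never goes away)" % (t3 - t2))

You can stack another interpreter on top of this one, but each layer only adds overhead; no layer ever runs faster than the one underneath it.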

