Because consciousness is a singularity of perspective through time, or rather
the singularity through which time is created.

> Why would something created by someone else not have consciousness?

Because it is assembled rather than created. It's like asking why wood doesn't
catch fire by itself just because it has been stacked in a pile.

> Why would something lacking internally generated motives (which does not
> apply to computers any more than to people) lack consciousness?

Why would a computer have an internally generated motive? It doesn't care
whether it functions or not. We know that people have personal motives because
it isn't possible for us to doubt it without doubting our ability to doubt.

> To make these claims you would have to show either that they are
> necessarily true or present empirical evidence in their support, and you
> have done neither.

You would have to show that these criteria are relevant to consciousness,
which you have not, and cannot. As long as you fail to recognize consciousness
as the ground of being, you will continue to justify it against one of its own
products - rationality, logic, empirical examples - all of which are 100%
sensory-motor. Consciousness can only be explained to consciousness, in the
terms of consciousness, to the satisfaction of consciousness. All other
possibilities are subordinate. How could it be otherwise without ending up
with a sterile ontology that prohibits our own participation?

>>> So if, in future, robots live among us for years and are accepted by
>>> most people as conscious, does that mean they are conscious? This is
>>> essentially a form of the Turing test.
>>
>> I don't think that will happen unless they aren't robots. The whole point
>> is that the degree to which an organism is conscious is inversely
>> proportional to the degree to which the organism is 100% controllable.
>> That's the purpose of intelligence - to advance your own agenda rather
>> than to be overpowered by your environment. So if something is a robot,
>> it will never be accepted by anyone as conscious, and if something is
>> conscious it will never be useful to anyone as a robot - it would in fact
>> be a slave.
>
> You don't think it would happen, but would you be prepared to say that if
> a robot did pass the test, as tough as you want to make it, it would be
> conscious?

It's like asking me whether, if there were a test for dehydrated water, I
would be prepared to say it was wet if it passed the test. No robot can ever
be conscious. Nothing conscious can ever be a robot. Heads cannot be Tails,
even if we move our heads to where the tails side used to be and blink a lot.

Craig

> --
> Stathis Papaioannou

--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an
email to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to firstname.lastname@example.org.
Visit this group at http://groups.google.com/group/everything-list?hl=en.
For more options, visit https://groups.google.com/groups/opt_out.