On Nov 20, 2008, at 10:38 AM, Brent Meeker wrote:

> I think you really mean nomologically possible.
I mean logically possible, but I'm happy to change it to "nomologically
possible" for the purposes of this conversation.

> I think Dennett changes the question by referring to neurophysiological
> "actions". Does he suppose wetware can't be replaced by hardware?

No, he definitely argues that wetware can be replaced by hardware, as long
as the hardware retains the computational functionality of the wetware.

> In general when I'm asked if I believe in philosophical zombies, I say no,
> because I'm thinking that the zombie must outwardly behave like a conscious
> person in all circumstances over an indefinite period of time, yet have no
> inner experience. I rule out an accidental zombie accomplishing this as too
> improbable - not impossible.

I agree. But if you accept that it's nomologically possible for a robot
with a random-number generator in its head to outwardly behave like a
conscious person in all circumstances over an indefinite period of time,
then your theory of consciousness, one way or another, has to answer the
question of whether or not this unlikely robot is conscious. Now, maybe
your answer is "The question is misguided in that case, and here's why..."
But that's a significant burden.

-- Kory

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to [EMAIL PROTECTED]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/everything-list?hl=en
-~----------~----~----~----~------~----~------~--~---