Brent Meeker writes:

> > What I meant was, if a computer program can be associated with
> > consciousness, then a rigid and deterministic computer program can also
> > be associated with consciousness - leaving aside the question of how
> > exactly the association occurs. For example, suppose I have a
> > conversation with a putatively conscious computer program as part of a
> > Turing test, and the program passes, convincing me and everyone else
> > that it has been conscious during the test. Then, I start up the
> > program again with no memory saved from the first run, but this time I
> > play it a recording of my voice from the first test. The program will
> > go through exactly the same responses as during the first run, but this
> > time, to an external observer who saw the first run, the program's
> > responses will be no more surprising than my questions on the recording
> > of my voice. The program itself won't know what's coming, and it might
> > even think it is being clever by throwing in some "unpredictable"
> > answers to prove how free and human-like it really is. I don't think
> > there is any basis for saying it is conscious during the first run but
> > not during the second. I also don't think it helps to say that its
> > responses *would* have been different even on the second run had its
> > input been different, because that is true of any record player or
> > automaton.
> 
> I think it does help; or at least it makes a difference.  I think you
> illegitimately move the boundary between the thing supposed to be
> conscious (I'd prefer "intelligent", because I think intelligence
> requires counterfactuals, but I'm not sure about consciousness) and its
> environment in drawing that conclusion.  The question is whether the
> *recording* is conscious.  It has no input.  But then you say it has
> counterfactuals because the output of a *record player* would be
> different with a different input.  One might well say that a record
> player has intelligence - of a very low level.  But a record does not.

Perhaps there is a difference between intelligence and consciousness.
Intelligence must be defined operationally, as you have suggested, which
involves the intelligent agent interacting with the environment. A computer
hardwired with "input" is not a very useful device from the point of view
of an observer, displaying no more intelligence than a film of the screen
would. However, useless though it might be, I don't see why the computer
should not be conscious with the hardwired input if it is conscious with
the same input on a particular run from a variable environment. If the
experiment were set up properly, it would be impossible for the computer
to know where the input was coming from. Another way to look at it would
be to say that intelligence is relative to an environment but consciousness
is absolute. This is in keeping with the fact that intelligent behaviour is
third-person observable but consciousness is not.
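
As an aside, the determinism being relied on here is easy to demonstrate in
miniature. The sketch below is purely my own illustration (a toy responder,
not any actual Turing-test program): a program whose output depends only on
its input history produces an identical transcript whether the questions
arrive "live" or from a recording of the first run.

    # Toy deterministic responder: the reply depends only on the input so far.
    def respond(history, question):
        reply = "answer %d to %r" % (len(history), question)
        return history + [question], reply

    def run(questions):
        history, transcript = [], []
        for q in questions:
            history, reply = respond(history, q)
            transcript.append(reply)
        return transcript

    live_questions = ["How are you?", "Are you conscious?"]
    first_run = run(live_questions)          # interactive first run
    second_run = run(list(live_questions))   # replay from the "recording"

    assert first_run == second_run           # responses identical on both runs

Whether the second, replayed run is any less conscious than the first is of
course exactly the point in dispute; the code only shows that its outward
behaviour cannot differ.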

Stathis Papaioannou