I liked most of your points, but...

>> However, Searle's example is pathological in the sense that it posits a 
>> system with a high degree of intelligence associated with a functionality 
>> that is NOT associated with any intensity-of-consciousness.  But I suggest 
>> that this pathology is due to the unrealistically large amount of computing 
>> resources that the rulebook requires.  

Not by my definition of intelligence (which requires learning/adaptation): a fixed rulebook never learns or adapts, so the system doesn't qualify as highly intelligent in the first place.


  ----- Original Message ----- 
  From: Benjamin Goertzel 
  To: [email protected] 
  Sent: Sunday, May 20, 2007 1:24 PM
  Subject: [agi] Relationship btw consciousness and intelligence



  Hi all,

  Someone emailed me recently about Searle's Chinese Room argument,

  http://en.wikipedia.org/wiki/Chinese_room

  a topic that normally bores me to tears, but it occurred to me that part of my reply might be of interest to some on this list, because it pertains to the more general issue of the relationship between consciousness and intelligence.

  It also ties in with the importance of thinking about "efficient intelligence" rather than just raw intelligence, as discussed in the recent thread on definitions of intelligence.

  Here is the relevant part of my reply about Searle:

  ****
  However, a key point is: The scenario Searle describes is likely not physically possible, due to the unrealistically large size of the rulebook.  The structures that we associate with intelligence (will, focused awareness, etc.) in a human context all come out of the need to do intelligent processing within modest space and time constraints.
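  To make the resource point concrete, here is a rough back-of-envelope sketch in Python. The vocabulary size and input length below are purely illustrative assumptions (not figures from Searle or from anywhere else); the point is only the order of magnitude.

      # Rough, illustrative estimate of the size of a purely table-driven
      # Chinese Room rulebook.  Both constants are assumptions chosen only
      # to show the order of magnitude.
      V = 3000          # assumed working vocabulary of Chinese characters
      N = 100           # assumed length of one input string, in characters
      ATOMS = 10 ** 80  # rough conventional count of atoms in the observable universe

      entries = V ** N  # one rulebook entry per possible input string
      print("rulebook entries needed: ~10^%d" % (len(str(entries)) - 1))
      print("entries per atom in the universe: ~10^%d" % (len(str(entries // ATOMS)) - 1))

  Under these assumptions the lookup table needs on the order of 10^347 entries, hundreds of orders of magnitude more than the atoms in the observable universe; that is the sense in which the scenario is not physically possible.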

  So when we say we feel like the {Searle+rulebook} system isn't really understanding Chinese, what we mean is: It isn't understanding Chinese according to the methods we are used to, which are methods adapted to deal with modest space and time resources.

  This ties in with the relationship between intensity-of-consciousness and degree-of-intelligence.  In real life, these often seem to be tied together, because the cognitive structures that correlate with intensity of consciousness are useful ones for achieving intelligent behaviors.

  However, Searle's example is pathological in the sense that it posits a system with a high degree of intelligence associated with a functionality that is NOT associated with any intensity-of-consciousness.  But I suggest that this pathology is due to the unrealistically large amount of computing resources that the rulebook requires.

  I.e., it is finitude of resources that causes intelligence and intensity-of-consciousness to be correlated.  The fact that this correlation breaks down in a pathological, physically impossible case requiring a dramatically unrealistic amount of resources doesn't mean too much...
  ****

  Note that I write about intensity of consciousness rather than presence of consciousness.  I tend toward panpsychism, but I do accept that "while all animals are conscious, some animals are more conscious than others" (to pervert Orwell).  I have elaborated on this perspective considerably in The Hidden Pattern.

  -- Ben G 

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936
