Rough approximations maybe . . . . but you yourself have now pointed out that your definition is vulnerable to Searle's pathology (which is even simpler than the infinite AIXI effect :-)
  ----- Original Message ----- 
  From: Benjamin Goertzel 
  To: [email protected] 
  Sent: Sunday, May 20, 2007 3:00 PM
  Subject: Re: [agi] Relationship btw consciousness and intelligence



  Sure, that's fine...

  I mean: I have given a mathematical definition before, so all these verbal paraphrases should be viewed as rough approximations anyway...
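
  For concreteness, here is a minimal sketch of the kind of formalization I mean -- in the spirit of goal-achievement measures like Legg and Hutter's, not my exact formulation, and with the goal set G, the weighting w, the success measure S, and the resource cost C all left abstract:

      \Upsilon(\pi) = \sum_{g \in G} w(g) \cdot \frac{S(\pi, g)}{C(\pi, g)}

  Read: the intelligence of a system \pi is its success S across a weighted variety of goals g, normalized by the space/time resources C it consumes; the normalization is what distinguishes "efficient intelligence" from raw intelligence.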


  On 5/20/07, Mark Waser <[EMAIL PROTECTED]> wrote:
    Allow me to paraphrase . . . .

        Something is intelligent if it is functional over a wide variety of complex goals.

    Is that a reasonable shot at your definition?
      ----- Original Message ----- 
      From: Benjamin Goertzel 
      To: [email protected] 
      Sent: Sunday, May 20, 2007 2:41 PM 
      Subject: Re: [agi] Relationship btw consciousness and intelligence



      Intelligence, to me, is the ability to achieve complex goals...

      This is one way of being functional....  A paperclip, though, is very functional yet not very intelligent...

      ben g



      On 5/20/07, Mark Waser <[EMAIL PROTECTED]> wrote: 
        >> Sure... I prefer to define intelligence in terms of behavioral functionality rather than internal properties, but you are free to define it differently ;-)

        I wouldn't call learning/adaptability an internal(-only) property . . . .

        >> I note that if the Chinese language changes over time, then the {Searle + rulebook} system will rapidly become less intelligent in this context !!!!

        See.  Now this indicates the funkiness of your definition . . . . Replace "intelligent" with "functional" and it makes a lot more sense.

        Actually, that raises a good question -- What is the difference between your "intelligent" and your "functional"?
          ----- Original Message ----- 
          From: Benjamin Goertzel 
          To: [email protected] 
          Sent: Sunday, May 20, 2007 2:11 PM 
          Subject: Re: [agi] Relationship btw consciousness and intelligence



          Sure... I prefer to define intelligence in terms of behavioral functionality rather than internal properties, but you are free to define it differently ;-)

          I note that if the Chinese language changes over time, then the {Searle + rulebook} system will rapidly become less intelligent in this context !!!!
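
          To make that concrete, here is a toy simulation (a sketch with invented numbers -- the prompt set, drift rate, and feedback scheme are my assumptions, nothing from Searle): freeze one copy of a prompt-response table as the "rulebook," let a second copy adapt from feedback, and let the underlying language drift.

import random

random.seed(0)

# A toy "language": 50 prompts, each with one correct response (0-9).
PROMPTS = list(range(50))
language = {p: random.randrange(10) for p in PROMPTS}

rulebook = dict(language)  # frozen snapshot: Searle's rulebook
learner = dict(language)   # starts identical, but updates from feedback

for epoch in range(10):
    # Drift: each epoch, 10% of the prompt-response pairs change.
    for p in random.sample(PROMPTS, 5):
        language[p] = random.randrange(10)

    rb = sum(rulebook[p] == language[p] for p in PROMPTS)
    ln = sum(learner[p] == language[p] for p in PROMPTS)
    print(f"epoch {epoch}: rulebook {rb}/50, learner {ln}/50")

    # The learner sees the correct responses afterward and adapts;
    # the rulebook, by construction, cannot.
    learner.update(language)

          The rulebook's accuracy decays toward chance while the learner stays near the ceiling, lagging only one epoch behind the drift.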

          ben g


          On 5/20/07, Mark Waser <[EMAIL PROTECTED]> wrote: 
            I liked most of your points, but . . . . 

            >> However, Searle's example is pathological in the sense that it posits a system with a high degree of intelligence associated with a functionality that is NOT associated with any intensity-of-consciousness.  But I suggest that this pathology is due to the unrealistically large amount of computing resources that the rulebook requires.

            Not by my definition of intelligence (which requires learning/adaptation).


              ----- Original Message ----- 
              From: Benjamin Goertzel 
              To: [email protected] 
              Sent: Sunday, May 20, 2007 1:24 PM 
              Subject: [agi] Relationship btw consciousness and intelligence



              Hi all,

              Someone emailed me recently about Searle's Chinese Room argument, 

              http://en.wikipedia.org/wiki/Chinese_room 

              a topic that normally bores me to tears, but it occurred to me that part of my reply might be of interest to some on this list, because it pertains to the more general issue of the relationship between consciousness and intelligence.

              It also ties in with the importance of thinking about "efficient intelligence" rather than just raw intelligence, as discussed in the recent thread on definitions of intelligence.

              Here is the relevant part of my reply about Searle:

              ****
              However, a key point is: the scenario Searle describes is likely not physically possible, due to the unrealistically large size of the rulebook.  The structures that we associate with intelligence in a human context (will, focused awareness, etc.) all come out of the need to do intelligent processing within modest space and time requirements.

              So when we say we feel like the {Searle+rulebook} system isn't really understanding Chinese, what we mean is: it isn't understanding Chinese according to the methods we are used to, which are methods adapted to deal with modest space and time resources.

              This ties in with the relationship between intensity-of-consciousness and degree-of-intelligence.  In real life, these often seem to be tied together, because the cognitive structures that correlate with intensity of consciousness are useful ones for achieving intelligent behaviors.

              However, Searle's example is pathological in the sense that it posits a system with a high degree of intelligence associated with a functionality that is NOT associated with any intensity-of-consciousness.  But I suggest that this pathology is due to the unrealistically large amount of computing resources that the rulebook requires.

              I.e., it is the finitude of resources that causes intelligence and intensity-of-consciousness to be correlated.  The fact that this correlation breaks in a pathological, physically impossible case that requires a dramatically unrealistic amount of resources doesn't mean too much...
              ****
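
              To put a number on "unrealistically large": a quick back-of-the-envelope check, with figures invented purely for illustration (a working set of ~3,000 characters, dialogue histories capped at 100 characters):

# Entries a lookup-table rulebook needs in order to cover every possible
# dialogue history up to the cap.  Figures are illustrative only.
CHARS = 3000       # rough working Chinese character set (assumed)
HISTORY_LEN = 100  # cap on dialogue-history length (assumed)

entries = sum(CHARS ** n for n in range(1, HISTORY_LEN + 1))

print(f"rulebook entries  ~ 10^{len(str(entries)) - 1}")  # ~ 10^347
print("atoms in universe ~ 10^80")  # standard order-of-magnitude estimate

              That is roughly 10^267 rulebook entries per atom in the observable universe, which is what I mean by physically impossible.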

              Note that I write about intensity of consciousness rather than presence of consciousness.  I tend toward panpsychism, but I do accept that "while all animals are conscious, some animals are more conscious than others" (to pervert Orwell).  I have elaborated on this perspective considerably in The Hidden Pattern.


              -- Ben G 