> From: Mike Tintner [mailto:[EMAIL PROTECTED]
> 
> I'm developing this argument more fully elsewhere, so I'll just give a partial gist. What I'm saying - and I stand to be corrected - is that I suspect that literally no one in AI and AGI (and perhaps philosophy), present or past, understands the nature of the tools they are using.
> 
> All the tools - all the sign systems currently used, especially language - are actually general-purpose - AS USED BY THE HUMAN BRAIN.
> 
> The whole point of just about every word in language is that it constitutes a general, open brief which can be instantiated in any one of an infinite set of ways.
> 
> So if I tell you to "handle" an object, or a piece of business - say, "removing a chair from the house" - that word "handle" is open-ended and gives you vast freedom, within certain parameters, as to how to apply your hand(s) to that object. Your hands can be applied to move a given box, for example, in a vast if not infinite range of positions and trajectories. Such a general, open concept is of the essence of general intelligence, because it means that you are immediately ready to adapt to new kinds of situation - if your normal ways of handling boxes are blocked, you are ready to seek out or improvise some strange new contorted two-finger hand position to pick up the box, which also counts as "handling". (And you will actually have done a lot of this.)
> 
> So what is the "meaning" of "handle"? Well, to be precise, it doesn't have a/one meaning, and isn't meant to - it has a range of possible meanings/references, and you can choose whichever is most convenient in the circumstances.
> 
> The same principles apply to just about every word in language and every unit of logic and mathematics.
> 
> But - and correct me - I don't think anyone in AI/AGI is using language or any logico-mathematical system in this general, open-ended way - the way they are actually meant to be used - which is the very foundation of general intelligence.
> 
> Language and the other systems are always used by AGI in specific ways to have specific meanings. YKY, typically, wanted a language for his system which had precise meanings. Even Ben, I suspect, may only employ words in an "open" way in that their meanings can be changed with experience - but at any given point their meanings will have to be specific.
> 
> To be capable of generalising as the human brain does - and of true AGI - you have to have a brain that simultaneously processes on at least two if not three levels, with two or three different sign systems, including both general and particular ones.
> 

OK, I think I see what you are saying - I had to think about this for a bit. It is kind of interesting that language, and to a certain extent mathematics, often has handles which refer to generalities. And in order to maintain a common understanding - say, if two agents were communicating with language - there needs to be one or more moving foci within a fuzzy perimeter of specifics included within a handle's generality. There are probably two to three, and maybe more, of these levels or foci, yes.
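To make the "handle" example concrete, here is a toy sketch of my own (not anyone's actual system, and all the names and costs are made up for illustration): a word is modeled not as one fixed meaning but as an open predicate over concrete actions, and context picks the most convenient admissible instantiation - improvising a stranger one when the normal one is blocked.

```python
# Toy illustration: a word as an open-ended brief over concrete actions.

def handle_constraint(action):
    """The open brief for "handle": any hand-applied action that moves the object counts."""
    return action["uses_hands"] and action["moves_object"]

def instantiate(word_constraint, candidates, blocked=()):
    """Choose the cheapest admissible instantiation of the word in context;
    when the usual ones are blocked, fall back to a costlier improvisation."""
    admissible = [a for a in candidates
                  if word_constraint(a) and a["name"] not in blocked]
    return min(admissible, key=lambda a: a["cost"]) if admissible else None

candidates = [
    {"name": "two-handed lift",  "uses_hands": True,  "moves_object": True, "cost": 1},
    {"name": "push with foot",   "uses_hands": False, "moves_object": True, "cost": 1},
    {"name": "two-finger pinch", "uses_hands": True,  "moves_object": True, "cost": 5},
]

print(instantiate(handle_constraint, candidates)["name"])
# -> two-handed lift (the normal way)
print(instantiate(handle_constraint, candidates, blocked=("two-handed lift",))["name"])
# -> two-finger pinch (the improvised contortion still counts as "handling")
```

The point of the sketch is only that the word's "meaning" is the constraint plus the selection process, not any single fixed instantiation.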

The English language definitely has this property. Math, on the other hand, is different: you can control it more. English operates within a region of constraints; it is strongly tied to human communication, which is convenient since we are human. But when you think about it, if you change and manipulate this two- or three-level/foci dynamic, you can come up with some really good and interesting forms of literary expressiveness. This is often done in experimental writing, and someone who is good at it can communicate extremely effectively.

Now, relating this to general intelligence: including a creativity modus operandi within the domain of this foci set may involve some form of general intelligence for goal attainment. Sure, I agree that this can be an operational form of general intelligence, I suppose, but I am not sure whether it is more of a communication protocol that is state-driven by information flow - in other words, more of a reactionary thing.

John



-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/