On 15/06/06, arnoud <[EMAIL PROTECTED]> wrote:
On Thursday 15 June 2006 21:35, Ben Goertzel wrote:
> > If this doesn't seem to be the case, it is because some
> > concepts are so abstract that they don't seem to be tied to perception
> > anymore. It is obvious that they are (directly) tied to more concrete
> > concepts (being defined/described in terms of them), but those concepts
> > can also still be very abstract. And so abstract concepts can seem to
> > depend only on other abstract concepts, and together lead their own life,
> > not tied to/determined by perception/sensation. However, if you would/could
> > trace all the dependencies of any concept you would end up on the
> > perception level.
>
> Hmmm... well, although I learned mathematics via perceiving books and
> spoken words and so forth,

And by interacting with the world: counting objects, rotating objects,
translating objects, manipulating sequences of symbols on the basis of
rules... And making predictions about the effect of your actions.

This conversation reminds me of a paper by Aaron Sloman I read recently
on symbol grounding vs symbol tethering. I can't find the one I read,
but here is a similar one:

http://www.cs.bham.ac.uk/research/cogaff/talks/meaning-types-slides.pdf

I am definitely of the symbol tethering view. One of the examples
Sloman uses in favour of tethering (that is, loose coupling) is the
concept of neutrinos, which certainly hasn't helped me predict the
effects of my actions, apart from in the very loose way that it helps
me predict what I may find on wikis when I look up neutrinos.

Which is about as useful as knowing information about soccer players.
And yet I value the neutrino information more, because of the way I
have been told it connects with all the other information that was
useful when I fiddled about with chemicals.

Will Pearson

ps If people are wondering why I, someone interested in evolutionary
systems and distributed representation, am interested in the symbol
grounding arguments: it is because the starting modules that evolve
will eventually have to be at least as complex as the systems people
are suggesting for AGI, and will have to have something like the
shared representation and world modelling people suggest. That is, I
expect to put a fair amount of precocial programming, which then
evolves, into the system.
