> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On
> Behalf Of Adrian Hon

[snip]

> > > It's also important not to ignore the role of modulating synaptic
> > > strengths
> > > as well as the formation and dying off of connections; it's
> > > thought that the
> > > strength of synapses is largely responsible for encoding information.
> >
> > Encoding information is not the same concept as a data structure.
> > Encoding is what represents what; the structure is how the encoded
> > data is organized.
>
> Who said that the information has to be structured in a physical way?

We must be talking past each other -- that seems like a non sequitur to
me...?

> I don't think I see what your point is either. I think it's
> possible for us
> to create an AI that will be able to learn language; this does not
> necessarily require us to fully understand language. I might be able to
> understand the structures in the brain, and how synaptic strengths are
> regulated, and see how on a basic level information might be
> processed, and
> I might be able to reproduce this on a computer, and it might work - but
> none of this requires an understanding of the higher cognitive
> functions of
> the brain.

If we don't understand how the brain does it, then what is the basis for
imagining that we can build something else that will?  I can see what you're
saying, I think -- if we create a computer model of the brain as we
understand it, perhaps it will be able to learn language.  But where's the
evidence to suggest that's the case?  Is there some reason to think that
that's all there is to intelligence and language?  I guess I just don't see
any strong reasons to believe that we know enough to make something like
that work.

There certainly are examples of things that work even though we don't know
why, but they are hardly ever invented; they are usually discovered by
serendipity.

Nick
