J Storrs Hall, PhD wrote:
basically on the right track -- except there isn't just one "cognitive level".
Are you thinking of working out the function of each topographically mapped
area a la DNF? Each column in a Darwin machine a la Calvin? Conscious-level
symbols a la Minsky?
Of course you can make finer distinctions, and different people use the
term "cognitive" in different ways. My usage of the term is coextensive
with the usage in cognitive science and cognitive psychology, but that
covers a multitude of sins.
To the extent that an approach tries to embrace what is known about
human cognition, it would be "cognitive"; if it took little notice of
that, it would not. Regular AI does not take much account of human
cognition. Neuroscience (even 'cognitive' or 'computational'
neuroscience) takes a very superficial attitude toward all things
cognitive, even when it claims to be doing otherwise (a sore point in
the literature right now).
But anything that takes significant account of cognition is very
different from an approach that involves scanning a brain and trying to
make a copy without understanding exactly how it works. It is that
enormous gap that I was pointing to, and the fact that there are many
different ways of taking significant account of cognition does not
make much difference to that gap.
Richard Loosemore
On Thursday 05 June 2008 09:37:00 pm, Richard Loosemore wrote:
There seems to be a good deal of confusion (on this list and also over
on the Singularity list) about what people actually mean when they talk
about building an AGI by emulating or copying the brain.
There are two completely different types of project that seem to get
conflated in these discussions:
1) Copying the brain at the neural level, which is usually assumed to be
a 'blind' copy - in other words, we will not know how it works, but will
just do a complete copy and fire it up.
2) Copying the design of the human brain at the cognitive level. This
may involve a certain amount of neuroscience, but mostly it will be at
the cognitive system level, and could be done without much reference to
neurons at all.
Both of these ideas are very different from standard AI, but they are
also very different from one another. The criticisms that can be
leveled against the neural-copy approach do not apply to the cognitive
approach, for example.
It is frustrating to see commentaries that drift back and forth between
these two.
My own position is that a cognitive-level copy is not just feasible but
well under way, whereas the idea of duplicating the neural level is just
a pie-in-the-sky fantasy at this point in time (it is not possible with
current or on-the-horizon technology, and will probably not be possible
until after we invent an AGI by some other means and get it to design,
build and control a nanotech brain scanning machine).
Duplicating a system as complex as that *without* first understanding it
at the functional level seems pure folly: one small error in the
mapping and the result could be something that simply does not work ...
and then, faced with a brain-copy that needs debugging, what would we
do? The best we could do would be to start another scan and hope for
better luck next time.