On Sun, Dec 18, 2005 at 11:20:29AM -0500, Pei Wang wrote:

> Of course, most of the limitations of NN can be avoided by generalizing
> the concept to such a level. However, at the same time, such a general
> notion does not support the claims of advantages of NN, either. How

Biological cognition is based on network processing, too.

> can someone argue that this concept can compete with the classical
> symbolic model of cognition? Can semantic network be counted as a kind

Given that you're reading this message in realtime, NNs are clearly a quite
powerful model. In fact, since we don't have any human-equivalent symbolic
processing systems, the burden of proof is reversed.

> of neural network? Once again, when the extension of a concept gets
> larger and larger, its intension gets smaller and smaller.

The space of automata networks is vast and almost utterly barren.
The probability of hitting a fertile spot with an educated guess
is basically nil. Of course, biological tissue does some fancy
processing tricks ANNs can't yet match.
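To put a number on 'vast', here's a back-of-the-envelope sketch
(assumptions mine: N binary nodes, each wired to k inputs and computing
an arbitrary boolean function of them):

  from math import comb, log10

  N = 100   # tiny by biological standards
  k = 3     # inputs per node

  wirings = comb(N, k)          # possible input sets per node
  functions = 2 ** (2 ** k)     # boolean functions of k inputs
  log10_total = N * (log10(wirings) + log10(functions))
  print(f"~10^{log10_total:.0f} distinct networks")   # ~10^762

Even at N=100 that dwarfs the ~10^80 atoms in the observable universe;
blind search covers a negligible sliver of it.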
 
> > As for whether NN's are the best architecture for AGI right now, I
> > agree that they are not.  My reason is that no one knows how complex,
> > abstract knowledge can be efficiently, adaptably represented in NN's.

But you don't have to understand the representation in order to
build very successful, even superhuman, intelligences. The evolutionary
process is not sentient, and is rather straightforward, yet it has
produced at the very least human-level intelligence.
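A minimal sketch of how little machinery that takes (the toy fitness
function is my choice for illustration, not a claim about biology):
blind mutation plus selection, nothing sentient anywhere in the loop.

  import random

  TARGET = [random.randint(0, 1) for _ in range(64)]   # arbitrary goal

  def fitness(genome):
      return sum(g == t for g, t in zip(genome, TARGET))

  def mutate(genome, rate=0.02):
      # flip each bit with small probability
      return [1 - g if random.random() < rate else g for g in genome]

  parent = [random.randint(0, 1) for _ in range(64)]
  for gen in range(10000):
      children = [mutate(parent) for _ in range(8)]
      best = max(children, key=fitness)
      if fitness(best) >= fitness(parent):
          parent = best
      if fitness(parent) == len(TARGET):
          print("target reached at generation", gen)
          break

The same (1+lambda) loop scales from bitstrings to network wirings;
the hard part is the fitness evaluation, not the search operator.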

> > I'm sure there *is* a way to do it, but since no has discovered it yet
> > (via either mathematical theory, computational experimentation, or
> > neuroscience), basically in my view NN-based AGI is a non-starter.
> 
> Agree.

Disagree. It is unclear why 'mathematical theory' is supposed to be
useful for AGI here. There has hitherto been no attempt to map the
parameter space, nor is there sufficiently powerful hardware to execute
10^9-node networks in realtime.
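Rough arithmetic on what realtime execution of such a network would cost
(fan-in, update rate, and FLOP counts are my assumptions):

  nodes = 1e9
  synapses_per_node = 1e3      # assumed fan-in
  update_hz = 100              # assumed update rate per node
  flops_per_synapse = 2        # one multiply-accumulate

  flops = nodes * synapses_per_node * update_hz * flops_per_synapse
  print(f"{flops:.0e} FLOP/s")   # 2e+14 FLOP/s

That's four to five orders of magnitude above the few GFLOP/s of a
current desktop, before memory traffic even enters the picture.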
 
> > Once this conceptual problem is solved, then NN-based AGI may become a
> > good strategy.  However, I still think it will wind up being worse
> > than probabilistic logic based AGI, at least on von Neumann computers
> > (or networks thereof), because NN architectures don't take good

There won't be any AGI on any current memory-starved, few-threaded
hardware.
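'Memory-starved' in numbers (again, synapse count and weight size are
my assumptions):

  nodes = 1e9
  synapses_per_node = 1e3
  bytes_per_weight = 4

  state = nodes * synapses_per_node * bytes_per_weight
  print(f"{state / 2**40:.1f} TiB of synaptic state")   # ~3.6 TiB

  update_hz = 100   # one full pass over the weights per update
  print(f"{state * update_hz / 2**40:.0f} TiB/s of memory traffic")  # ~364

No commodity box holds terabytes of RAM, let alone streams hundreds of
TiB/s through its memory bus.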

> > advantage of the particular strengths of von Neumann computers (which

I presume you're including Harvard architectures along with von Neumann.
I must admit I do not see many advantages or strengths of sequential
machines for AGI.

> > can do very precise operations very quickly in serial, a quite
> > different strength from the human brain, and one which is in large

AGI is not cryptography.

> > part wasted when computers are used for NN computation)
> 
> Don't fully agree, but that is a separate issue.

-- 
Eugen* Leitl leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820            http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
