AGI,

 

Mike is saying the same thing I always say, only in different words. He is
suggesting that a machine cannot be intelligent without a presence in the
real world and experience of real things. I am saying a machine cannot be
intelligent without Physics. But in addition, I am saying that this is an
AGI blog and we are trying to build a machine. A machine is a physical
object and it must obey the laws of Physics. That is the crude reality, and
no amount of thinking will help.

 

 

Sergio

 

 

From: Mike Tintner [mailto:[email protected]] 
Sent: Saturday, September 15, 2012 8:40 AM
To: AGI
Subject: Re: [agi] Simplistic Test of Reason-Based Reasoning

 

Both a book and a computer running a program are inanimate objects - mere
tools which can, in certain circumstances, produce the illusion of
intelligence. Inanimate objects aren't intelligent.

 

Neither has the slightest capacity for real world intelligence or real world
reasoning - because they do not have a body and therefore the "animate"
capacity to move about the real world, observe it, investigate it, and
gather new, fresh information about it.

 

The idea - your idea - that a machine can be intelligent - solve real world
problems - about trees, rocks, houses, chairs, cars, traffic, cities,
people, economics or politics et al - without a presence in the real world
and experience of real things is a fantastic delusion without a scintilla of
evidence - more fantastic than the most fantastic religious delusion.

 

(I can't, BTW, recall you ever discussing or thinking about any form of real
world intelligence - if you tried it you would realise just how fantastic a
delusion it is).

 

P.S. I guess you could call it the "dummy" delusion - the belief that a
ventriloquist's dummy can be alive and intelligent about everything - just
because the ventriloquist "breathed life" into it for a few minutes.

 

 

From: Jim Bromer <[email protected]>
Sent: Saturday, September 15, 2012 2:14 PM
To: AGI <[email protected]>
Subject: Re: [agi] Simplistic Test of Reason-Based Reasoning

 

No, it is not like saying that a book can evolve into a human being. Even a
child can see the difference between a computer and a book.

Jim Bromer

On Sat, Sep 15, 2012 at 4:16 AM, Mike Tintner <[email protected]>
wrote:

PM: One should seriously take a look at Apple's SIRI since a system like
that may evolve into an AGI if it is equipped with sufficient back end
services (i.e., actions).

 

This is an absolutely fantastic (but probably commonplace) delusion. It's
like saying - a book (and a book can be organized to function like a
program) can evolve into a human being (or an animal).

 

An infant can see the massive differences between a book and a human being,
but an awful lot of AGI-ers can't.





-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com
