One could use a semi-virtual domain for an AGI: i.e., a domain where the 
avatar of an AGI is embedded in a virtual world, but the avatar also 
interacts with a human user in the real world.  
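A minimal sketch of that semi-virtual setup might look like the following. All names here (VirtualWorld, SemiVirtualAgent, the toy grid world, the keyword-matching policy) are invented purely for illustration, not a real API:

```python
# Hypothetical sketch of a "semi-virtual domain": the agent's avatar acts in
# a simulated world, but human utterances from the real world arrive as an
# additional input channel alongside the simulated sensory data.

class VirtualWorld:
    """A toy one-dimensional grid world the avatar is embedded in."""
    def __init__(self, size=5):
        self.size = size
        self.avatar_pos = 0

    def step(self, action):
        # Move the avatar and return its new observation of the world.
        if action == "forward":
            self.avatar_pos = min(self.avatar_pos + 1, self.size - 1)
        elif action == "back":
            self.avatar_pos = max(self.avatar_pos - 1, 0)
        return {"pos": self.avatar_pos}

class SemiVirtualAgent:
    """An agent whose inputs mix virtual observations and human text."""
    def __init__(self, world):
        self.world = world
        self.log = []

    def act(self, human_message):
        # The human's real-world utterance is treated as just another input,
        # here mapped to an action by a trivial keyword rule.
        action = "forward" if "go" in human_message else "back"
        obs = self.world.step(action)
        self.log.append((human_message, action, obs))
        return obs
```

The point of the sketch is only that the real-world human channel and the virtual-world sensory channel feed the same agent; the policy itself is a placeholder.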

~PM.
Date: Sat, 20 Oct 2012 22:14:26 -0400
Subject: Re: [agi] Robot-Based AGI is Not a Necessity to Make Intelligence to 
Emerge
From: [email protected]
To: [email protected]



On Sat, Oct 20, 2012 at 12:30 PM, Alan Grimes <[email protected]> wrote:

Jim Bromer wrote:

There is no experimental basis for concluding that robotic interactions 
are necessary for genuine AGI to emerge.  And, based on insight about 
how computers actually work, the belief that computer-robotic 
connectivity is essential for AGI to emerge is not insightful.  Computer 
programs react to data whether they are in robots or not.  It's all just 
data.

Strictly speaking, that's true. However, I would argue that some kind of closed 
cybernetic loop is required for procedural learning to take place. I would also 
argue that it is much easier for the monkey programming the thing to think in 
terms of a robot than in some abstract domain.
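The closed cybernetic loop described above can be sketched in a few lines: the agent acts, senses the consequence of its own action, and adjusts its next act accordingly. The domain (driving an output toward a target value) is invented purely for illustration:

```python
# A minimal closed-loop sketch: action -> sensed consequence -> adjustment.
# This is the bare shape of procedural learning through a feedback loop,
# not a claim about any particular AGI architecture.

def closed_loop_learning(target=10.0, steps=50, lr=0.2):
    """Learn by acting and sensing the error caused by one's own actions."""
    output = 0.0
    for _ in range(steps):
        error = target - output   # sensed consequence of the last action
        output += lr * error      # adjust the next action (closes the loop)
    return output
```

With these parameters the error shrinks geometrically each pass through the loop, so the output converges toward the target; without the feedback term there is nothing for the procedure to learn from.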
We are able to take symbols further than monkeys, so we do not have to work 
with robots in order to avoid thinking in abstract domains.  We can deal with 
abstract ideas.  We can abstract from our knowledge of things to determine 
what is necessary and essential and what may enhance but is not necessary.  
Drinking water is not necessary for computer intelligence, but it is necessary 
for almost all kinds of living intelligence.  Just because knowledge of the 
world is intimately involved in our thinking does not mean that it is 
absolutely necessary for intelligence.  If direct interaction with the world 
through sensors comparable to our own senses were sufficient, then why hasn't 
any sensory AI project actually succeeded in producing general intelligence?  
We might imagine IO data worlds that are different from the world revealed to 
the human senses.  And we can think of sensors that humans lack without 
concluding that anything with those sensory modalities must be capable of far 
greater intelligence than us.  Although this sounds a little close to 
sophistry, it isn't, because it makes sense (so to speak).  We can load any 
kind of AI program up with all kinds of sensors, and you are still not going 
to get genuine intelligence without first discovering a way to do it.
 Jim Bromer

-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com
