There are systems that take a 3D picture of the real world, work out the
desired state, try different action sequences in an internal 3D simulator
until one achieves that state, and then execute the solution in the real
world. But that is simulation, not emotion.
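
For concreteness, a minimal sketch of that sense-simulate-act loop in
Python, assuming a hypothetical simulate(state, action) world model plus
goal_test and action_space supplied by the robot's perception and task
layers:

    import random

    def plan_by_simulation(current_state, goal_test, simulate, action_space,
                           horizon=10, n_candidates=1000):
        """Random-shooting planner: roll candidate action sequences through
        the internal simulator and return the first sequence whose imagined
        end state satisfies the goal. Nothing here touches the real world."""
        for _ in range(n_candidates):
            sequence = [random.choice(action_space) for _ in range(horizon)]
            state = current_state
            for action in sequence:
                state = simulate(state, action)   # imagined transition only
            if goal_test(state):
                return sequence    # a solution found purely in simulation
        return None                # no imagined sequence reached the goal

    # Only after planning succeeds would the robot act in the real world:
    #     for action in plan_by_simulation(...):
    #         robot.execute(action)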

From: [email protected]
To: [email protected]
Subject: Re: [agi] Why Emotions are too sophisticated for early AGI robots
Date: Thu, 20 Jun 2013 19:36:48 +0100







Interesting to see this stuff – but it is strictly simple narrow AI –
blocks-world stuff, with blocks, cylinders and the like – objects whose
paths are not difficult to simulate/predict. And there is no attempt here
to explain how the approach could be *generalised* to simulating/predicting
other objects and scenes.
 
I’m still not sure, BTW, what form the actual “simulation” here takes –
perhaps you could explain. Is the simulation in the computer/robot’s brain,
for example, a scenic one, like the scenes that confront you every second
in waking consciousness and dream consciousness – an integrated picture of
objects in a field? Or is it an *analysis* of a scene and the objects
within a field? Something very different.
 
Again and again, I find that programmers fail to distinguish between what
is going on in the computer’s brain and what is going on in their own human
brains as they survey the results of the computer’s work. It’s usually the
human using the computer/robot who is doing the real seeing and the real
simulation, not the machine.


 

From: Piaget Modeler
Sent: Thursday, June 20, 2013 6:53 PM
To: AGI
Subject: RE: [agi] Why Emotions are too sophisticated for early AGI robots
 

Yiannis Demiris has done work in this area:  
http://www.iis.ee.ic.ac.uk/yiannis/JohnsonDemirisTAROS05.pdf 
 
As have others.  Just search for articles about "mental simulation" or 
"forward models". 
 
Cheers,
 
~PM




From: [email protected]
To: [email protected]
Subject: Re: [agi] Why Emotions are too sophisticated for early AGI robots
Date: Thu, 20 Jun 2013 16:38:05 +0100






PM: "Running projective 
movies" is "mental simulation", which can already be done 
by 
computers.
 
Example? I think you’ll find that computers can truly simulate with movies
to the same extent that they can truly understand and talk language – not
at all. Some minimal appearances, but no AGI realities.


 

From: Piaget Modeler
Sent: Thursday, June 20, 2013 4:12 PM
To: AGI
Subject: RE: [agi] Why Emotions are too sophisticated for early AGI robots
 

Fear is avoidance behavior and pleasure is pursuit behavior. Computer
programs can already reject or pursue goals.
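
In that operational spirit, a minimal sketch of valence-driven goal
selection (the goal names and scores below are invented for illustration):

    # Operationalized "emotion" as appraisal: each candidate goal gets a
    # valence score; negatively appraised goals are avoided ("fear") and
    # positively appraised ones pursued ("pleasure").
    appraisals = {
        "approach_food":  +0.8,   # pleasure: pursue
        "approach_cliff": -0.9,   # fear: avoid
        "inspect_novel":  +0.2,
    }

    pursued = sorted((g for g, v in appraisals.items() if v > 0),
                     key=lambda g: appraisals[g], reverse=True)
    avoided = [g for g, v in appraisals.items() if v <= 0]
    print("pursue:", pursued)   # ['approach_food', 'inspect_novel']
    print("avoid:", avoided)    # ['approach_cliff']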

 
"Running projective movies" is "mental simulation", which can already be 
done 
by computers. 

I think it's hard for you, Mike, because you have a vague definition of
emotion. If a researcher operationalizes his definition, then he can create
something that has emotion. By the way, many researchers have already
operationalized the definition of emotion.
 
~PM
 
------------------


> From: [email protected]
> To: [email protected]
> Subject: [agi] Why Emotions are too sophisticated for early AGI robots
> Date: Thu, 20 Jun 2013 10:10:59 +0100
> 
> Ironically, given Ben's post today, I was just thinking about emotions and
> AGI robots - because I was on a video conference this week with Robert
> Wenzel, who, also inspired by David Hanson, has some kind of AGI project
> that wants to give robots emotions.
>
> Nah, way too sophisticated, I said - you always have to look to evolution -
> and you see that evolution only introduces emotions down the line. I didn't
> immediately have a precise reason why, though I knew it had something to do
> with the complexity of journeys/activities that a creature undertakes. The
> more complex the creature, the more complex its journeys/activities.
>
> The more precise reason, I now realise, is that emotions demand *great
> powers of reflection* - projective reflection of what WILL happen.
>
> Take simple basic emotions like fear (or pleasure).
>
> Ideally, you want a robot that can be afraid - afraid of a predator, say,
> or simply of falling off a cliff edge.
>
> When you see a predator, the predator isn't actually doing anything to you.
> You're afraid that he WILL do something to you. Ditto, on the cliff edge,
> you're not actually falling or incurring injury. You're afraid that you
> WILL fall off it.
>
> Emotions then involve the capacity to run *projective movies* of what will
> happen - the predator attacking you, your falling off the cliff. In
> addition, they require a bicameral mind, because the movies have to be run
> most of the time in an unconscious mind, while the conscious mind attends
> to the immediate situation.
>
> Many of you guys will think you can achieve this by just attaching a few
> symbols to the brain, and linking some reflex reactions. No. You have to be
> able both to learn and unlearn new emotions - and that can only happen by
> storing and rerunning movies. Emotions are extremely sophisticated stuff.
>
> First we need general robots that can, like paramecia or simple organisms,
> creatively plot and execute many - potentially infinite - different paths
> and routes to goals, by contrast with present narrow AI robots that only
> have a few avenues. True autonomous mobile robots. Emotions - and emotive
> robots - will come much later.
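
Read charitably, the "bicameral" arrangement Mike describes resembles a
background world-model rollout running alongside a reactive foreground
loop. A toy sketch of that division of labour, with get_state, simulate,
is_dangerous, and act all hypothetical stand-ins:

    import queue
    import threading
    import time

    alarms = queue.Queue()

    def unconscious_simulation(get_state, simulate, is_dangerous):
        """Background 'projective movie': keep rolling the world model
        forward from the current state and raise an alarm (the 'fear'
        signal) whenever an imagined future looks dangerous."""
        while True:
            state = get_state()
            for _ in range(20):          # imagine 20 steps ahead
                state = simulate(state)
                if is_dangerous(state):
                    alarms.put(("fear", state))
                    break
            time.sleep(0.05)

    def conscious_loop(act):
        """Foreground loop: attend to the immediate task, but switch to
        avoidance whenever the background simulation raises an alarm."""
        while True:
            try:
                emotion, imagined_state = alarms.get_nowait()
                act("avoid")             # react to an *imagined* threat
            except queue.Empty:
                act("continue_task")
            time.sleep(0.05)

    # threading.Thread(target=unconscious_simulation,
    #                  args=(get_state, simulate, is_dangerous),
    #                  daemon=True).start()
    # conscious_loop(act)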


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
