Dr. Heger,

Point #3 is brilliantly stated.  I couldn't have expressed it better.  And
I know this because I've been trying to do so, in slightly broader terms,
for months on this list.  Insofar as providing an AGI with a human-biased
sense of space and time is required to create a human-like AGI (what I
prefer to call AG*H*I), I agree it is a mistake.

More generally, as long as AGI designers and developers insist on
simulating human intelligence, they will have to deal with the AI-complete
problem of natural language understanding.  Looking for new approaches to
this problem, many researchers (including prominent members of this list)
have turned to "embodiment" (or "virtual embodiment") for help.  IMHO, this
is not a sound tactic because human-like embodiment is, itself, probably an
AI-complete problem.

Insofar as achieving human-like embodiment and human natural language
understanding is possible, pursuing them is also a very dangerous strategy.
The process of understanding human natural language through human-like
embodiment will, of necessity, lead to the AGHI developing a sense of self.
After all, that's how we humans got ours (except, of course, the concept
preceded the language for it). And look how we turned out.

I realize that an AGHI will not "turn on us" simply because it understands
that we're not (like) it (i.e., just because it has acquired a sense of
self). But it could. Do we really want to take that chance? Especially when
it's not necessary for human-beneficial AGI (AGI without the "silent H")?

Cheers,
Brad


Dr. Matthias Heger wrote:
> 1. We feel ourselves not exactly at a single point in space. Instead, we
> identify ourselves with our body, which consists of several parts that are
> already at different points in space. Your eye is not at the same place as
> your hand.
> I think this is proof that a distributed AGI will not need to have a
> completely different conscious state for a model of its position in space
> than we already have.
> 
> 2. But to a certain degree you are of course right that we have a map of
> our environment and we know our position (which is not a point, because of
> 1) in this map. In the brain of a rat there are neurons which each
> represent a position in the environment. Researchers could predict the
> position of the rat just by looking into the rat's brain.
> 
> 3. I think it is extremely important that we give an AGI none of the bias
> about space and time that we seem to have. Our intuitive understanding of
> space and time is useful for our life on earth, but it is completely wrong,
> as we know from the theory of relativity and quantum physics.
> 
> -Matthias Heger
> 
> 
> 
> -----Original Message-----
> From: Mike Tintner [mailto:[EMAIL PROTECTED]
> Sent: Saturday, 4 October 2008 02:44
> To: agi@v2.listbox.com
> Subject: [agi] I Can't Be In Two Places At Once.
> 
> The foundation of the human mind and system is that we can only be in one
> place at once, and can only be directly, fully conscious of that place. Our
> world picture, which we and, I think, AI/AGI tend to take for granted, is
> an extraordinary triumph over that limitation: our ability to conceive of
> the earth and universe around us, and of societies around us, projecting
> ourselves outward in space, and forward and backward in time. All animals
> are similarly based in the here and now.
> 
> But, if only in principle, networked computers [or robots] offer the
> possibility for a conscious entity to be distributed and in several places
> at once, seeing and interacting with the world simultaneously from many
> POVs.
> 
> Has anyone thought about how this would change the nature of identity and 
> intelligence? 
> 

