To get any further with "feelings" you again need a better definition 
and examples of what you are dealing with.

In humans, most "feelings" and emotions are brought about by chemical changes 
in the body, yes?  From there it becomes "knowledge" in the brain, which we 
use to make decisions and act upon.

Is there more to it than that?  (simplified overview)

Simply replacing the chemical parts with machine code easily allows an AGI to 
feel most of these feelings.  Mechanical sensors would allow a robot to 
"feel"/sense being touched or hit, and a brain could react to this.  Even a 
simulated AGI virtual agent could and does indicate a preference for NOT being 
shot, or being in pain, and running away, and could easily show a preference 
"like"/feeling for certain faces or persons it finds 'appealing'.  
   This can all be done using algorithms and the learned / preferred behavior 
of the bot, with no mysterious 'extra' bits needed.
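The learned-preference idea above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the `ToyAgent` class and its simple update rule are my own assumptions for the example, not anyone's actual implementation): an agent receives reward signals for stimuli and builds up aversions or preferences it can act on.

```python
class ToyAgent:
    """Minimal sketch of a simulated agent that learns stimulus preferences.

    Hypothetical example: a running preference score per stimulus is
    updated toward each observed reward; negative scores mean aversion.
    """

    def __init__(self):
        self.preferences = {}  # stimulus -> learned preference score

    def sense(self, stimulus, reward):
        # Simple exponential-moving-average learning rule (an assumption
        # for illustration): move the stored score toward the reward.
        old = self.preferences.get(stimulus, 0.0)
        self.preferences[stimulus] = old + 0.5 * (reward - old)

    def reacts_aversively(self, stimulus):
        # The agent "dislikes" (would avoid) stimuli with negative value.
        return self.preferences.get(stimulus, 0.0) < 0


agent = ToyAgent()
for _ in range(10):
    agent.sense("being_hit", reward=-1.0)      # painful input
    agent.sense("friendly_face", reward=1.0)   # pleasant input

print(agent.reacts_aversively("being_hit"))      # True: learned aversion
print(agent.reacts_aversively("friendly_face"))  # False: learned liking
```

Nothing here is mysterious: the "feeling" is just a learned value attached to a sensed input, which then drives behavior such as running away.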

Many people have posted and argued the ambiguous statement:
  "But an AGI can't feel feelings."
I'm not really sure what this kind of sentence means, because we can't even say 
whether or how "humans feel feelings".
  If we can define these in some way that is devoid of all logic, and names 
something that an AGI CAN'T do, I would be interested.

An AGI should be able to have feelings, and will benefit from having them: it 
will act and reason on them, and believe that it has them, and this will give 
it a greater range of abilities later in its life cycle.

James Ratcliff

Mark Waser <[EMAIL PROTECTED]> wrote:
>> Your brain can be simulated on a large/fast enough von Neumann
>> architecture.
> From the behavioral perspective (which is good enough for AGI) - yes,
> but that's not the whole story when it comes to human brain. In our
> brains, information not only "is" and "moves" but also "feels".

It's my belief/contention that a sufficiently complex mind will be conscious 
and feel -- regardless of substrate.

> It's meaningless to take action without feelings - you are practically
> dead - there is just some mechanical device trying to make moves in
> your way of thinking. But thinking is not our goal. Feeling is. The
> goal is to not have goal(s) and safely feel the best forever.

"Feel the best forever" is a hard-wired goal.  What makes you feel good are 
hard-wired goals in some cases and trained goals in other cases.  As I've 
said before, I believe that human beings have only four primary goals (being 
safe, feeling good, looking good, and being right).  The latter two, to me, 
are clearly sub-goals, but it's equally clear that some people have 
mistakenly raised them to the level of primary goals.

>> If you can't, then you must either concede that feeling pain is possible 
>> for a
>> simulated entity..
> It is possible. There are just good reasons to believe that it takes
> more than a bunch of semiconductor based slots storing 1s and 0s.

Could you specify some of those good reasons (i.e. why a sufficiently 
large/fast von Neumann architecture isn't sufficient substrate for a 
sufficiently complex mind to be conscious and feel -- or, at least, to 
believe itself to be conscious and believe itself to feel; a nasty thought 
twist? :->)?


-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?&;



_______________________________________
James Ratcliff - http://falazar.com
Looking for something...
 
