Kaj,

On 5/6/08, Kaj Sotala <[EMAIL PROTECTED]> wrote:
>
> Certainly a rational AGI may find it useful to appear irrational, but
> that doesn't change the conclusion that it'll want to think rationally
> at the bottom, does it?


The concept of rationality contains a large social component. For example,
the Eastern concept of "face" compels actions there that might seem quite
irrational to us. Polygamy works well under Islam but fails here, because
of social perceptions and expectations. Sure, our future AGI must
calculate these things, but I suspect that machines will never understand
people as well as people do, and hence will never become a serious social
force.

Take, for example, the very intelligent people on this forum. We aren't any
more economically successful in the world than people with half our
average IQ - or else we would be too busy to make all of these postings. If
you are so smart, then why aren't you rich? Of course you know that you have
directed your efforts in other directions, but is that path really worth
more *to you* than the millions of dollars that you may have "left on the
table"?

The whole question of goals also contains a large social component. What is
a LOGICAL goal?!

Steve Richfield

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
http://www.listbox.com/member/?member_id=8660244&id_secret=101455710-f059c4
Powered by Listbox: http://www.listbox.com