We already know what some of the most advanced people see as the most
valuable thing, and most of us don't care about it ... why should that
change with AGI? How many people out there really care about their own
personal growth within our culture? AGI won't be able to do the job for
us, just as you cannot learn or grow on behalf of your children.
If the AGI system tells you to sit down and meditate, to "become
love", and to transcend this physical reality, will you take it more
seriously than if a Zen monk told you so? If so, why? Wouldn't you
rather believe that something went wrong and that the AGI system can't
possibly be that smart if all it tells you is to meditate and calm your
mind? ;-) What do you expect an AGI system (one that knows there is no
shortcut to personal growth) to tell you? That the ultimate goal
is to build a five-parsec-spanning pink unicorn made of computronium? To
create a gigantic Mindplex, which already exists anyway? To run our own
redundant virtual realities within a virtual reality?
I have yet to hear a genuinely interesting and holistic
expectation ... anyone? :-)
On 15.11.2014 16:54, Stanley Nilsen via AGI wrote:
> I am curious though at how one determines what constitutes "reward."
> Part of my interest in watching AI is because I want to know what the
> smartest "unit" in the world discovers, or considers, to be the
> ultimate reward / value.