In other words, Vladimir, you are suggesting that an AGI must be at some
level controlled by humans, and therefore not 'fully embodied', in order to
prevent a non-friendly AGI as the outcome.

Therefore humans must somehow be able to control its goals, correct?

Now, what if controlling those goals turned out to make creating an AGI
impossible? Would you then suggest we should not create one at all, in order
to avoid the disastrous consequences you mentioned?

Valentina


