2008/9/8 Benjamin Johnston <[EMAIL PROTECTED]>:
> Does this issue actually crop up in GA-based AGI work? If so, how did you
> get around it? If not, would you have any comments about what makes AGI
> special so that this doesn't happen?
Does it also happen in humans? I'd say yes, so it may be a problem we can't avoid, only mitigate: communities of intelligences sharing ideas can shake each other out of their local maxima, assuming they settle in different ones (different search landscapes and priors help with this). The community might reach a maximum as well, but the world isn't constant, so good ideas might not stay good; that changes the search landscapes, meaning a maximum may not be a maximum any longer.

Will
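P.S. A toy sketch of the "community shakes each other out of local maxima" idea, under assumed details I've made up for illustration (a 1-D inverted-Rastrigin landscape, stochastic hill climbers as the "intelligences", and periodic sharing where stuck climbers restart near the community's current best):

```python
import math
import random

def fitness(x):
    # Inverted 1-D Rastrigin: global maximum f(0) = 0, with many
    # deceptive local maxima near the other integers.
    return -(x * x - 10 * math.cos(2 * math.pi * x) + 10)

def climb(x, steps, sigma, rng):
    # Stochastic hill climbing: accept a perturbation only if it improves.
    # A lone climber like this gets stuck on whatever local peak is nearest.
    for _ in range(steps):
        cand = x + rng.gauss(0, sigma)
        if fitness(cand) > fitness(x):
            x = cand
    return x

def community_search(n_climbers=10, rounds=20, steps=100, sigma=0.1, seed=1):
    rng = random.Random(seed)
    # Different starting points stand in for different priors / landscapes.
    xs = [-4.5 + i for i in range(n_climbers)]
    for _ in range(rounds):
        xs = [climb(x, steps, sigma, rng) for x in xs]
        # "Sharing ideas": everyone compares notes, and climbers stuck on
        # clearly worse peaks restart near the community's current best.
        best = max(xs, key=fitness)
        xs = [x if fitness(x) >= fitness(best) - 1.0
              else best + rng.gauss(0, 0.5)
              for x in xs]
    return max(xs, key=fitness)

best_x = community_search()
print(best_x, fitness(best_x))
```

A single climber started at, say, x = 3.5 plateaus around fitness -9 or worse, while the community reliably ends up near the global peak at 0; of course, if every climber started in the same basin (shared priors), the sharing step would just entrench the same local maximum.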