Ben,

The difference is between you engineering it, using the inference in your brain, and it engineering itself, using its own inference. Humans and animals have their own inference. They engineer situations as they confront them, prepare behaviors on the fly as best they can, and store those behaviors for later reuse, or simply because they represent learned knowledge. When you engineer a brain (or an AGI), you cannot confront all possible situations. Later, when the brain confronts a different situation, you are not there to help. That is the difference between AGI and narrow AI.

Second comment: well, maybe I misread something you posted. I remember you were doing something with centers of gravity for Destin. Would you care to explain what it was? And what is Destin?

Sergio

From: Ben Goertzel [mailto:[email protected]]
Sent: Tuesday, June 12, 2012 1:20 PM
To: AGI
Subject: Re: [agi] Representations and data structures

I don't agree that engineering a system intrinsically constitutes using "narrow AI"..... If so, then the human brain is also a "narrow AI", with its many specialized processes and structures, adapted to work well together...

Regarding your comment

> Are you really trying to address grounding by way of geometric figures and centers of gravity?

I have no idea what that means??

ben g

--
Ben Goertzel, PhD
http://goertzel.org

"My humanity is a constant self-overcoming" -- Friedrich Nietzsche

AGI | Archives <https://www.listbox.com/member/archive/303/=now> | Modify Your Subscription <http://www.listbox.com>
