No examples of these miracle sub-progs? Wotta surprise. AI-ers generally abuse the word “novel” in their descriptions of what their programs can do – so much so that the hype has become almost reflex. But these progs never do meet truly novel challenges.
Ben made a similar claim to yours not so long ago re some kind of common mapping program. On analysis, he had to agree that it was not capable of finding new paths or territory it had not been primed to look for. These “novel” claims always turn out to be B.S. on analysis.

From: Jim Bromer
Sent: Sunday, December 16, 2012 9:17 PM
To: AGI
Subject: Re: [agi] Internal Representation

On Sun, Dec 16, 2012 at 2:12 PM, Mike Tintner <[email protected]> wrote:

> You can only have an algo for a journey if you have already made the journey, and can predict the territory – and have a route[s]-map. (That incl. “replotting alternative ways of getting there” after meeting unforeseen obstacles.) By extension, you can only have problems of complexity if you already have a known set of possible journeys/route-maps to consider (a la Travelling Salesman Problem). That’s why algos only work in artificial controlled environments like factories, labs and warehouses, which are fully known and have been artificially structured to be predictable.

Sub programs are able to operate to deal with novel situations like finding new paths through territory which wasn't fully known or artificially structured to be predictable.

Jim Bromer
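For what it's worth, the Travelling Salesman Problem mentioned in the quoted text does illustrate the point about pre-known route sets: a brute-force TSP solver can only rank tours over a distance matrix it is handed up front. The sketch below is mine, not from the thread; the function name `tsp_brute_force` and the 4-city `dist` matrix are purely illustrative.

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustively search every closed tour over a *known* distance matrix.

    Note the presupposition: the full set of cities and every pairwise
    distance are given in advance. The algorithm cannot discover a city
    or a road that is absent from `dist` -- it only reorders what it has.
    """
    n = len(dist)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):   # fix city 0 as the start
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

# A fully specified 4-city map -- the "artificial controlled environment".
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
tour, cost = tsp_brute_force(dist)
```

Whether a sub-program confronting territory that is *not* fully enumerated like this still counts as "the same kind of algorithm" is, of course, exactly what the two sides here disagree about.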
