Martin C. Martin wrote:
...
> But say that to most AI researchers, and they'll stare at you
> uncomprehendingly. They want a well defined problem, such as using all
> users' purchases at Amazon to suggest other purchases for a single user.
A while back, a DARPA program manager (an agents person, at that) sent out a notice to his program that the textbook on agents he wrote before moving to DARPA was available on Amazon. The beauty of this was the "people who purchased this" recommendations, which started with "Clean Underwear". He reported this, and I subsequently checked: sure enough, Amazon recommended that purchasers of his book would also like to purchase clean underwear. I suspect this was the default for something that had no purchasers, showing the sense of humour of the programmers.

However, I have seen many other recommendations from that type of AI that were nearly as absurd. Clearly, the absurdity arises because these systems do not model the real world; they just mine data blindly. Those recommendation systems clearly do not pass the Turing test.

--
Ray Parks                     [EMAIL PROTECTED]
IDART Project Lead            Voice: 505-844-4024
IORTA Department              Mobile: 505-238-9359
http://www.sandia.gov/scada   Fax: 505-844-9641
http://www.sandia.gov/idart   Pager: 800-690-5288

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
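For readers curious what "data mining blindly" looks like in practice, here is a minimal sketch of the co-purchase heuristic the anecdote describes, including a hard-coded fallback of the kind that could plausibly produce the "Clean Underwear" recommendation. Everything here is hypothetical: the function, the baskets, and the default item are invented for illustration; Amazon's actual recommender is not public and is far more elaborate.

```python
from collections import Counter

def recommend(item, baskets, default="Clean Underwear", k=3):
    """Rank items by how often they co-occur with `item` in purchase baskets.

    Falls back to `default` when the item has no co-purchasers -- the sort
    of placeholder that yields absurd suggestions for an unsold book.
    """
    counts = Counter()
    for basket in baskets:
        if item in basket:
            for other in basket:
                if other != item:
                    counts[other] += 1
    if not counts:
        return [default]
    return [name for name, _ in counts.most_common(k)]

# Toy purchase data (invented):
baskets = [
    {"agents textbook", "AI: A Modern Approach"},
    {"agents textbook", "AI: A Modern Approach", "robot kit"},
]

print(recommend("agents textbook", baskets))
print(recommend("unsold book", baskets))  # no purchasers -> the default
```

Note that nothing in the co-occurrence counts encodes what the items *are*; the system has no model of the world, only correlations, which is exactly why its failures look so strange to a human reader.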
