Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Brad Paulsen
Charles, I don't think I've misunderstood what Turing was proposing. At least not any more than the thousands of other people who have written about Turing and his test over the decades: http://en.wikipedia.org/wiki/Turing_test http://www.zompist.com/turing.html (Twelve reasons to toss the

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Eric: Yes. An electronic mind need never forget important facts. It'd enjoy instant recall and on-demand instantaneous binary-precision arithmetic and all the other upshots of the substrate. On the other hand it couldn't take, say, morphine! It would though, presumably, have major problems

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Ben: but, from a practical perspective, it seems more useful to think about minds that are roughly similar to human minds, yet better adapted to existing computer hardware, and lacking humans' most severe ethical and motivational flaws. Well, a) I think that we now agree that you are engaged in a

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
On Sun, Aug 10, 2008 at 9:02 AM, Mike Tintner [EMAIL PROTECTED] wrote: Ben: but, from a practical perspective, it seems more useful to think about minds that are roughly similar to human minds, yet better adapted to existing computer hardware, and lacking humans' most severe ethical and

Re: [agi] brief post on possible path to agi

2008-08-10 Thread rick the ponderer
On 8/10/08, Matt Mahoney [EMAIL PROTECTED] wrote: rick the ponderer [EMAIL PROTECTED] wrote: Regarding competing to buy information - I'm not suggesting that at all, people would be competing to sell the services of their classifier (and shopping around for the best classifier to consume or
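
A minimal sketch of the kind of classifier marketplace rick seems to be describing (the names, scoring rule, and toy data below are illustrative assumptions, not anything from the thread): providers offer classifier services, and a consumer "shops around" by benchmarking each one on a small labelled sample and routing work to the best performer.

    # Illustrative sketch only: a toy marketplace where classifier services
    # are scored on a labelled sample and the best one is selected.
    from typing import Callable, Dict, List, Tuple

    Classifier = Callable[[str], str]          # maps an input text to a label

    def accuracy(clf: Classifier, sample: List[Tuple[str, str]]) -> float:
        """Fraction of the labelled sample the classifier gets right."""
        return sum(clf(x) == y for x, y in sample) / len(sample)

    def pick_best(market: Dict[str, Classifier],
                  sample: List[Tuple[str, str]]) -> str:
        """'Shop around': benchmark every offered classifier, keep the best."""
        return max(market, key=lambda name: accuracy(market[name], sample))

    # Hypothetical providers competing to sell classification services.
    market = {
        "alice_spam_v1": lambda text: "spam" if "winner" in text else "ham",
        "bob_spam_v2":   lambda text: "spam" if "$$$" in text else "ham",
    }
    sample = [("you are a winner", "spam"),
              ("lunch at noon?", "ham"),
              ("winner: claim $$$", "spam")]
    print(pick_best(market, sample))   # routes future work to the better seller
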

Re: [agi] brief post on possible path to agi

2008-08-10 Thread rick the ponderer
On 8/10/08, rick the ponderer [EMAIL PROTECTED] wrote: On 8/10/08, Matt Mahoney [EMAIL PROTECTED] wrote: rick the ponderer [EMAIL PROTECTED] wrote: Regarding competing to buy information - I'm not suggesting that at all, people would be competing to sell the services of their classifier

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Ben, Obviously an argument too massive to be worth pursuing in detail. But just one point - your arguments are essentially specialist, focussing on isolated anatomical rather than cognitive features (and presumably we (science) don't yet have the general, systemic overview necessary to

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
Agree that the human mind/brain has evolved to work reasonably effectively in a holistic way, in spite of the obvious limitations of various of its components... To give a more cognitive example of a needless limitation of the human mind: why can't we just remember a few hundred numbers in

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread William Pearson
2008/8/10 Mike Tintner [EMAIL PROTECTED]: Just as you are in a rational, specialist way picking off isolated features, so, similarly, rational, totalitarian thinkers used to object to the crazy, contradictory complications of the democratic, conflict system of decisionmaking by contrast with

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread wannabe
Interesting conversation. I wanted to suggest something about how an AGI might be qualitatively different from human. One possible difference could be an overriding thoroughness. People generally don't put in the effort to consider all the possibilities in the decisions they make, but computers
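
One way to picture the "overriding thoroughness" wannabe mentions (a toy sketch, not anything proposed in the thread): a machine can cheaply enumerate every combination of options in a decision and score them all, where a person would typically sample only a few.

    # Toy sketch of exhaustive consideration of possibilities: enumerate
    # every combination of options and score each one. The options and
    # scoring below are made up for illustration.
    from itertools import product

    options = {
        "route":   ["highway", "back roads"],
        "depart":  ["7am", "8am", "9am"],
        "vehicle": ["car", "bike"],
    }

    def score(choice: dict) -> float:
        """Arbitrary illustrative utility; a real agent would model this."""
        s = 2.0 if choice["route"] == "highway" else 1.0
        s += {"7am": 1.5, "8am": 1.0, "9am": 0.2}[choice["depart"]]
        s += 0.5 if choice["vehicle"] == "bike" else 0.0
        return s

    keys = list(options)
    all_choices = [dict(zip(keys, combo)) for combo in product(*options.values())]
    best = max(all_choices, key=score)      # every possibility was considered
    print(len(all_choices), best)           # 12 combinations, best one printed
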

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
And I've said it before, but it bears repeating in this context. Real intelligence requires that mistakes be made. And that's at odds with regular programming, because you are trying to write programs that don't make mistakes, so I have to wonder how serious people really would be about

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread wannabe
me: And I've said it before, but it bears repeating in this context. Real intelligence requires that mistakes be made. And that's at odds with regular programming, because you are trying to write programs that don't make mistakes, so I have to wonder how serious people really would be about

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
yes. This is one of the reasons why I like virtual world and game AI as a commercial vehicle for the popularization and monetization of early-stage AGI's. No one cares that much if a game AI occasionally does something dumb. It may even be considered charmingly funny. Much more so than if the

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
On Sun, Aug 10, 2008 at 5:52 PM, Mike Tintner [EMAIL PROTECTED] wrote: Will, Maybe I should have explained the distinction more fully. A totalitarian system is one with an integrated system of decisionmaking, and unified goals. A democratic, conflict system is one that takes decisions with

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Ben: By true rationality I simply mean making judgments in accordance with probability theory based on one's goals and the knowledge at one's disposal. Which is not applicable to AGI problems, which are wicked and ill-structured, and where you cannot calculate probabilities, and are not sure of
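
For readers unfamiliar with the decision-theoretic sense of rationality Ben invokes here, a minimal sketch (illustrative only; the hypotheses, numbers, and utilities are invented, not taken from the thread): update beliefs with Bayes' rule, then pick the action with the highest expected utility given those beliefs and one's goals.

    # Illustrative sketch of "judgments in accordance with probability theory
    # based on one's goals and knowledge": a Bayesian update followed by an
    # expected-utility choice. All numbers are made up.

    def bayes_update(prior, likelihoods, evidence):
        """Return posterior P(hypothesis | evidence) for each hypothesis."""
        unnorm = {h: prior[h] * likelihoods[h][evidence] for h in prior}
        z = sum(unnorm.values())
        return {h: p / z for h, p in unnorm.items()}

    def best_action(posterior, utility, actions):
        """Pick the action maximizing expected utility under the posterior."""
        eu = {a: sum(posterior[h] * utility[(a, h)] for h in posterior)
              for a in actions}
        return max(eu, key=eu.get), eu

    # Hypothetical situation: is an interlocutor "angry" or "calm"?
    prior = {"angry": 0.3, "calm": 0.7}
    likelihoods = {"angry": {"shouts": 0.8, "silent": 0.2},
                   "calm":  {"shouts": 0.1, "silent": 0.9}}
    posterior = bayes_update(prior, likelihoods, "shouts")

    utility = {("apologize", "angry"): 5, ("apologize", "calm"): 1,
               ("argue", "angry"): -5,   ("argue", "calm"): 2}
    action, eu = best_action(posterior, utility, ["apologize", "argue"])
    print(posterior, action, eu)   # updated beliefs and the chosen action

Whether this calculus can even be applied to "wicked", ill-structured problems is exactly the point Mike disputes in the message above.
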

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
Or even simpler problems, like: how were you to handle the angry Richard recently? Your response, and I quote: Aaargh! (as in how on earth do I calculate my probabilities and Bayes? and which school of psychological thought is relevant here?) Now you're talking AGI. There is no rational or

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread John LaMuth
- Original Message - From: Ben Goertzel To: agi@v2.listbox.com Sent: Sunday, August 10, 2008 8:00 AM Subject: Re: [agi] The Necessity of Embodiment ... the best approaches are 1) wait till the brain scientists scan the brain well enough that, by combining appropriate

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
I am aware of textbook neuroscience, but it really does not tell you enough to let you emulate the brain. Neuroanatomy plus single-neuron understanding is not enough... OCP cannot benefit directly from detailed neuroscience knowledge as it is a different sort of AGI system ... but a closely

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread John LaMuth
I definitely agree w/ you, being that no-one is sure where even memory is stored in the brain, much less thought, consc. etc. Gross anatomy does prove useful when examining established input/output system modalities along the lines of a black-box model of the brain, wherein providing an