RE: [agi] Early Apps.

2002-12-28 Thread Ben Goertzel
Gary Miller wrote: *** I guess I'm still having trouble with the concept of grounding. If I teach/encode a bot with 99% of the knowledge about hydrogen using facts and information available in books and on the web, it is now an idiot savant in that it knows all about hydrogen and nothing about

RE: [agi] Early Apps.

2002-12-28 Thread Gary Miller
Ben Goertzel wrote: I don't think that a pragmatically-achievable amount of formally-encoded knowledge is going to be enough to allow a computer system to think deeply and creatively about any domain -- even a technical domain about science. What's missing, among other things, is the intricate

[agi] Thinking may be overrated.

2002-12-28 Thread Kevin Copple
Perhaps thinking is overrated. It sometimes seems that the way progress is made, and lessons learned, is predominately by trial and error. Thomas Edison's light bulb is a good example, especially since it is the very symbol of an idea. From what I know, Edison's contribution was his desire to make the