On 3/13/07, YKY (Yan King Yin) <[EMAIL PROTECTED]> wrote:
> 1. re #4: As an example, the logical term "chair" is defined, as a
> logical rule, by other logical terms like "edges", "planes", "blocks",
> etc. Sensory perception is a process of *applying* such rules;
> algorithmically this is known as *pattern matching*: we're given a set
> of low-level features (edges etc) and we need to search for (match)
> the description of a chair. The computational bottleneck here is that
> there can be a huge number of objects-to-be-recognized, such as chair,
> table, car, human,... a gadzillion things. This classic problem is
> *already* addressed by the Rete algorithm.
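The matching problem described above can be sketched in a few lines of Python. All the feature names and rules below are invented for illustration, and this is deliberately the naive approach, not Rete itself - Rete's contribution is precisely to avoid this brute-force re-testing of every rule on every update:

```python
# Toy illustration of recognition as rule matching: each object is
# defined by a set of required low-level features, and recognition means
# finding which definitions the current feature set satisfies.
# (Hypothetical names; a real system would use structured rules.)

OBJECT_RULES = {
    "chair": {"horizontal_plane", "vertical_plane", "legs"},
    "table": {"horizontal_plane", "legs"},
    "block": {"edges", "planes"},
}

def match_objects(features):
    """Naively test every rule against the feature set.

    This costs O(rules * features) on every update - the bottleneck
    mentioned above. Rete instead compiles the rules into a shared
    network so each new feature is tested once and partial matches
    are cached between updates.
    """
    return sorted(name for name, required in OBJECT_RULES.items()
                  if required <= features)

print(match_objects({"horizontal_plane", "legs"}))
# -> ['table']
print(match_objects({"horizontal_plane", "vertical_plane", "legs"}))
# -> ['chair', 'table']
```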
Right. More generally, we do need actual algorithms to handle problems
like this; pure logical deduction isn't enough.

> 2. At the very lowest level (from pixels to edge detection, blob
> detection, etc) I think we must use neural-like, specialized
> algorithms.
Yep, and not just at the very lowest level either; there'll be lots of
situations where specialized algorithms (some reminiscent of neurons,
some not) will be appropriate.

> Using a logical representation is not practical here.
This doesn't follow! The input from a video camera to an edge detection
algorithm, for example, can consist of 15,000,000 logical assertions
about the RGB values of the pixels; the edge detection code will
probably be cleaner and more modular, and certainly easier to integrate
with the rest of the system, when the input is in this form. (Later,
when optimizing for speed, we might want this to get compiled to an
array-of-floats representation - but that should be a compiler flag; it
should make no difference to the semantics.)

> But we lose nothing, because this level is sub-conscious, like the
> retina is to the brain.
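The pixels-as-assertions point can be sketched concretely. Everything here is an illustrative assumption - the assertion layout (a mapping from (x, y) to an RGB triple), the tiny 4x4 frame, and the luminance threshold - but it shows the key property: the edge detector's semantics depend only on the assertions, not on whether they are later compiled down to a flat float array:

```python
# A frame as a set of logical assertions: (x, y) -> (r, g, b).
# Left half dark, right half bright, so there is one vertical edge.
frame = {(x, y): (10, 10, 10) if x < 2 else (250, 250, 250)
         for x in range(4) for y in range(4)}

def luminance(rgb):
    """Standard Rec. 601 brightness weighting of an RGB triple."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def edge_at(assertions, x, y, threshold=50.0):
    """True if the horizontal brightness change at (x, y) exceeds threshold."""
    here, right = assertions.get((x, y)), assertions.get((x + 1, y))
    if here is None or right is None:
        return False
    return abs(luminance(here) - luminance(right)) > threshold

edges = [(x, y) for (x, y) in frame if edge_at(frame, x, y)]
print(sorted(edges))
# -> [(1, 0), (1, 1), (1, 2), (1, 3)]  (the boundary between x=1 and x=2)
```

The same `edge_at` would run unchanged if `assertions` were a view over a packed float array - which is exactly the compiler-flag point.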
But just because low-level visual processing isn't consciously
accessible to humans doesn't mean the same must be true of an AI.
Computers have lots of weaknesses compared to the human brain, but they
also have some strengths, and we should take advantage of those.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303
