I have spent about 12 hours on my AI / AGI project so far this year, so I
decided to end my self-imposed ban from the group since participating in
this group seems to motivate me to work on my project.

I think that I can use language to instruct an AI program to learn about
the things that I can communicate through language. The program would be
able to pick up factoids via simple language (similar to regular strings
and context-free strings) but it would also be able to try to utilize
knowledge and to acquire more insight through more complicated language
(similar to context-sensitive strings). For example (an abstract one): if the
program were trying to understand something that did not conform to facts it
was already familiar with, expressed in higher-level sentential structures
that it knew, it could fall back to progressively simpler sentences to try to
fit the new facts into the knowledge that it had previously acquired. Since I
would be able to detect this, I, as programmer-teacher, should be able to
make a good guess about the kind of knowledge that it might use effectively
at that point.
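
A minimal sketch of that fallback step, in Python. Everything here (the
KnowledgeBase class, the absorb method, the toy "tweety" facts) is just an
illustrative placeholder I made up for this sketch, not an actual design:

```python
# Sketch: accept simple factoids directly; when a complex claim
# partly conflicts with prior knowledge, fall back to its simpler
# component factoids and keep only the compatible ones.

class KnowledgeBase:
    """Stores simple (subject, relation, value) factoids."""

    def __init__(self):
        self.facts = set()

    def add_factoid(self, subject, relation, value):
        # Context-free-style intake: accept a simple factoid as-is.
        self.facts.add((subject, relation, value))

    def conflicts(self, subject, relation, value):
        # Toy conflict test: we already hold a different value for
        # the same subject/relation pair.  (Real relations can be
        # multi-valued; this is only for illustration.)
        return any(s == subject and r == relation and v != value
                   for s, r, v in self.facts)

    def absorb(self, complex_claim):
        """Try a complex claim; keep the compatible component
        factoids and return the rejected ones for the teacher
        to inspect and correct."""
        compatible = [f for f in complex_claim if not self.conflicts(*f)]
        rejected = [f for f in complex_claim if self.conflicts(*f)]
        for f in compatible:
            self.add_factoid(*f)
        return rejected

kb = KnowledgeBase()
kb.add_factoid("tweety", "is_a", "bird")
kb.add_factoid("tweety", "can", "fly")

# A complex claim, part of which conflicts with prior knowledge:
claim = [("tweety", "color", "yellow"), ("tweety", "can", "swim")]
rejected = kb.absorb(claim)
print(rejected)  # the part the programmer-teacher would see
```

The point of returning the rejected factoids is exactly the detection step
described above: the teacher sees where the fallback bottomed out and can
guess what knowledge to supply next.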

One problem is that word-concepts may change, not only in their application
but in the level of abstraction and particularization. A word or a phrase
can even be both more particular and more general at the same time: an
exemplar, for instance, is a particular instance that also stands for a whole
category. For a program to benefit from this kind of
abstraction-generalization polymorphism ("many shapes", not OOP polymorphism)
it has to be capable of both context-free-like communication and
context-sensitive-like communication. It has to be able to plug new ideas
into preexisting knowledge in simple ways, but then it has to try to derive
some guesses about those new ideas in more sophisticated ways.
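
Here is a toy sketch of a word-concept carrying both readings at once (the
Concept class, the lexicon, and the robin example are all invented for this
illustration):

```python
# Sketch: a word-concept record that points "up" to a more general
# category and "down" to more particular instances, so the same word
# can be read at either level of abstraction.

class Concept:
    def __init__(self, name, instance_of=None, generalizes=None):
        self.name = name
        self.instance_of = instance_of          # more general reading
        self.generalizes = generalizes or []    # more particular readings

lexicon = {}

def define(name, instance_of=None, generalizes=None):
    concept = Concept(name, instance_of, generalizes)
    lexicon[name] = concept
    return concept

# An exemplar: one particular robin that also stands for birds in general.
define("bird")
define("robin#1", instance_of="bird")
define("exemplar-robin", instance_of="bird", generalizes=["robin#1"])

def readings(name):
    """Return whichever readings of a word-concept exist."""
    concept = lexicon[name]
    out = {}
    if concept.instance_of:
        out["general"] = concept.instance_of
    if concept.generalizes:
        out["particular"] = concept.generalizes
    return out

print(readings("exemplar-robin"))
```

Plugging a new idea in "the simple way" would mean attaching it at one level;
the more sophisticated guessing would mean following both links and checking
what each reading implies.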

This is a very simple explanation of how my program should work. I feel that
AGI has failed largely because knowledge is too complicated. But I want to
show that a basic strategy which includes the kind of thing I mentioned here
makes sense and should work - at least until the knowledge base becomes too
complicated.
Jim Bromer



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now