I haven't heard any really good criticisms of my AGI theories, but I did ask you 
to wait until I was finished writing the summary. One reasonable criticism is 
that my program would not be fast enough. I already knew that. I have recently 
explained that I want to make a demo of a limited AGI program within the next 
year. My goal is to get beyond other AGI programs of our time. I have some 
ideas that I believe should help me make a minor advancement in the field. 
However, I probably won't be able to get it done in a year; I would have to 
work full time on it. And I haven't solved the problems of AGI complexity.

A number of criticisms have been about the style of my presentation 
or about using the wrong word. There is nothing wrong with those criticisms, 
and I appreciate the editorial contributions, but those kinds of comments are 
not central to the subject matter.

Only one good question stands out in my mind: even if the text program knew 
something about cats, would it be able to infer that cats pounce if the 
necessary information was not in the program? 
The trouble with this as a criticism is that the issue is valid for all cases 
of mentation.  Does anyone who participates in this group know if a mountain 
lion purrs?  Does a mountain lion meow?  Does a kangaroo make some kind of 
vocalization? Most of you do not know the answers to those questions offhand. 
Yes, you might try making some inferences based on what you do know, but that 
does not mean your initial inferences would be correct. However, it also does 
not mean that you are incapable of intelligence.

So the question is not 
whether a text-based AGI program would be able to infer that a cat pounces, but 
how it might check its attempts to make inferences. My answer is that it would 
have to use some sort of trial-and-error method: see what kind of response it 
gets from its tentative inferences, and see whether those inferences can be 
used to explain things that it might observe in the IO 
fields.

No AGI program is going to work perfectly. But no intelligent being in our 
material world works perfectly; the difference is one of degree of aptitude.

Jim Bromer
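To make the trial-and-error idea above a little more concrete, here is one very 
loose sketch of what "tentative inference plus checking" could look like. This 
is not the author's program; every name and data structure here (known_facts, 
observations, the analogy rule) is a hypothetical assumption for illustration 
only. A guess is proposed by analogy to a known animal, then kept only if no 
observation from the IO fields contradicts it.

```python
# Hypothetical sketch, NOT the program described in the post.
# Facts the program already has, as (subject, behavior) pairs.
known_facts = {
    ("cat", "pounces"),
    ("lion", "pounces"),
    ("lion", "roars"),
}

# Observations arriving through the IO fields: (subject, behavior, seen?).
observations = [
    ("mountain lion", "pounces", True),
    ("mountain lion", "roars", False),
]

def tentative_inferences(subject, analog):
    """Guess that `subject` shares every known behavior of an analogous animal."""
    return {(subject, behavior) for (s, behavior) in known_facts if s == analog}

def survives_checking(inference, observations):
    """Trial and error: discard an inference that any observation contradicts."""
    subject, behavior = inference
    return all(
        seen for (s, b, seen) in observations if (s, b) == (subject, behavior)
    )

guesses = tentative_inferences("mountain lion", "lion")
kept = {g for g in guesses if survives_checking(g, observations)}
print(sorted(kept))  # → [('mountain lion', 'pounces')]
```

The initial guess that a mountain lion roars is wrong, exactly as the post 
anticipates; the point is that the check against observations lets the program 
discard it rather than requiring the first inference to be correct.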


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
