In response to something I said about cross-generalization, John Rose replied:
"You can optimally compress some of the data all of the time, you can optimally
compress all of the data some of the time, but you can’t optimally compress all
of the data all of the time. It is what it is bruh."

Generalization is not as narrowly defined as you seem to think. This topic is
about making AI from a compressor, so I started thinking about
cross-generalization as a form of compression. Cross-generalization is a network
theory; it is not limited to uniform horizontal and vertical relations, a
filing-cabinet system, or anything tightly constrained in that way. The term
generalization covers variations of generality that include non-optimal
compression, cross-topical compression, and so on.

I do not think that "compression" per se is the basis of making AI (which is
directly related to the topic). However, I do believe that an AGI (or an
advanced AI) program would act like a compressor. I am also thinking of an
Artificial Artificial Neural Network, though I do not use that term literally.
I want to develop a discrete network that can create and include ANN-like
encodings. The idea I am trying to develop is a network that could be traversed
to yield direct insights but which could also act like an ANN. So I do not
spend much time on general optimal compressors as the basis for an AI device.
Instead, I am thinking about specialized relations that act to compress
concepts, parts of concepts, and so on.
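
As a minimal sketch of the kind of structure I mean (the names ConceptNode,
ConceptGraph, and the toy embeddings are purely illustrative, not a worked-out
design): each node carries both discrete, labeled relations that can be
traversed directly and a dense, ANN-like encoding that supports similarity
lookups.

import math
from dataclasses import dataclass, field


@dataclass
class ConceptNode:
    name: str
    embedding: list[float]            # dense, ANN-like encoding
    relations: dict[str, set[str]] = field(default_factory=dict)


class ConceptGraph:
    def __init__(self):
        self.nodes: dict[str, ConceptNode] = {}

    def add(self, name, embedding):
        self.nodes[name] = ConceptNode(name, embedding)

    def relate(self, src, label, dst):
        # A specialized relation compresses a pair of concepts into one edge.
        self.nodes[src].relations.setdefault(label, set()).add(dst)

    def traverse(self, start, label):
        # Discrete traversal: follow one relation type for a direct insight.
        return sorted(self.nodes[start].relations.get(label, set()))

    def nearest(self, name, k=1):
        # ANN-like behaviour: rank other concepts by embedding similarity.
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb) if na and nb else 0.0
        q = self.nodes[name].embedding
        others = [n for n in self.nodes.values() if n.name != name]
        others.sort(key=lambda n: cos(q, n.embedding), reverse=True)
        return [n.name for n in others[:k]]


g = ConceptGraph()
g.add("dog",    [0.9, 0.1, 0.0])
g.add("wolf",   [0.8, 0.2, 0.0])
g.add("chair",  [0.0, 0.1, 0.9])
g.add("animal", [0.5, 0.5, 0.0])
g.relate("dog", "is-a", "animal")
print(g.traverse("dog", "is-a"))   # ['animal'] -- direct, discrete insight
print(g.nearest("dog", k=1))       # ['wolf']   -- fuzzy, ANN-like recall

The point of putting both on the same node is that the discrete edges stay
cheap to traverse and inspect, while the encodings give the same network the
soft, ANN-like behaviour without turning the whole thing into a black box.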