For anyone still wondering why we need compression if we are building an AI 
brain, let me explain again. Lossless compression lets you shrink 100MB down to 
14.8MB (this has been done on a Wikipedia dataset), and it decompresses back 
exactly by using a predictor of the next bit/letter/word/phrase to regenerate 
the data. There are patterns in the 100MB: both dogs and cats eat, drink, 
sleep, etc., so the model can group related words and compress better. That 
lets it regenerate not just the full 100MB, but other, related data as well. 
So it enables an AI to understand the data's patterns and generate future 
discoveries that actually follow from its questions. The same machinery covers 
translation too: the model first recognizes the context, then predicts the 
next letter, so the prediction process doubles as translation (ex. my cats 
eat = my dogs ?_?).
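To make the prediction-equals-compression point concrete, here is a minimal sketch (not the actual Hutter Prize compressors, which are far more sophisticated): an adaptive next-character predictor with Laplace smoothing, scoring each character at -log2(p) bits, which is the ideal size an arithmetic coder paired with this predictor would achieve. The decoder runs the identical predictor, so the text is recovered exactly. The sample string and the order-2 context length are my own illustrative choices.

```python
import math
from collections import defaultdict

def code_length_bits(text, order=1):
    """Ideal compressed size of `text`, in bits, under a simple adaptive
    next-character predictor (Laplace smoothing). An arithmetic coder
    paired with this predictor gets within a couple of bits of this
    total, and the decoder, running the same predictor, regenerates the
    text exactly -- lossless."""
    alphabet = sorted(set(text))
    counts = defaultdict(lambda: defaultdict(int))  # context -> next char -> count
    totals = defaultdict(int)                       # context -> total count
    bits, ctx = 0.0, ""
    for ch in text:
        # smoothed probability of this character given the recent context
        p = (counts[ctx][ch] + 1) / (totals[ctx] + len(alphabet))
        bits += -math.log2(p)                       # ideal cost of this symbol
        counts[ctx][ch] += 1
        totals[ctx] += 1
        ctx = (ctx + ch)[-order:] if order else ""  # slide the context window
    return bits

# Repetitive text full of shared patterns ("cats" and "dogs" both eat/sleep):
patterned = "my cats eat. my dogs eat. my cats sleep. my dogs sleep. " * 4
print(code_length_bits(patterned, order=0))  # no context: compresses poorly
print(code_length_bits(patterned, order=2))  # exploits patterns: far fewer bits
```

The longer the context the predictor sees, the more of the "dogs and cats both eat" regularity it captures, and the fewer bits each character costs.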
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T65747f0622d5047f-Mcbc1fc9f4a03d15e6d80419d