Matt, what was the best compression you got using just the last 6-letter 
context on enwik8? 21 MB? 18 MB? And did you group related words? How many 
models were mixed, and what were they? I just want to know how low it can 
go *without* grouping related words.

And do you know how Shelwien's Green algorithm mixes its 17 models? Please 
explain it vividly/clearly if you do. Shelwien seems to have failed ten times 
explaining it to me lol. Maybe I can combine his approach with mine.
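For context on what "mixing" could mean here: I don't know Green's actual scheme, but a common approach in context-mixing compressors (e.g. the PAQ family) is logistic mixing, where each model's bit probability is stretched to the logit domain, combined with learned weights, and squashed back. The sketch below is an illustrative assumption, not Green's code; the class name, learning rate, and clamping bounds are all made up for the example.

```python
import math

def stretch(p):
    # logit: maps a probability in (0, 1) to the real line
    return math.log(p / (1.0 - p))

def squash(x):
    # inverse logit: maps a real number back to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

class LogisticMixer:
    """PAQ-style logistic mixing of n bit-prediction models (illustrative sketch)."""

    def __init__(self, n, lr=0.02):
        self.w = [0.0] * n   # one weight per model, adapted online
        self.lr = lr         # learning rate (assumed value)
        self.t = [0.0] * n   # stretched predictions from the last mix() call

    def mix(self, probs):
        # clamp to avoid infinite logits, stretch, then take a weighted sum
        self.t = [stretch(min(max(p, 1e-6), 1.0 - 1e-6)) for p in probs]
        return squash(sum(w * t for w, t in zip(self.w, self.t)))

    def update(self, p_mixed, bit):
        # online gradient step that reduces the coding cost of the observed bit
        err = bit - p_mixed
        for i, t in enumerate(self.t):
            self.w[i] += self.lr * t * err

# Toy run: model A says p(1)=0.9, model B says p(1)=0.3, the stream is all 1s,
# so the mixer should learn to trust A and push its prediction above 0.5.
mixer = LogisticMixer(2)
p = mixer.mix([0.9, 0.3])
for _ in range(500):
    p = mixer.mix([0.9, 0.3])
    mixer.update(p, 1)
```

After the loop the weight on model A is positive and the mixed prediction has moved well above the initial 0.5, which is the whole point: the mixer discovers which models are useful for the current data.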
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tcfc4df5e57c62b43-M3d0ab750e3a0198ac4f0f5f0
Delivery options: https://agi.topicbox.com/groups/agi/subscription