I didn't implement word gaps. All of these results are for contiguous contexts, like you would get with PPM or BWT, except that the word models skip over the characters between the previous word and the current one. Once I do implement gaps that skip whole words, I expect compression to improve.
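As a minimal sketch (not my actual model code, and assuming simple whitespace tokenization and count-based statistics), a one-word-gap context pools evidence across different middle words, so "the large cat" and "the small cat" both feed the same "the _ cat" context:

```python
from collections import defaultdict

def train(words):
    """Count next-word frequencies for contiguous and one-word-gap contexts."""
    contig = defaultdict(lambda: defaultdict(int))  # (w[i-2], w[i-1]) -> counts
    gapped = defaultdict(lambda: defaultdict(int))  # (w[i-3], w[i-1]) -> counts, skipping w[i-2]
    for i in range(2, len(words)):
        contig[(words[i - 2], words[i - 1])][words[i]] += 1
        if i >= 3:
            gapped[(words[i - 3], words[i - 1])][words[i]] += 1
    return contig, gapped

words = "the large cat sat on the mat then the small cat sat down".split()
contig, gapped = train(words)
# "the _ cat" pools evidence from "the large cat" and "the small cat":
print(gapped[("the", "cat")]["sat"])   # 2
print(contig[("large", "cat")]["sat"]) # 1
```

The gapped table has fewer, denser contexts than the contiguous one, which is why skipping a word can help on small inputs.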
-- Matt Mahoney, [email protected]

On Thu, Dec 18, 2025, 10:53 PM <[email protected]> wrote:

> Wait. In my algorithm, gaps help a ton. Gaps bring it down from 25,000
> bytes to 23,800. How much do they help in your algorithm? E.g. "The _ cat"
> == "The large cat", etc.
>
> Also, do you do this? Right now I'm starting to get a small gain from
> using old past predictions to vote on the current predictions. E.g. "[The
> cat] was on the floor, I got up and heard a 'meow'." ("meow" is predicted
> by "I heard a", but "The cat" also predicts "meow" and that votes on
> "meow" to increase its strength.)

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Tf0bedfcd44454678-Ma9b68db5575802bbe27a39c2
Delivery options: https://agi.topicbox.com/groups/agi/subscription
