@Matt oh maybe I got it, is this way a good idea?:

Say I search for these contexts: "walking dow[n[ [t[h[e]]]]] ?" When I get 
matches, there are byte-sized predictions; I simply gather all their first bits. 
Then, upon predicting the next bit, I remove the bytes (predictions) that don't 
start with that bit, and tally up the 2nd bits again.
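A minimal sketch of that filter-and-tally loop, assuming the context matches have already yielded a list of predicted next bytes (the byte values and the majority-vote rule below are just illustrative assumptions):

```python
from collections import Counter

def bits_of(byte):
    """Most-significant-bit-first list of the 8 bits in a byte."""
    return [(byte >> (7 - i)) & 1 for i in range(8)]

def predict_bitwise(predicted_bytes):
    """Emit 8 bits one at a time, narrowing the candidate bytes to
    those consistent with the bits emitted so far.
    The 'predicted' bit at each step is simply the majority vote."""
    candidates = list(predicted_bytes)
    chosen = []
    for pos in range(8):
        tally = Counter(bits_of(b)[pos] for b in candidates)
        bit = 1 if tally[1] >= tally[0] else 0  # majority vote
        chosen.append(bit)
        # drop predictions whose bit at this position disagrees
        candidates = [b for b in candidates if bits_of(b)[pos] == bit]
    return chosen, candidates

# e.g. predictions 't' (01110100), 'p' (01110000), 'a' (01100001)
bits, survivors = predict_bitwise([0b01110100, 0b01110000, 0b01100001])
```

In a real predictor you'd want weighted counts rather than a plain majority vote, but the narrowing step is the same: each emitted bit halves (or better) the set of byte predictions still in play.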

What do you think?

I couldn't imagine storing contexts like the one below just so that I could get 
refined/dedicated predictions after outputting a new bit:
01110100 11[1001[01 [0000[110[0]]]]]
(where the windows are on bits, instead of limited to bytes only)
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf856e4082d9ea09a-M82f0dc4516fef27590e11ab7