That Hutter Prize work I did taught me a lot about AGI. For example, it 
introduced/reinforced for me the idea that neural connections strengthen with 
each access, and that this happens at all layers / mixes / lengths as a 
sentence travels upward, so the model knows how many times it has seen each 
string, e.g. [t[h[e[ [c[a[t]]]]]]].
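That nested counting can be sketched as tallying every contiguous substring up to some length, so each bracket level ('t', 'th', 'the', ... 'the cat') gets its own count. This is my own minimal illustration, not the actual Hutter Prize entry:

```python
from collections import Counter

def ngram_counts(text, max_len=7):
    """Count every contiguous substring of text up to max_len characters,
    so 't', 'th', 'the', 'the ', ... each accumulate their own tally."""
    counts = Counter()
    for i in range(len(text)):
        for n in range(1, max_len + 1):
            if i + n <= len(text):
                counts[text[i:i + n]] += 1
    return counts

counts = ngram_counts("the cat sat on the mat")
print(counts["the"])  # 2 -- 'the' occurs twice
print(counts["t"])    # 5
```

Every access bumps a count, which is the "connections strengthen by accesses" part in miniature.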
Byte Pair Encoding up to the phrase level should help window attention and 
prediction-candidate attention, and it is also required for recency 
(recent-activity) weighting. This weight alignment is much more potent in the 
mix: most of the weight goes to the predictions of word and BPE-phrase 
windows, e.g. [and then [the [cat]]].
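A toy version of that weighted mix: collect next-character predictions from context windows of every length and let longer (word/phrase-like) matches dominate. The exponential 2**n weighting is my assumption, just to show the shape of it:

```python
from collections import Counter

def mix_predictions(text, context, max_order=6):
    """Blend next-character predictions from windows of every length,
    weighting longer matched windows more (2**n is an assumed scheme)."""
    mix = Counter()
    for n in range(1, max_order + 1):
        window = context[-n:]
        if len(window) < n:
            break  # context is shorter than this window length
        # Count what followed this window in the training text.
        followers = Counter(
            text[i + n]
            for i in range(len(text) - n)
            if text[i:i + n] == window
        )
        total = sum(followers.values())
        if total == 0:
            continue  # this window never occurred; it contributes nothing
        weight = 2 ** n  # longer matches get exponentially more say
        for ch, c in followers.items():
            mix[ch] += weight * c / total
    return mix

text = "the cat sat on the mat"
mix = mix_predictions(text, "the ")
print(mix.most_common(2))  # 'c' and 'm' dominate (both follow "the ")
```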
If it sees that many windows aren't confident in their predictions, it tries 
to look at the input differently, e.g. at the part-of-word level, using 
backoff.
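A minimal backoff sketch in the spirit of "stupid backoff": start from the longest window, and when it has too few observations to be confident, drop to a shorter (eventually part-of-word / character) view, discounting each step. The min_count threshold and alpha discount are my assumptions:

```python
from collections import Counter

def backoff_predict(text, context, max_order=6, min_count=2, alpha=0.4):
    """Try the longest context window first; if it was seen fewer than
    min_count times, back off to a shorter window, discounting by alpha."""
    discount = 1.0
    for n in range(min(max_order, len(context)), 0, -1):
        window = context[-n:]
        followers = Counter(
            text[i + n]
            for i in range(len(text) - n)
            if text[i:i + n] == window
        )
        total = sum(followers.values())
        if total >= min_count:
            ch, c = followers.most_common(1)[0]
            return ch, discount * c / total
        discount *= alpha  # not confident at this length: back off
    return None, 0.0

text = "the cat sat on the mat"
print(backoff_predict(text, "on the "))
```

Here "on the " itself is too rare, so it backs off until the shorter window "the " has enough counts to commit to a prediction.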
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T3cd584667cb2384b-Mbfd772d1cea42042007d237f