I think you can skip the heterarchy, maybe. Simply put: the hierarchy nodes get 
activated, e.g. the node for 'cat', which in parallel leaks energy to its nearby 
contexts, e.g. 'the cat ate', 'the cat ran', 'our cat went'. These contexts in 
turn leak energy to similar contexts like 'the dog ate', 'the dog ran', 'some 
dog went', and so on, which also serves to prove 'some'='our'. And if 
cat=zebra/horse and dog=zebra/horse, then cat=dog! Hence no word2vec is needed; 
similarity is computed on the fly by the leaking connections. This solves typos, 
rearranged phrases, unknown words (e.g. 'superphobiascience'), alternative 
words, related words, names, references (e.g. it/he), and blanks.

Then, among the candidate words, the winner is the one that is most frequent in 
knowledge (has energy), most active in the Working Memory Activation Context 
(whose energy fades/leaks), most related to the story word (activation leak), 
and most favorite (has energy). This is how to recognize/understand a window 
(where to look and how wide), then how to choose the candidate Next Word, and 
then you may also adapt it by translating it.
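The leak-based similarity idea can be sketched in a few lines of Python. This is a one-hop toy, not the full mechanism: the corpus, the slot-pattern contexts, and the energy/decay numbers are all illustrative assumptions, and it only shows cat=dog via shared contexts (not the second hop that would prove 'some'='our').

```python
from collections import defaultdict

# Toy corpus standing in for the stored sequences in the hierarchy.
corpus = [
    "the cat ate", "the cat ran", "our cat went",
    "the dog ate", "the dog ran", "some dog went",
    "a zebra ate", "a zebra ran",
]

# Index each word by the contexts (slot patterns) it appears in,
# e.g. 'cat' -> ('the', '_', 'ate'), and each context by its fillers.
contexts_of = defaultdict(set)
words_in = defaultdict(set)
for phrase in corpus:
    words = phrase.split()
    for i, w in enumerate(words):
        pattern = tuple(words[:i]) + ("_",) + tuple(words[i + 1:])
        contexts_of[w].add(pattern)
        words_in[pattern].add(w)

def leak_similarity(word, energy=1.0, decay=0.5):
    """Activate `word`, leak energy to its contexts, then leak on to
    other words filling the same contexts. Returns energy per word."""
    received = defaultdict(float)
    ctxs = contexts_of[word]
    if not ctxs:
        return received
    per_ctx = energy * decay / len(ctxs)      # split leak across contexts
    for ctx in ctxs:
        for other in words_in[ctx] - {word}:  # leak on to neighbours
            received[other] += per_ctx * decay
    return received

sims = leak_similarity("cat")
print(max(sims, key=sims.get))  # 'dog' shares the most contexts with 'cat'
```

No embeddings are trained anywhere: the similarity between 'cat' and 'dog' exists only at query time, as energy flowing through shared-context connections.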
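The candidate-word scoring can likewise be sketched as a simple sum of the four energies named above. Every name, weight, and decay rate here is a hypothetical assumption made for illustration; the post does not fix how the terms combine.

```python
import math

# Hypothetical stand-ins for the four energy sources.
knowledge_freq = {"dog": 120, "zebra": 4, "horse": 30}   # frequency in knowledge
favorites = {"dog": 1.0}                                  # favorite words
working_memory = [("zebra", 1), ("dog", 5)]               # (word, steps since mention)
relatedness = {"dog": 0.9, "zebra": 0.6, "horse": 0.5}    # leak from story word

def wm_energy(word, decay=0.3):
    # Working Memory activation fades (leaks) with time since mention.
    return sum(math.exp(-decay * age) for w, age in working_memory if w == word)

def score(word):
    return (math.log1p(knowledge_freq.get(word, 0))  # frequent in knowledge
            + wm_energy(word)                         # active in WM, faded
            + relatedness.get(word, 0.0)              # related to story word
            + favorites.get(word, 0.0))               # favorite bonus

candidates = ["dog", "zebra", "horse"]
winner = max(candidates, key=score)
print(winner)
```

With these toy numbers 'dog' wins: its knowledge frequency, relatedness, and favorite bonus outweigh 'zebra' being mentioned more recently in Working Memory.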

It could run faster done other ways, but it can be easier to understand done 
this way.
Artificial General Intelligence List: AGI
Delivery options: https://agi.topicbox.com/groups/agi/subscription
