By "short term memory" I mean the working train of thoughts, looping, like 
GPT-2's Attention system in the Transformer architecture. Paying attention to 
context updates the attention weights, then it iterates, having a new context. 
So while Glove equalizes, Attention equalizes too. Your though-train comes to a 
decision, that is satisfactory in standing to your knowledgebase which governs 
your Attention.
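
Here is a minimal sketch of that loop in Python, assuming plain scaled 
dot-product self-attention; the shapes, weights, and three-step loop below 
are all illustrative toys, not GPT-2's actual parameters:

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(context, Wq, Wk, Wv):
    # One round of scaled dot-product self-attention: each position
    # queries the whole context and takes a weighted mix of the value
    # vectors as its new state.
    Q, K, V = context @ Wq, context @ Wk, context @ Wv
    weights = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))
    return weights @ V

rng = np.random.default_rng(0)
d = 8
context = rng.normal(size=(5, d))               # five toy "token" vectors
Wq, Wk, Wv = (0.1 * rng.normal(size=(d, d)) for _ in range(3))

# The looping train of thought: attend, get a new context, attend again.
for step in range(3):
    context = self_attention(context, Wq, Wk, Wv)

If you iterate this toy loop many times, the rows of the context tend to 
blur toward one another, since every step is a weighted averaging of the 
value vectors; that is one concrete way to read the claim that attention 
"equalizes."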