Even GPT-2 is, at its core, an n-gram model. Entailment is central.
Mine uses long n-grams, up to 8 or even 10 words; there's no hard limit really, 
longer matches just need more compute.
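To make the long-n-gram idea concrete, here is a minimal sketch of next-word prediction by longest-suffix match with backoff. This is an illustrative assumption, not the author's actual 700-line algorithm: the function name, the scan-the-corpus matching, and the `max_n` cap are all mine.

```python
from collections import Counter

def ngram_predict(corpus, context, max_n=10):
    """Predict the next word via longest-suffix n-gram match with backoff.

    Illustrative sketch only: try the longest available n-gram suffix of
    the context (up to max_n words), and back off to shorter n-grams
    until some match is found in the corpus.
    """
    for n in range(min(max_n, len(context)), 0, -1):
        suffix = context[-n:]
        candidates = Counter()
        # Scan the corpus for every occurrence of the n-word suffix
        # and tally the word that follows it.
        for i in range(len(corpus) - n):
            if corpus[i:i + n] == suffix:
                candidates[corpus[i + n]] += 1
        if candidates:
            return candidates.most_common(1)[0][0]
    return None  # no match at any order

corpus = "the cat sat on the mat and the cat sat on the mat again".split()
print(ngram_predict(corpus, "the cat sat on the".split()))  # prints "mat"
```

Raising `max_n` costs nothing but scan time here, which matches the "no limit really, just need more compute" point above.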
Mine also looks at all of the preceding text, addressing the long-distance 
dependency problem that has plagued text generation in the past.
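One toy way to "look at all the text" is to rescore the n-gram candidates by how often each one co-occurs with words from anywhere in the preceding context. The weighting scheme below is my assumption for illustration, not the author's method:

```python
from collections import Counter

def rescore_with_context(candidates, full_context, corpus, window=2):
    """Boost next-word candidates that co-occur with distant context words.

    Illustrative sketch: `candidates` maps candidate words to their base
    n-gram counts; each candidate earns a bonus for every context word
    found within `window` positions of it anywhere in the corpus.
    """
    scores = Counter()
    context_set = set(full_context)
    for word, base in candidates.items():
        bonus = 0
        # Count how often `word` appears near any context word in the corpus.
        for i, w in enumerate(corpus):
            if w == word:
                nearby = corpus[max(0, i - window):i + window + 1]
                bonus += sum(1 for x in nearby if x in context_set)
        scores[word] = base + bonus
    return scores.most_common(1)[0][0]

corpus = "people in paris drink coffee people in london drink tea".split()
context = "i moved to london and now i drink".split()
print(rescore_with_context({"coffee": 1, "tea": 1}, context, corpus))
# prints "tea" -- "london" appearing far back in the context breaks the tie
```

The point is that a word mentioned long ago (here "london") can still steer the choice between otherwise equally likely n-gram continuations.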
There's a lot to the algorithm, but it's all explainable. The code is about 
700 lines long.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tb2c56499bd62ee8a-Md5ad9c6649f3c680eb0f7109
Delivery options: https://agi.topicbox.com/groups/agi/subscription