Even GPT-2 is, at heart, an n-gram model. Entailment is the core operation. Mine uses long n-grams, up to 8 or even 10 words; there is no hard limit, really, it just takes more compute. Mine also looks at all of the text, addressing the long-distance-dependency problem that has plagued text generation in the past. There is a lot to the algorithm, but it is very explainable, and the code is only about 700 lines long.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Tb2c56499bd62ee8a-Md5ad9c6649f3c680eb0f7109
Delivery options: https://agi.topicbox.com/groups/agi/subscription
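The long-n-gram idea described above can be sketched roughly like this. This is a minimal sketch under my own assumptions (the post does not give the actual algorithm): a table of contexts up to 8 words long, with backoff from the longest matching context to shorter ones.

```python
from collections import defaultdict, Counter

MAX_N = 8  # hypothetical cap; the post mentions 8 or even 10 words


def build_model(tokens, max_n=MAX_N):
    """Count continuations for every context length up to max_n."""
    model = defaultdict(Counter)
    for i, word in enumerate(tokens):
        for n in range(1, max_n + 1):
            if i - n < 0:
                break
            context = tuple(tokens[i - n:i])
            model[context][word] += 1
    return model


def predict(model, context, max_n=MAX_N):
    """Back off from the longest matching context to shorter ones."""
    context = tuple(context)
    for n in range(min(max_n, len(context)), 0, -1):
        ctx = context[-n:]
        if ctx in model:
            # Return the most frequent continuation for this context.
            return model[ctx].most_common(1)[0][0]
    return None  # context never seen at any length


tokens = "the cat sat on the mat and the cat sat on the rug".split()
model = build_model(tokens)
print(predict(model, "on the mat".split()))
```

Longer contexts are consulted first, which is what lets long n-grams capture more distant dependencies than a fixed trigram or 4-gram model would.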
