Answering questions IS predicting the next letter, or word. Or bit. It just 
doesn't have conversation training data unless you give it that, and it doesn't 
know when to stop talking or how to listen to you, e.g. if you ask it to 
"summarize".
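To make "answering is just next-letter prediction" concrete, here's a toy sketch (my own illustration, not Green's algorithm): count which character follows each 2-character context, then predict by picking the most frequent follower.

```python
from collections import Counter, defaultdict

def train(text, order=2):
    """Count which character follows each length-`order` context."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        context = text[i:i + order]
        model[context][text[i + order]] += 1
    return model

def predict(model, context):
    """Return the most likely next character for a context, or None if unseen."""
    counts = model.get(context)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

model = train("the cat sat on the mat. the cat ate.", order=2)
print(predict(model, "th"))  # 'e' -- in this text "th" is always followed by 'e'
```

Real predictors like Green mix many context orders and output probabilities instead of a single guess, but the core loop is this same counting.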

https://workupload.com/file/jmBb5Afbgmt

This is Shelwien's Green; mine is slower but the algorithm is nearly the same. 
You can train it on e.g. 1GB of Wikipedia (use the enwik9 dataset then, not 
enwik8), but that will take about a day. More data = better prediction. It 
still won't talk that well, though. You can also try the world's best 
compressor, from the Hutter Prize (well, Matt's benchmark....it can get enwik8 
down to 14.8MB compressed).

You can also try GPT-2; it runs online, no download needed.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T30bb2d96ab6bb993-Mb401ac38735e99c754228e2b
Delivery options: https://agi.topicbox.com/groups/agi/subscription
