I didn't write the GPT-2 training code; I just found the link. The goal is to test 
it on small datasets so I can compare it to my AI trained on the same dataset, 
since I won't be using chains of expensive GPUs any time soon. I am, however, 
writing my AI's Python code in the editor and no longer use Blockly. My 
project is in an earlier recent post; you most likely already saw it.

One correction: my AI would not take 1 minute to train on 6 MB, but about 3.7 
seconds if it were in C++. GPT-2 above took, as stated, 1.2 hours on a GPU; I 
used a single CPU core. Mine also generates extremely fast, which will be needed 
as it talks to itself and stores thoughts: GPT-2 can take minutes, while mine can 
generate 10,000 letters per second in Python and store them. My AI may, however, 
start taking longer for my next plans.

"*He was driving* out into the cold and pawns from the protection of incorrect 
scatter residents now." Only the italicized '*He was driving*' is in The Brown 
Corpus; not even 'driving out' appears there. There are some 3-word matches 
here, but it is doing fairly well. Mine actually did similarly: it said '*joe 
will nee*d for microphone plays, and it in effers rareless and could not go 
that's no and one attached to his', and in TBC.txt there is only 'phone play', 
not 'phone plays'; no 'microphone play' either, just 'microphone'; and only 
'for micro', no 'for microphone'.
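The check described above (how much of a generated phrase appears verbatim in the corpus) can be sketched in a few lines of Python. This is my own illustrative sketch, not the author's code; the corpus string here is a hypothetical stand-in for loading TBC.txt.

```python
# Sketch: find the longest prefix of a generated phrase that occurs
# verbatim in the training corpus. A hypothetical stand-in string is
# used in place of reading the real TBC.txt file.

def longest_match(phrase, corpus):
    """Return the longest prefix of `phrase` found verbatim in `corpus`."""
    for end in range(len(phrase), 0, -1):
        if phrase[:end] in corpus:
            return phrase[:end]
    return ""

# Stand-in corpus containing the fragments the post says exist in TBC.txt.
corpus = "only phone play and a microphone and for micro appear here"

for phrase in ["phone plays", "for microphone"]:
    print(phrase, "->", repr(longest_match(phrase, corpus)))
```

With a real corpus you would replace the stand-in string with `corpus = open("TBC.txt").read()`; the linear-scan `in` check is slow for many queries, but fine for spot-checking a few phrases.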
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tb76967774bc4d8c2-Mcfe8d899c37434f391a313d3