@dlesnoff

You are correct that there is more than enough to train a general AI.

The catch is that GPT is not a general AI. It does not recognize patterns and 
algorithms; it simply predicts the next word from the previous words, drawing 
on a high volume of training data. To turn that into useful code generation, 
it needs not just a good Rosetta Code, but millions of people publishing code 
examples subtly similar to Rosetta Code in other projects. It needs sheer 
volume.
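
To make "predicts the next word from the previous words" concrete, here is a 
minimal sketch in Python of a toy bigram model. This is an assumption-heavy 
simplification, not GPT's actual mechanism (GPT uses a neural network over 
tokens, not raw word counts), but it shows why prediction quality depends on 
training volume: with a tiny corpus, the counts are too sparse to be useful.

```python
from collections import Counter, defaultdict
import random

# Toy corpus; real models need vastly more text for useful predictions.
corpus = "echo hello world echo hello nim echo goodbye world".split()

# Count how often each word follows each preceding word.
follows: dict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(prev: str) -> str:
    """Sample the next word in proportion to how often it followed prev."""
    counts = follows[prev]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

print(predict("echo"))  # likely "hello": seen twice after "echo" vs once for "goodbye"
```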

I'm actually impressed that it works on the small sample size it has. I 
suspect it is boosted by the non-Nim languages: it essentially treats Nim as a 
dialect, predicting words from sequences it has seen in other languages.
