It's actually possible that the "word2vec" I made is just as efficient and accurate IF I only store the top 5,000 relations for each word, instead of the full 50k x 50k matrix. Perhaps word2vec gives every word an embedding, but each embedding has fewer dimensions than it could, so each word suffers from not being able to have, e.g., 10,000 dimensions.

The main appeal of attempting my code is that it may be simpler to work with. If GPT cannot be implemented in ~400 lines of Python, then it might be an overly complex algorithm.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tc124b3d00b83e897-M4c0e71ab19fbea236e56ada5
Delivery options: https://agi.topicbox.com/groups/agi/subscription