OK, so word2vec is really just comparing two vectors to see how two words relate; it is not learning that words near each other are related... So my way is novel, and I will be laughing soon: no one has thought of doing this my way, haha.
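Comparing two word vectors is usually done with cosine similarity. A minimal sketch of that comparison, using made-up 3-d vectors for illustration (these are not real word2vec outputs, and real embeddings have hundreds of dimensions):

```python
import math

def cosine(u, v):
    # Compare two word vectors: dot product divided by the product of norms.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy vectors, invented for this example.
ship = [0.9, 0.1, 0.3]
crew = [0.8, 0.2, 0.4]
banana = [0.1, 0.9, 0.0]

# Words used in similar contexts end up with similar vectors,
# so their cosine score is higher.
print(cosine(ship, crew) > cosine(ship, banana))
```

The comparison itself knows nothing about where the training data came from; all the "nearness" information is baked into the vectors beforehand.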
And the idea I had for predicting things seen near a word: e.g. from "landed, the ship had the crew", later see "Our new crew, __" and predict the rare-ish words seen near 'crew' last time, e.g. 'landed', in a different order. That still seems useful. These are not always related words, but they give you a good probability of finding related words too! The main use is regurgitation: getting answers to questions right by predicting, with some probability, items seen around the word.

So to store these, hmm, it would make memories. You could store a tree/net that says: if you see the word 'crew', predict (no order is stored): landed, ship, moon, mars, rocket, booster, NASA, astronauts, earth, planets, rocks, gas, fuel, space, nebula, etc., all learnt by looking nearby each occurrence of 'crew'.

You could store a long sparse context in a trie: just the rare-ish topic words, e.g. "dog cat kibble my cat ate food with me on the couch", where the start/end has no order and the last 3 words are not required to match. However, this limits the view, and it still costs a lot to store order.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T192296c5c5a27230-M99a245f453cde777f5c5f44d
Delivery options: https://agi.topicbox.com/groups/agi/subscription
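The two storage ideas above can be sketched in a few lines: an unordered "memory" set collected from a window around each occurrence of a word, plus a trie insert where sorting the context words throws away order so permutations share one path. The window size, corpus, and the hand-picked "rare" words are all assumptions made up for this sketch, not part of the original idea:

```python
from collections import defaultdict

WINDOW = 3  # hypothetical window size; the post does not fix one

def build_memory(sentences):
    # For each word, remember the set of words seen nearby (no order kept),
    # accumulated over every occurrence of that word.
    memory = defaultdict(set)
    for sent in sentences:
        words = sent.split()
        for i, w in enumerate(words):
            nearby = words[max(0, i - WINDOW):i] + words[i + 1:i + 1 + WINDOW]
            memory[w].update(nearby)
    return memory

def trie_insert(trie, rare_words):
    # Store a sparse context in a trie of nested dicts; sorting drops order,
    # so "dog cat kibble" and "cat dog kibble" share one path.
    node = trie
    for w in sorted(set(rare_words)):
        node = node.setdefault(w, {})
    node["$end"] = True  # sentinel marking a stored context

# Toy corpus, invented for illustration.
corpus = [
    "landed the ship had the crew",
    "the crew boarded the rocket for mars",
]
mem = build_memory(corpus)
print(sorted(mem["crew"]))  # the unordered memory of words seen near 'crew'

trie = {}
# Hand-picked rare topic words from "dog cat kibble my cat ate food ..."
trie_insert(trie, ["dog", "cat", "kibble", "couch"])
print("cat" in trie and "couch" in trie["cat"])
```

A real version would keep counts instead of a plain set (so common neighbours like 'the' can be filtered out as non-rare), which is what makes the "predict rare-ish words" part work.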
