This sounds like imperil or something — are you top dog? Do you work on 
Transformers? I'd expect you to be able to code one from scratch. I don't work 
on Transformers ATM because I believe I found a more direct, explainable way to 
do the same things. I've already done half of them; a few more and I'll be 
quite confident. You can see my project here > 
https://encode.su/threads/3595-Star-Engine-AI-data-compressor

Same things, see? Context matches instead of backprop, exponential functions 
computed on the fly instead of learned during backprop, recency-boosted 
contexts, and a few other smaller things. I have yet to efficiently implement 
holed matches (Dropout), delayed matches, and translate ability 
(seq2seq/word2vec) to get longer prompt matches, and then apply those to the 
recency boosting so it can see 'dog pig cat' and predict 'horse' as very 
likely. No backprop. Transformers have all the same things; many tell me they 
don't know why those functions are there — it's a black box — but I still 
suspect the founders of Transformers know why. And if not, then it's because of 
what I show above in my project: it is all context matching.
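To give a rough idea of what I mean by context matching with recency boosting, here is a minimal sketch of my own (not the actual Star Engine code — the context order, the exponential match-length weight, and the recency half-life are all arbitrary illustrative assumptions):

```python
from collections import defaultdict

def predict_next(history, order=3, recency_half_life=50.0):
    """Predict the next symbol by matching the most recent context of
    `history` against earlier occurrences, weighting each match by its
    length (exponentially) and by how recently its continuation appeared.
    No training pass, no backprop -- prediction comes straight from matches."""
    scores = defaultdict(float)
    n = len(history)
    # Try context lengths from longest down to 1 (PPM-style suffix matching).
    for k in range(min(order, n - 1), 0, -1):
        ctx = tuple(history[n - k:])
        for i in range(n - k):
            if tuple(history[i:i + k]) == ctx:
                nxt = history[i + k]
                age = n - (i + k)          # how long ago this continuation was seen
                recency = 0.5 ** (age / recency_half_life)
                scores[nxt] += (2.0 ** k) * recency  # longer matches count more
    if not scores:
        return None
    return max(scores, key=scores.get)

seq = list("abcabcabcab")
print(predict_next(seq))  # prints 'c': every prior 'ab' was followed by 'c'
```

The exponential weighting plays the role the softmax-style functions play inside a Transformer — sharpening the distribution toward the longest, freshest matches — except here it is applied directly at prediction time rather than learned during backprop.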
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T5417cc95d981211e-Maaf9439c1d282aa194f6233d
Delivery options: https://agi.topicbox.com/groups/agi/subscription
