[agi] Re: Attention is All you Need

2021-05-13 Thread immortal . discoveries
Mom asks which one she uses: Explorer, Edge...? This primes the domain, so now I know what she is talking about. I say Chrome because it is primed and is the one she uses. I have already implemented two of these mechanisms. Translation will be last.
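A toy sketch of this priming idea in Python (the domain table, candidate words, and boost weight are all made up for illustration): words already mentioned activate a domain, which boosts in-domain candidates such as "chrome".

    # Toy priming sketch: mentioning "explorer" and "edge" activates the
    # browsers domain, which boosts the related candidate "chrome".
    # The domain table, candidates, and boost value are illustrative only.
    domains = {"browsers": {"explorer", "edge", "chrome", "firefox"}}
    candidates = {"chrome": 0.2, "crone": 0.2, "home": 0.2}

    def prime(candidates, mentioned, domains, boost=0.5):
        scores = dict(candidates)
        for members in domains.values():
            if mentioned & members:            # the domain was primed
                for w in scores:
                    if w in members:
                        scores[w] += boost     # boost in-domain candidates
        return max(scores, key=scores.get)

    print(prime(candidates, {"explorer", "edge"}, domains))  # -> chrome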

[agi] Re: Attention is All you Need

2021-05-10 Thread keghnfeem
The embedding is a number given to a word that identifies the object. This number is given by a supervisor. For example, the word man is given 0.5, a woman is given 0.51, a male monkey is given 0.3, and a female monkey 0.31. The vector of change from monkey to man is around 0.2. The distance
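A toy sketch of this picture in Python (the scalar values come from the message above; real transformer embeddings are learned, high-dimensional vectors, not supervisor-assigned scalars):

    # Scalar "embeddings" as described above.
    embedding = {"man": 0.50, "woman": 0.51,
                 "monkey_male": 0.30, "monkey_female": 0.31}

    # The "vector of change" from monkey to man is a difference of embeddings.
    species_offset = embedding["man"] - embedding["monkey_male"]   # ~0.2
    gender_offset = embedding["woman"] - embedding["man"]          # ~0.01

    # The same offset should carry the analogy in the other pair too.
    print(embedding["monkey_female"] - embedding["monkey_male"])   # ~0.01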

[agi] Re: Attention is All you Need

2021-05-10 Thread Jim Bromer
I appreciated the links to the transformers. I found a slightly more readable one and see that the first step of transformer use in NLP is to turn words into embedding and positional vectors that indicate more than just co-occurrence. I appreciate that. But then the phraseology becomes confused
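For reference, a minimal sketch of that first step in Python/NumPy, following the sinusoidal scheme from "Attention is All You Need" (the vocabulary size, model width, and token ids here are made up for illustration):

    import numpy as np

    vocab_size, d_model, seq_len = 1000, 64, 10
    rng = np.random.default_rng(0)
    embedding_table = rng.normal(size=(vocab_size, d_model))  # learned in practice

    def positional_encoding(seq_len, d_model):
        # sin on even dimensions, cos on odd, at geometrically spaced frequencies
        pos = np.arange(seq_len)[:, None]
        i = np.arange(d_model)[None, :]
        angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
        return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

    token_ids = rng.integers(0, vocab_size, size=seq_len)  # a made-up sentence
    x = embedding_table[token_ids] + positional_encoding(seq_len, d_model)
    print(x.shape)  # (10, 64): one position-aware vector per token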

[agi] Re: Attention is All you Need

2021-05-08 Thread keghnfeem
DINO: Emerging Properties in Self-Supervised Vision Transformers (Facebook AI Research Explained): https://www.youtube.com/watch?v=h3ij3F3cPIk&t=432s

[agi] Re: Attention is All you Need

2021-05-06 Thread keghnfeem
Transformer Neural Networks - EXPLAINED! (Attention is all you need): https://www.youtube.com/watch?v=TQQlZhbC5ps

[agi] Re: Attention is All you Need

2021-04-21 Thread Jim Bromer
Shifting *local* windows for visual processing were emphasized in the video. It really gets me thinking about how these could be applied to other AI applications.
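A minimal sketch of the shifted-window idea in Python/NumPy (the windowing scheme here follows the Swin Transformer; the sizes are illustrative, and the attention computation itself is omitted):

    import numpy as np

    H = W = 8
    win = 4
    feat = np.arange(H * W).reshape(H, W)  # stand-in for a feature map

    def partition(x, w):
        # split an HxW map into non-overlapping w-by-w windows
        Hh, Ww = x.shape
        return x.reshape(Hh // w, w, Ww // w, w).swapaxes(1, 2).reshape(-1, w, w)

    windows = partition(feat, win)  # layer k: attention inside aligned windows
    # layer k+1: cyclically shift the map by half a window, so the new
    # windows straddle the old boundaries and information can cross them
    shifted = np.roll(feat, shift=(-win // 2, -win // 2), axis=(0, 1))
    shifted_windows = partition(shifted, win)
    print(windows.shape, shifted_windows.shape)  # (4, 4, 4) each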

[agi] Re: Attention is All you Need

2021-04-21 Thread Jim Bromer
That explains a lot. The link that I sent, DeepMind x UCL | Deep Learning Lectures | 8/12 | Attention and Memory in Deep Learning - YouTube, showed attention and window shifts, but I was not able to fully integrate that into my thinking about

[agi] Re: Attention is All you Need

2021-04-20 Thread keghnfeem
Will Transformers Replace CNNs in Computer Vision? + NVIDIA GTC Giveaway: https://www.youtube.com/watch?v=QcCJJOLCeJQ

[agi] Re: Attention is All you Need

2021-04-20 Thread Jim Bromer
When I said that ANNs used linear approximations, you knew what I meant because 'you are in the club.' But a newbie might have been confused and thought something like, "So that's how Neural Networks work. They use linear approximations." Seeing this, I will try to find better phrases like - they

[agi] Re: Attention is All you Need

2021-04-20 Thread Jim Bromer
Transformer Attention does seem to be more than just those two fundamental points. I do not want to spend a lot of time working with NNs (other than on my TinyML projects), but I do want to get a better understanding of how these things work and then apply some of the ideas to some slightly

[agi] Re: Attention is All you Need

2021-04-19 Thread immortal . discoveries
To me, this "Transformer Attention" is just the 2 things I explain: 1) Recent letters/words/etc. are made more probable. So if cat>runs occurs more than cat>sleep, predict cat>RUNS more often, i.e. with higher probability; but if you recently saw sleep 3 times, you predict sleep much more than runs,
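A toy sketch of point 1) in Python (the base counts, recency boost, and decay constant are made up; this only illustrates the recency-weighting idea described above, not actual transformer attention):

    base_counts = {"runs": 10, "sleeps": 4}  # cat>runs is globally more common
    recent = ["sleeps", "dog", "sleeps", "the", "sleeps"]  # sleep seen 3 times lately

    def predict(candidates, recent, boost=5.0, decay=0.8):
        scores = dict(candidates)
        for age, word in enumerate(reversed(recent)):   # age 0 = most recent
            if word in scores:
                scores[word] += boost * decay ** age    # newer mentions count more
        total = sum(scores.values())
        return {w: s / total for w, s in scores.items()}

    print(predict(base_counts, recent))  # "sleeps" now outweighs "runs"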

[agi] Re: Attention is All you Need

2021-04-19 Thread Jim Bromer
I have been watching this video. I can intuitively follow most of what he is saying. DeepMind x UCL | Deep Learning Lectures | 8/12 | Attention and Memory in Deep Learning - YouTube