deep learning...logic...continuous...semantics

Let me guess: the 'continuous' refers to the differentiable layer transformations in deep learning, the ones backprop uses for gradient descent? So:

deep learning...logic...semantics
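
For what it's worth, 'continuous' really just means every layer is a differentiable function, so the chain rule can push an error signal back through it. Here's a minimal toy sketch in NumPy, one tanh layer trained by hand-written backprop (the layer sizes, target, and learning rate are all made up for illustration):

    import numpy as np

    # One "continuous" layer: y = tanh(W x + b). Because tanh and the
    # matrix product are differentiable, gradients chain through them.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(3, 4)) * 0.1    # hypothetical layer sizes
    b = np.zeros(3)
    x = rng.normal(size=4)               # one input vector
    target = np.array([0.5, -0.2, 0.1])  # a made-up training target

    lr = 0.1                             # arbitrary learning rate
    for step in range(100):
        z = W @ x + b                    # forward pass
        y = np.tanh(z)
        loss = 0.5 * np.sum((y - target) ** 2)

        # backward pass: chain rule through tanh and the linear map
        dy = y - target                  # dL/dy
        dz = dy * (1 - y ** 2)           # d tanh(z)/dz = 1 - tanh(z)^2
        W -= lr * np.outer(dz, x)        # dL/dW = dz x^T
        b -= lr * dz

    print(loss)  # shrinks toward 0

Run it and the loss drops toward zero; nudging weights downhill like that is the entire job gradient descent does.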

But how do you feed a gradient-descent-based algorithm SEMANTICS? Transformers, if I'm correct, use things like semantics and attention. But what is the actual meat of AGI? Is it semantics, attention, etc., OR is it gradient descent, e.g. backprop? Backprop does nothing but let the actual AI mechanisms do their job. So:

logic...semantics/attention/etc
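
To be fair, attention itself is just another continuous function; nothing in a transformer gets fed 'semantics' directly, whatever semantics there is lives in the learned vectors. A minimal sketch of scaled dot-product attention (the mechanism from 'Attention Is All You Need'; the token count and embedding size here are arbitrary):

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Scaled dot-product attention. Every step is differentiable,
        so gradient descent can train it end to end."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)  # query-key similarity
        # softmax over keys (shifted for numerical stability)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V               # weighted mix of the values

    # 5 tokens with 8-dimensional embeddings (sizes are arbitrary)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 8))
    out = scaled_dot_product_attention(X, X, X)  # self-attention
    print(out.shape)  # (5, 8)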

And this logic, where does it come from? AND, OR, NOT? A>B? A=B? A brain only uses contextual windows of some length, e.g. predicting the next word, or a word in a translation, from the last 5, 4, or n words of the given context. Any higher complexity, like 'take the word "cancer" in this sentence on its own and analyze it' or 'reread the sentence 5 times', is just the original AI mechanisms doing their thing... So:

semantics/attention/etc
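
That 'last n words' picture is basically an n-gram model. A toy sketch with n = 2 and a made-up corpus, just to make the point concrete (the corpus and function name are my own inventions):

    from collections import Counter, defaultdict

    # Toy next-word predictor from a fixed-size context window (n = 2).
    corpus = "the cat sat on the mat the cat ate the fish".split()
    n = 2
    counts = defaultdict(Counter)
    for i in range(len(corpus) - n):
        context = tuple(corpus[i:i + n])
        counts[context][corpus[i + n]] += 1

    def predict_next(context):
        """Most frequent word seen after this n-word context."""
        followers = counts.get(tuple(context))
        return followers.most_common(1)[0][0] if followers else None

    print(predict_next(["the", "cat"]))  # -> 'sat' (ties go to first seen)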