The weighted sum goes by quite a few names: it is a dot product, and when
the weights sum to one it is also a weighted mean.
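For the arithmetic itself the names are interchangeable. A minimal sketch in
plain Python (illustrative values; the weights here happen to sum to one,
which is what makes the weighted-mean reading work):

    x = [2.0, 4.0, 6.0]
    w = [0.5, 0.25, 0.25]          # these weights sum to 1

    weighted_sum = sum(wi * xi for wi, xi in zip(w, x))   # 3.5
    dot_product  = sum(wi * xi for wi, xi in zip(w, x))   # identical arithmetic
    weighted_mean = weighted_sum / sum(w)                 # same value, since sum(w) == 1

    print(weighted_sum, dot_product, weighted_mean)       # 3.5 3.5 3.5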
In a ReLU neural network each weighted sum is followed by a ReLU activation
function.
When the weighted sum is positive, the ReLU switch is thrown on and conducts.
When it is negative, the switch is thrown off and stops conducting.
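A minimal sketch of that switch view in plain Python: ReLU(s) is exactly a
0/1 gate, decided by the sign of the weighted sum s, multiplying s through.

    def relu(s: float) -> float:
        return s if s > 0.0 else 0.0

    def relu_as_switch(s: float) -> float:
        gate = 1.0 if s > 0.0 else 0.0   # switch thrown on or off by the sign of s
        return gate * s                  # when on, the weighted sum passes through unchanged

    for s in (-2.0, -0.5, 0.0, 1.5):
        assert relu(s) == relu_as_switch(s)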
The perceptual shift is to forget the activation function aspect and see the
neural network as a weighted sum (dot product) switchboard, somewhat like an
old telephone exchange.
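A minimal sketch of the switchboard view, assuming numpy and random
illustrative weights (biases omitted for brevity): once a given input fixes
every switch on or off, the network collapses to a single effective matrix,
a plain composition of dot products.

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.standard_normal((4, 3))   # layer 1 weights
    W2 = rng.standard_normal((2, 4))   # layer 2 weights

    x = rng.standard_normal(3)

    # Ordinary forward pass with ReLU.
    h = W1 @ x
    gates = (h > 0).astype(float)      # switch states decided by the weighted sums
    y = W2 @ (gates * h)

    # Same output from one effective matrix: the switch pattern selects
    # which rows of W1 still conduct.
    W_eff = W2 @ (np.diag(gates) @ W1)
    assert np.allclose(y, W_eff @ x)

Change the input and the switch pattern changes with it, so a different
effective matrix is wired up, which is the telephone-exchange picture.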