That use of the term "vector" is confusing. In my opinion it is also
sometimes used pretentiously. In a typical neural network the direction of an
input or an output to a node is not encoded in the input or output itself. But
the input is coming from some other node or nodes and the output is going to
some other node or nodes, so calling them vectors is almost a poetic
reference that can help students remember that they come from some other
nodes and go to other nodes. The term "vector" can also be used to
distinguish them from the weights or the bias or something like that. But for a
person who is trying to understand what the ANN and DL people are talking
about, this casual, poetic, in-crowd, and often pretentious use of the term is
annoying. I am not saying that Matt is annoying or pretentious (I would never
say that), but the fact that even someone who has followed the development of
this stuff for some time can fall into this linguistic parallel universe is
evidence of just how serious the problem is.
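For what it is worth, the mathematical content behind the word is mundane: an
"input vector" is just an ordered list of the activations arriving at a node,
and nothing about direction is stored in it. A minimal sketch of what a single
node actually computes (the numbers and names here are my own illustration,
not from any particular framework):

```python
import math

# The "input vector" is just an ordered list of activations
# coming from other nodes -- no geometric direction is encoded.
inputs = [0.2, -1.5, 0.7]

# Per-connection weights and a bias, which the word "vector"
# is sometimes used to distinguish the inputs from.
weights = [0.5, 0.1, -0.3]
bias = 0.1

# The node's output: weighted sum plus bias, passed through an
# activation function (tanh here).
pre_activation = sum(x * w for x, w in zip(inputs, weights)) + bias
output = math.tanh(pre_activation)
print(output)
```

The point is only that "vector" here means "array of numbers in a fixed
order", the linear-algebra sense, not an arrow pointing somewhere.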
------------------------------------------
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T68be2fedf1f53ef2-Md9bc34266fcee93309f44909