The attempt to use vectors to describe word concepts is an old idea that
has never really worked. Word concepts do not fit into three-dimensional
space as neatly as aircraft, rockets, or weather patterns do. The appeal is
obvious: if it did work, you could bring all sorts of efficient
mathematical methods to bear. The problem is how to assign the semantics of
a phrase to a point in space. You could of course use an n-dimensional
space, but that does not solve the essential problem of accurately relating
the meaning of a phrase to that space. This is the critical problem, and
the question of Cartesian products vs. tensor products is not going to get
you where you want to go.

To some extent these simple functional relationships are useful. The triple
x y z, where x is John, z is Mary, and y is love, is interesting precisely
because it describes a general relationship that carries meaning, can be
reused across many other cases, and, significantly, can be used to build
other *forms* as well. But the effort to take this much further with vector
mathematics really misses the mark.
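To make the dimension arithmetic behind the tensor-vs-Cartesian question
concrete, here is a toy sketch in plain Python. The vectors below are
made-up stand-ins, not actual word2vec output (real word2vec embeddings
are typically a few hundred dimensions):

```python
# Hypothetical stand-ins for word2vec embeddings (toy values only).
john = [0.2, -0.5, 0.7]           # 3-dimensional
loves = [0.1, 0.4, -0.3, 0.9]     # 4-dimensional

# Cartesian-product (direct-sum) combination: dimensions add.
cartesian = john + loves
print(len(cartesian))             # 7  (3 + 4)

# Tensor (outer) product: dimensions multiply.
tensor = [a * b for a in john for b in loves]
print(len(tensor))                # 12 (3 * 4)

# The tensor product is bilinear: scaling one factor scales the whole
# product -- the "preserved scalar multiplication" mentioned in the quote.
scaled = [a * b for a in [2 * x for x in john] for b in loves]
assert scaled == [2 * t for t in tensor]
```

Nothing in this sketch answers the semantic question, of course; it only
shows why the tensor-product space blows up multiplicatively while the
Cartesian combination grows additively.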

Jim Bromer

*If you can solve a problem by avoiding it then your attitude is part of
the problem.*


On Wed, Jun 18, 2014 at 8:47 AM, YKY (Yan King Yin, 甄景贤) via AGI <
[email protected]> wrote:

>
> Words or concepts can be extracted as vectors using Google's word2vec
> algorithm:
> https://code.google.com/p/word2vec/
>
> To express a complex thought composed of simpler concepts, a
> mathematically natural way is to multiply them together, for example "John
> loves Mary" = john x loves x mary.
>
> I'm wondering if forming the tensor products from word2vec vectors could
> be meaningful.
>
> The tensor product is a bi-linear form (the most universal such bi-linear
> mappings).  So it may preserve the linearity of the original vector space
> (in other words, the scalar multiplication in the original vector space).
>  If the scalar multiplication is meaningful in the word2vec space, then its
> meaning would be preserved by the tensor product.
>
> The dimension of the tensor product space is also much higher (as the
> product of the dimensions of the original spaces;  this is even greater
> than the Cartesian product which is the sum of the dimensions of the
> original spaces.)  Computationally, I wonder what is the advantage of using
> tensor products as opposed to Cartesian products...?
>
> Or perhaps the extra richness of tensor structure can be exploited
> differently...
>
> --
> *YKY*
> *"The ultimate goal of mathematics is to eliminate any need for
> intelligent thought"* -- Alfred North Whitehead
>



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
