I would not use any mathematical tool unless it perfectly fits the problem
it is meant to solve. I doubt that tensor products can solve NLP, but I
could be wrong. I would rather use analogical correspondences via Markov
models for NLP. But I could be wrong about that too. :)
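
To illustrate the kind of thing I mean, here is a toy bigram Markov model
(the corpus and the maximum-likelihood estimate are just a sketch, not a
serious model):

    from collections import Counter, defaultdict

    # Toy corpus; a real model would be trained on far more text.
    corpus = "john loves mary . mary loves john . john loves pizza .".split()

    # Count bigram transitions: word -> next word.
    bigrams = defaultdict(Counter)
    for prev, cur in zip(corpus, corpus[1:]):
        bigrams[prev][cur] += 1

    def next_word_probs(word):
        # Maximum-likelihood estimate of P(next | word).
        counts = bigrams[word]
        total = sum(counts.values())
        return {w: c / total for w, c in counts.items()}

    print(next_word_probs("loves"))  # {'mary': 1/3, 'john': 1/3, 'pizza': 1/3}

Unlike a bag of vectors, the transition table at least sees word order,
which is part of why I lean this way.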


On Wed, Jun 18, 2014 at 2:16 PM, Jim Bromer via AGI <[email protected]> wrote:

> If all active words (verbs) had only one or two directions, then a
> directed vector could be used. However, part of the value of words is
> that you can assign a variety of possible relations to them, and a single
> standard will not work. Suddenly the space and the mathematics have to be
> warped to cover the variations. Vector spaces and semantics are just not
> a good match.
> Jim Bromer
>
> *If you can solve a problem by avoiding it then your attitude may be part
> of the problem.*
>
>
> On Wed, Jun 18, 2014 at 10:23 AM, Matt Mahoney via AGI <[email protected]>
> wrote:
>
>> The semantic vector of a sentence is approximately the sum of the word
>> vectors, not the product. It is not exact because it does not account
>> for word order. John + loves + Mary = Mary + loves + John.
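>>
>> A toy sketch (made-up 3-d vectors standing in for trained word2vec
>> embeddings):
>>
>>     import numpy as np
>>
>>     # stand-ins for trained word embeddings
>>     vec = {"john": np.array([0.1, 0.9, 0.0]),
>>            "loves": np.array([0.5, 0.5, 0.5]),
>>            "mary": np.array([0.8, 0.1, 0.2])}
>>
>>     def sentence_vector(words):
>>         # bag-of-words composition: word order is lost
>>         return sum(vec[w] for w in words)
>>
>>     a = sentence_vector(["john", "loves", "mary"])
>>     b = sentence_vector(["mary", "loves", "john"])
>>     assert np.allclose(a, b)  # addition commutes, so both agree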
>>
>> On Wed, Jun 18, 2014 at 8:47 AM, YKY (Yan King Yin, 甄景贤)
>> <[email protected]> wrote:
>> >
>> > Words or concepts can be extracted as vectors using Google's word2vec
>> > algorithm:
>> > https://code.google.com/p/word2vec/
>> >
>> > To express a complex thought composed of simpler concepts, a
>> > mathematically natural way is to multiply them together, for example
>> > "John loves Mary" = john x loves x mary.
>> >
>> > I'm wondering if forming the tensor products from word2vec vectors
>> > could be meaningful.
>> >
>> > The tensor product is bilinear (it is the universal bilinear map:
>> > every bilinear map factors through it).  So it may preserve the
>> > linearity of the original vector spaces (in other words, the scalar
>> > multiplication in the original vector spaces).  If scalar
>> > multiplication is meaningful in the word2vec space, then its meaning
>> > would be preserved by the tensor product.
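>> >
>> > A quick numpy sketch of that point (for plain vectors, the outer
>> > product gives the coordinates of the tensor product; random vectors
>> > stand in for word2vec ones):
>> >
>> >     import numpy as np
>> >
>> >     john = np.random.rand(300)   # pretend word2vec vectors
>> >     loves = np.random.rand(300)
>> >
>> >     # bilinearity: scaling either factor scales the whole product
>> >     assert np.allclose(np.outer(2 * john, loves),
>> >                        2 * np.outer(john, loves))
>> >     assert np.allclose(np.outer(john, 3 * loves),
>> >                        3 * np.outer(john, loves))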
>> >
>> > The dimension of the tensor product space is also much higher: it is
>> > the product of the dimensions of the original spaces, which is even
>> > greater than that of the Cartesian product, whose dimension is the sum
>> > of the dimensions of the original spaces.  Computationally, I wonder
>> > what the advantage of using tensor products as opposed to Cartesian
>> > products is...?
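>> >
>> > Concretely, with 300-dimensional vectors (a typical word2vec size):
>> >
>> >     import numpy as np
>> >
>> >     v, w = np.random.rand(300), np.random.rand(300)
>> >     print(np.outer(v, w).shape)          # (300, 300) -> 90,000 numbers
>> >     print(np.concatenate([v, w]).shape)  # (600,)     -> 600 numbers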
>> >
>> > Or perhaps the extra richness of tensor structure can be exploited
>> > differently...
>> >
>> > --
>> > YKY
>> > "The ultimate goal of mathematics is to eliminate any need for
>> intelligent
>> > thought" -- Alfred North Whitehead
>>
>> --
>> -- Matt Mahoney, [email protected]
>


