Are you asking what the left and right vectors mean in general in the SVD?

S is a re-expression of the original matrix's transformation, but in a
different and more natural basis. (Actually it's an approximation, since
the small singular values are tossed out, and the rank of S is therefore
much smaller -- those near-zero values might as well not exist.) So U and
V are really change-of-basis transformations -- transforming into S's
world of basis vectors and back out again.
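
To make that concrete, here's a small NumPy sketch (my own illustration,
nothing Mahout-specific), using the 2x3 example matrix from your mail
below and keeping only k=1 singular value:

  import numpy as np

  # The 2x3 user-item preference matrix from the example below.
  A = np.array([[0.5, 0.2, 0.1],
                [0.8, 0.3, 0.2]])

  # Full SVD: A = U * diag(s) * transpose(V)
  U, s, Vt = np.linalg.svd(A, full_matrices=False)

  # Keep only the k largest singular values (the rank-k approximation).
  k = 1
  Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

  # Uk maps users into the reduced basis, Vtk maps back out to items,
  # and diag(sk) scales along each basis direction.
  A_approx = Uk @ np.diag(sk) @ Vtk
  print(A_approx)   # close to A, because the dropped singular value is small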

In more CF-oriented terms, S is an expression of pseudo-users' preferences
for pseudo-items. And then U expresses how much each real user corresponds
to each pseudo-user, and likewise for V and items.
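
In the same NumPy terms (again just a sketch on the example matrix from
your mail; "pseudo-user" and "pseudo-item" are my labels, not anything
the library prints):

  import numpy as np

  A = np.array([[0.5, 0.2, 0.1],
                [0.8, 0.3, 0.2]])
  U, s, Vt = np.linalg.svd(A, full_matrices=False)

  # Row i of U: how much real user i corresponds to each pseudo-user.
  for i, user_row in enumerate(U):
      print("user", i, "-> pseudo-users:", user_row)

  # Row j of V (i.e. column j of transpose(V)): how much real item j
  # corresponds to each pseudo-item.
  for j, item_row in enumerate(Vt.T):
      print("item", j, "-> pseudo-items:", item_row)

  # diag(s) plays the role of S: pseudo-users' preferences for pseudo-items.
  print(np.diag(s))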

To offer a speculative analogy -- let's say we're looking at users'
preferences for songs. The "pseudo-items" that the SVD comes up with might
correspond to something like genres, or logical groupings of songs.
"Pseudo-users" are something like types of listeners, perhaps corresponding
to demographics.

Whereas an entry in the original matrix makes a statement like "Tommy likes
the band Filter", an entry in S makes a statement like "Teenage boys in
moderately affluent households like industrial metal". And U says how much
Tommy is part of this demographic, and V tells how much Filter is industrial
metal.

(Unfortunately, the SVD doesn't tell you these interpretations, and
interpretations of S are rarely so neat as in this made-up analogy.)



On Mon, Nov 22, 2010 at 7:52 AM, Lance Norskog <[email protected]> wrote:

> This post is inspired by this tutorial, which talks about interpreting
> the U and V matrices:
>
>
> http://www.puffinwarellc.com/index.php/news-and-articles/articles/30-singular-value-decomposition-tutorial.html
>
> Given a DataModel that generates preferences between all Users and all
> Items, lets take two Users and three Items:
>        I1    I2    I3
> U1    0.5   0.2   0.1
> U2    0.8   0.3   0.2
>
> What can we learn from an SVD factorization?
>
> SVD gives 3 matrices: U, a diagonal matrix of singular values (the
> number of nonzero singular values gives the actual rank of the
> matrix), and transpose(V). For simplicity, do the 1-dimensional
> factorization, which gives left and right vectors, plus a single
> scalar, instead of full matrices. Ignoring that scaling factor, we get
> the Left and Right singular vectors.
>
> The Left Singular Vector is: (column A x rows U1 and U2)
>
> ____A__
> U1
> U2
>
> The Right Singular Vector is: (row B x columns I1, I2, and I3)
>
> ___I1___I2___I3
> B
>
> Now, the question: what do the Left and Right vectors encode? What do
> column A and row B mean?
>
> --
> Lance Norskog
> [email protected]
>
