It might make a difference that you're running just 1 iteration. Normally
it's run to 'convergence' -- or here, let's say 10+ iterations to be safe.
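For concreteness, here is a minimal NumPy sketch of the "regular" ALS loop I
mean (not the implicit-feedback variant; the dimensions, synthetic data, and
the small ridge term lam are illustrative assumptions, not from your code):

```python
import numpy as np

# Assumed setup: factor A (m x n) as X @ Y with X (m x k), Y (k x n).
rng = np.random.default_rng(0)
m, n, k = 8, 10, 5
A = rng.random((m, k)) @ rng.random((k, n))  # synthetic exactly-rank-k data

# Random initialization, as in X = rand(m,k); Y = rand(k,n);
X = rng.random((m, k))
Y = rng.random((k, n))
lam = 0.01  # small ridge term keeps each k x k solve well-posed

err0 = np.linalg.norm(A - X @ Y) / np.linalg.norm(A)
for _ in range(10):  # "10+ iterations to be safe"
    # X-step: minimize ||A - X Y||^2 + lam ||X||^2 with Y fixed
    X = np.linalg.solve(Y @ Y.T + lam * np.eye(k), Y @ A.T).T
    # Y-step: same least-squares update with X fixed
    Y = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ A)
err = np.linalg.norm(A - X @ Y) / np.linalg.norm(A)
```

After 10 alternating solves the relative reconstruction error is far below
the starting error, which is the behavior a single iteration won't show.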

This is the QR factorization of Y' * Y at the finish? This seems like it
can't be right... Y has only 5 vectors in 10 dimensions, so Y' * Y is
certainly not invertible. I get:

   1.20857  -0.20462   0.08707  -0.16972   0.17038   0.00342   0.24459  -0.23287   0.51142  -0.06083
   0.00000   1.13242   0.23155   0.24354   0.32995   0.47781  -0.02832   0.43071  -0.24968   0.41470
   0.00000   0.00000   0.91070   0.37732   0.05296   0.39886  -0.62426   0.07809   0.53891   0.24877
   0.00000   0.00000   0.00000   0.69369  -0.21648  -0.10501   0.09706  -0.03683  -0.10512   0.02849
   0.00000   0.00000   0.00000   0.00000   0.60165   0.37106  -0.00193  -0.23392   0.10109  -0.09897
   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000  -0.00000  -0.00000  -0.00000   0.00000
   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000  -0.00000  -0.00000  -0.00000
   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000  -0.00000   0.00000
   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000  -0.00000
   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000   0.00000
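The zero rows are exactly what the dimensions predict: with k = 5 factors,
the 10 x 10 Gram matrix Y' * Y has rank 5, so R from its QR factorization
vanishes past row 5. A quick NumPy check of that claim (the dimensions are
the ones from this thread; the random data and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.random((5, 10))      # k x n, as in Y = rand(k, n)

G = Y.T @ Y                  # 10 x 10 Gram matrix Y' * Y
print(np.linalg.matrix_rank(G))   # -> 5, not 10: G is singular

# QR of the singular Gram matrix: R is upper triangular with
# (numerically) zero rows past the rank, matching the printout above.
Q, R = np.linalg.qr(G)
```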

I think there are some other differences here, but probably none that matter
in this context. For example, I was doing implicit-feedback ALS. (But the
result above is from an Octave implementation of "regular" ALS like what
you're running.)

There are a bunch of useful thoughts here that I'm going to both read up on
and explore.


On Thu, Apr 4, 2013 at 8:54 PM, Koobas <[email protected]> wrote:

> BTW, my initialization of X and Y is simply random:
> X = rand(m,k);
> Y = rand(k,n);
>
>
>
> On Thu, Apr 4, 2013 at 3:51 PM, Koobas <[email protected]> wrote:
>
>> It's done in one iteration.
>>  This is the R from QR factorization:
>>
>>     5.0663    5.8122    4.9704    4.3987    6.3400    4.5970    5.0334    4.2581    3.3808    5.3250
>>          0    2.4036    1.1722    2.3296    1.6580    0.4575    1.1706    2.1040    1.6738    1.4839
>>          0         0    1.5085    0.0966    1.2581    0.5236    0.4712   -0.0411    0.3143    0.5957
>>          0         0         0    1.8682    0.1834   -0.3244   -0.0073    0.3817    1.1673    0.4783
>>          0         0         0         0    1.9569    0.8666    0.3201   -0.4167    0.0732    0.3114
>>          0         0         0         0         0    1.3520    0.2326   -0.1156   -0.2793    0.0103
>>          0         0         0         0         0         0    1.1689    0.3151    0.0590    0.0435
>>          0         0         0         0         0         0         0    1.6296   -0.3494   -0.0024
>>          0         0         0         0         0         0         0         0    1.4307    0.1803
>>          0         0         0         0         0         0         0         0         0    1.1404
>>
>>
>>
>>
