I have a feeling it is a problem with the ALS algorithm.
Unlike in SVD, you're not getting orthogonal vectors,
so the X and Y matrices can be rank-deficient.
Multiplying X (or Y) by its transpose then squares the condition number.

In other words, you can probably take an SVD of A and get k nice singular
values, but still end up trying to invert a badly conditioned kxk matrix in ALS.
It probably wouldn't be a bad idea to check whether this is indeed the case.
Brainstorming: is there a chance that a different initialization of X and Y
could improve things?
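A quick numpy sketch of the condition-number point (the matrix here is a made-up stand-in for an ALS factor matrix, not Mahout code): if Y has condition number c, the normal-equations matrix Y'Y has condition number c^2.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 10

# Build a 1000 x k "factor matrix" Y with a known condition number.
# U and V are orthonormal, so cond(Y) is exactly s.max()/s.min() = 1e4.
U, _ = np.linalg.qr(rng.standard_normal((1000, k)))
V, _ = np.linalg.qr(rng.standard_normal((k, k)))
s = np.logspace(0, -4, k)
Y = U @ np.diag(s) @ V.T

cond_Y = np.linalg.cond(Y)          # about 1e4
cond_YtY = np.linalg.cond(Y.T @ Y)  # about 1e8: the condition number squared
print(cond_Y, cond_YtY)
```

So even a moderately conditioned factor matrix produces a much worse kxk system once you form Y'Y, which is why solving the least-squares step via QR (or adding regularization) is usually preferred over inverting the normal equations directly.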




On Thu, Apr 4, 2013 at 10:26 AM, Sean Owen <[email protected]> wrote:

> I think that's what I'm saying, yes. Small rows X shouldn't become
> large rows of A -- and similarly small changes in X shouldn't mean
> large changes in A. Not quite the same thing but both are relevant. I
> see that this is just the ratio of largest and smallest singular
> values. Is there established procedure for evaluating the
> ill-conditioned-ness of matrices -- like a principled choice of
> threshold above which you say it's ill-conditioned, based on k, etc.?
>
> On Thu, Apr 4, 2013 at 3:19 PM, Koobas <[email protected]> wrote:
> > So, the problem is that the kxk matrix is ill-conditioned, or is there
> more
> > to it?
> >
>
