PS: after some more testing, I think the issue is really more like this.

When lambda (the regularization parameter) is high, the X and Y in the
factorization A = X*Y' are forced to have a small (Frobenius) norm.
They underfit A, potentially badly, if lambda is high enough: the
entries of the reconstruction X*Y' stay small and can't easily reach 1
where the original input was 1.
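
To see it concretely, here's a toy numpy sketch (matrix, sizes, seed,
and lambda values all made up) of plain L2-regularized ALS, alternating
the closed-form half-step X = A Y (Y'Y + lam I)^-1; the largest entry
of X*Y' sinks well below 1 as lambda grows:

    import numpy as np

    # Toy click matrix; everything here is illustrative only.
    A = np.array([[1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [1., 1., 0., 0.]])
    k = 2
    rng = np.random.default_rng(0)

    def half_step(A, Y, lam):
        # min_X ||A - X*Y'||^2 + lam*||X||^2  =>  X = A Y (Y'Y + lam I)^-1
        return A @ Y @ np.linalg.inv(Y.T @ Y + lam * np.eye(Y.shape[1]))

    for lam in (0.01, 1.0, 100.0):
        Y = rng.normal(size=(A.shape[1], k))
        for _ in range(20):
            X = half_step(A, Y, lam)
            Y = half_step(A.T, X, lam)
        # Max reconstructed entry falls toward 0 as lam grows.
        print(lam, np.abs(X @ Y.T).max())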

Later you get a new click, a new row A_u = [ 0 0 ... 0 1 0 ... 0 0 ],
and you're roughly solving A_u = X_u * Y' for X_u. But given how small
Y is, the only way to actually reproduce a row like that, with even
one 1, is a very large X_u.
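
In numpy terms, assuming the fold-in is the plain least-squares solve
X_u = A_u Y (Y'Y)^-1 (the actual code may differ), the blow-up looks
like this:

    import numpy as np

    # Pretend a big lambda has already crushed Y to tiny values.
    rng = np.random.default_rng(0)
    Y = 0.01 * rng.normal(size=(4, 2))
    A_u = np.array([0., 0., 1., 0.])  # the new row: one click

    # Plain least-squares fold-in: X_u = A_u Y (Y'Y)^-1. Nothing in it
    # knows about lambda, so it tries to hit the 1 exactly, and with Y
    # this small that takes an enormous X_u.
    X_u = A_u @ Y @ np.linalg.inv(Y.T @ Y)
    print(np.abs(X_u).max())  # ~100x what it would be for Y at unit scale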

The simple fold-in has no concept of the loss function and (by design)
over-states the importance of the new data point by unilaterally
trying to make the new entry in A a "1". In the presence of
way-too-strong regularization, that over-statement becomes a huge one,
and it falls down.
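
For contrast, a fold-in that did keep the loss function in view would
carry lambda into the solve. Just a sketch of that idea, not what any
particular implementation does:

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 100.0
    Y = 0.01 * rng.normal(size=(4, 2))  # factors crushed by a big lambda
    A_u = np.array([0., 0., 1., 0.])

    # Regularized fold-in: X_u = A_u Y (Y'Y + lam I)^-1. The lam*I term
    # stops X_u from exploding, at the cost of not reproducing the 1.
    X_u = A_u @ Y @ np.linalg.inv(Y.T @ Y + lam * np.eye(2))
    print(np.abs(X_u).max())  # stays small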

Anyway -- long story short, a simple check on the inf norm of X' * X
or Y' * Y seems to suffice to decide that lambda is too big, and to
complain about it rather than proceed.
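
One possible shape for that check (the floor value is a made-up
placeholder, not a recommendation):

    import numpy as np

    def check_lambda_sane(Y, floor=1e-3):
        # inf norm of Y'Y = max absolute row sum. Factors crushed toward
        # zero by an oversized lambda make this nearly zero, and fold-in
        # then blows up, so refuse to proceed.
        if np.linalg.norm(Y.T @ Y, np.inf) < floor:
            raise ValueError("lambda looks too large: ||Y'Y||_inf is tiny")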

On Sun, Apr 7, 2013 at 10:00 AM, Sean Owen <sro...@gmail.com> wrote:
> All that said I don't think inverting is the issue here. Using the SVD
> to invert didn't change things, and neither did actually solving the
> Ax=b problem instead of inverting A by using Householder reflections.
