But is it actually QR of Y?

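For concreteness, here is a minimal NumPy sketch of the two routes being discussed: the normal-equations formula Sean gives below, and the alternative of taking a QR decomposition of Y (the fixed factor matrix) and back-substituting. It is dense, ignores missing ratings and regularization, and all variable names are illustrative; it shows the algebra only, not how Mahout implements the solve.

    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_items, k = 6, 8, 3
    A = rng.random((n_users, n_items))   # ratings matrix (users x items)
    Y = rng.random((n_items, k))         # fixed item-feature matrix

    # Normal equations: X = A Y (Y'Y)^-1 -- inverts only the small k x k Gram matrix.
    X_normal = A @ Y @ np.linalg.inv(Y.T @ Y)

    # QR route ("QR of Y"): for one user row a, least squares on Y x ~ a
    # gives x = R^-1 Q' a, so for the whole matrix X = A Q R^-T.
    Q, R = np.linalg.qr(Y)               # Q: n_items x k, R: k x k
    X_qr = np.linalg.solve(R, (A @ Q).T).T

    print(np.allclose(X_normal, X_qr))   # True when Y has full column rank

Both routes produce the same least-squares solution; they differ only in how the small k x k system is handled.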
On Tue, Jan 8, 2013 at 3:41 PM, Sean Owen <[email protected]> wrote:

> There's definitely a QR decomposition in there for me, since solving
> A = X Y' for X gives X = A Y (Y' Y)^-1, and you need some means to
> compute the inverse of that (small) matrix.
>
> On Tue, Jan 8, 2013 at 5:27 PM, Ted Dunning <[email protected]> wrote:
> > This particular part of the algorithm can be seen as similar to a least
> > squares problem that might normally be solved by QR.  I don't think that
> > the updates are quite the same, however.
> >
> > On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <[email protected]> wrote:
> >
> >> This factorization is iteratively refined. In each iteration, ALS first
> >> fixes the item-feature vectors and solves a least-squares problem for
> >> each user and then fixes the user-feature vectors and solves a
> >> least-squares problem for each item.
> >>
>

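Sebastian's description of the alternation maps onto a toy loop like the following. It is a dense, purely illustrative sketch with made-up parameter names (k, lam, iterations); a real implementation such as Mahout's solves a separate system per user and per item over the observed ratings only.

    import numpy as np

    def als(A, k=3, iterations=10, lam=0.1, seed=0):
        rng = np.random.default_rng(seed)
        Y = rng.random((A.shape[1], k))      # item-feature matrix, random start
        reg = lam * np.eye(k)
        for _ in range(iterations):
            # fix the item-feature vectors, solve the per-user least-squares problems
            X = A @ Y @ np.linalg.inv(Y.T @ Y + reg)
            # fix the user-feature vectors, solve the per-item least-squares problems
            Y = A.T @ X @ np.linalg.inv(X.T @ X + reg)
        return X, Y

    A = np.random.default_rng(1).random((6, 8))
    X, Y = als(A)
    print(np.linalg.norm(A - X @ Y.T))       # reconstruction error after refinement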