Great.

On Tue, Jan 8, 2013 at 4:25 PM, Koobas <[email protected]> wrote:

> On Tue, Jan 8, 2013 at 7:18 PM, Ted Dunning <[email protected]> wrote:
>
> > But is it actually QR of Y?
> >
> >
> Ted,
> This is my understanding:
> In the process of solving the least squares problem,
> you end up inverting a small square matrix, (Y' * Y), to get (Y' * Y)^-1.
> How it is done is irrelevant.
> Since the matrix is square, one could do LU factorization, a.k.a. Gaussian
> elimination.
> However, since we are talking here about solving a 100x100 problem,
> one might as well do it with QR factorization, which, unlike LU, is stable
> "no matter what".
>
>
>
> > On Tue, Jan 8, 2013 at 3:41 PM, Sean Owen <[email protected]> wrote:
> >
> > > There's definitely a QR decomposition in there for me, since solving
> > > A = X Y' for X is X = A Y (Y' * Y)^-1, and you need some means to
> > > compute the inverse of that (small) matrix.
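A toy numpy rendering of that formula, purely illustrative and with made-up
names; rather than forming the inverse explicitly, one would normally solve
(Y' * Y) X' = (A Y)':

import numpy as np

def solve_for_X(A, Y):
    # X = A Y (Y' * Y)^-1, for all users at once; ignores regularization.
    G = Y.T @ Y                       # small k x k matrix
    rhs = (A @ Y).T                   # k x n_users
    return np.linalg.solve(G, rhs).T  # each row of X is one user's factors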
> > >
> > > On Tue, Jan 8, 2013 at 5:27 PM, Ted Dunning <[email protected]> wrote:
> > > > This particular part of the algorithm can be seen as similar to a least
> > > > squares problem that might normally be solved by QR.  I don't think that
> > > > the updates are quite the same, however.
> > > >
> > > > On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <[email protected]> wrote:
> > > >
> > > >> This factorization is iteratively refined. In each iteration, ALS first
> > > >> fixes the item-feature vectors and solves a least-squares problem for
> > > >> each user and then fixes the user-feature vectors and solves a
> > > >> least-squares problem for each item.
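To make the alternation concrete, here is a tiny dense numpy sketch of that
loop (illustrative only: made-up names, no regularization, no handling of
missing ratings):

import numpy as np

def als(A, k, iterations=10, seed=0):
    # A is the (n_users x n_items) rating matrix, k the number of features.
    n_users, n_items = A.shape
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n_users, k))
    Y = rng.standard_normal((n_items, k))
    for _ in range(iterations):
        # Fix Y, solve a least-squares problem for every user (rows of X).
        X = np.linalg.solve(Y.T @ Y, (A @ Y).T).T
        # Fix X, solve a least-squares problem for every item (rows of Y).
        Y = np.linalg.solve(X.T @ X, (A.T @ X).T).T
    return X, Y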
> > > >>
> > >
> >
>
