Can you refer to which documentation you are looking at?

ALS is more like a block gradient descent SVD than like a QR.  There are
relationships at each step, but I don't think they are identical.

Others can comment more authoritatively.
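For what it's worth, on the complexity question: even though the ratings
matrix is very sparse, each half-iteration of ALS solves one small dense
system per row, built only from that row's observed entries, so the QR (or
any dense solver) is applied to a k x k matrix, not to the full user-item
matrix.  Here is a minimal sketch of the user-factor update in plain NumPy
(not Mahout's actual code; the function name, the NaN-for-missing encoding,
and the lambda * n_u regularization weighting are my assumptions for
illustration):

```python
import numpy as np

def als_user_step(R, M, lam):
    """One ALS half-iteration: solve for user factors U given item
    factors M.  R is a ratings matrix with np.nan marking missing
    entries; M has shape (num_items, k).  (Illustrative sketch only,
    not Mahout's implementation.)"""
    num_users = R.shape[0]
    k = M.shape[1]
    U = np.zeros((num_users, k))
    for u in range(num_users):
        observed = ~np.isnan(R[u])       # items this user actually rated
        M_u = M[observed]                # (n_u, k) -- small and dense
        r_u = R[u, observed]
        # Normal equations: (M_u^T M_u + lam * n_u * I) x = M_u^T r_u.
        # The system is only k x k, so the cost per user is roughly
        # O(n_u * k^2 + k^3), independent of the total number of items.
        A = M_u.T @ M_u + lam * observed.sum() * np.eye(k)
        U[u] = np.linalg.solve(A, M_u.T @ r_u)
    return U
```

So the sparsity is exploited before the dense factorization ever runs: only
the rows of M corresponding to observed ratings enter the k x k system.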

On Tue, Jan 8, 2013 at 2:55 PM, Koobas <[email protected]> wrote:

> I am trying to wrap my head around it.
> From the Mahout documentation it looks like it's a standard (dense) QR with
> Householder reflectors.
> But the user-item matrix is usually extremely sparse.
> So, is the sparsity somehow taken into account,
> or are the sparse right-hand-side vectors packed in dense storage and hit
> with Householder?
> The underlying question being the computational complexity, i.e. number of
> floating point operations involved.
>
>
> On Tue, Jan 8, 2013 at 4:03 PM, Sebastian Schelter <[email protected]> wrote:
>
> > Hi Koobas,
> >
> > We have two classes that implement the solutions described in the
> > ALS-related papers:
> >
> > For explicit feedback data [1] we have AlternatingLeastSquaresSolver,
> > for implicit feedback data [2] we have
> > ImplicitFeedbackAlternatingLeastSquaresSolver. Both can be found in the
> > org.apache.mahout.math.als package.
> >
> > Internally they use Mahout's QRDecomposition to solve the linear systems
> > associated with ALS.
> >
> > Best,
> > Sebastian
> >
> > [1]
> >
> >
> http://www.hpl.hp.com/personal/Robert_Schreiber/papers/2008%20AAIM%20Netflix/netflix_aaim08(submitted).pdf
> > [2] http://research.yahoo.com/pub/2433
> >
> >
> > On 08.01.2013 21:53, Koobas wrote:
> > > Perhaps somebody can shed some light on the topic.
> > > What algorithm is used to solve the least squares problem
> > > when computing low-rank approximation using Alternating Least Squares?
> > >
> >
> >
>