That's right, though 80% of the algorithm is in how you efficiently do
"A \ Users", and in the presence of a regularization term to control
overfitting. You can definitely do it sparsely when the loss function
is set up cleverly.

I tried to explain it here, from slide 6:
http://www.slideshare.net/srowen/big-practical-recommendations-with-alternating-least-squares
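
The gist, in Octave-ish pseudocode (a sketch, not the exact code in
Mahout; k is the rank and lambda the regularization weight): each
user's update only touches the items that user actually rated, so the
sparse ratings matrix never gets densified.

   % A: m x n sparse ratings, V: n x k item features
   U = zeros(m, k);
   for u = 1:m
      J  = find(A(u,:));             % items this user rated
      Vj = V(J,:);                   % their feature vectors
      % regularized normal equations; scaling lambda by the number of
      % ratings ("weighted lambda") is one common choice
      M = Vj' * Vj + lambda * numel(J) * eye(k);
      U(u,:) = (M \ (Vj' * full(A(u,J))'))';
   endfor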

On Tue, Jan 8, 2013 at 5:16 PM, Mohit Singh <[email protected]> wrote:
>    Pseudocode for ALS in Octave (k is the rank of the factorization):
>
>    Users = rand(m, k);
>    for i = 1:iterations
>       Items = A \ Users;    % fix Users, solve for Items
>       Users = A' \ Items;   % fix Items, solve for Users
>    endfor
>
>    Start with a random guess of the matrix Users (m x k), then compute
>    Items as the least-squares solution given Users, then recompute
>    Users as the least-squares solution given Items, and iterate until
>    convergence.
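>
>    A minimal stopping test (the 1e-4 tolerance here is illustrative)
>    is to watch how much Users moves between sweeps:
>
>    for i = 1:iterations
>       prevUsers = Users;
>       Items = A \ Users;
>       Users = A' \ Items;
>       if norm(Users - prevUsers, 'fro') < 1e-4 * norm(Users, 'fro')
>          break;                     % relative change is small enough
>       endif
>    endfor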
>
>
>
>
> On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <[email protected]> wrote:
>
>> Koobas,
>>
>> ALS doesn't apply QR decomposition to the user-item matrix. It
>> factorizes this matrix into two smaller matrices which contain the
>> user and item features.
>>
>> Each user-feature vector is modeled as a linear combination of the
>> feature vectors of the items which the user rated (and vice versa for
>> the item-feature vectors).
>>
>> This factorization is iteratively refined. In each iteration, ALS first
>> fixes the item-feature vectors and solves a least-squares problem for
>> each user and then fixes the user-feature vectors and solves a
>> least-squares problem for each item.
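>>
>> In symbols: if V_i holds the feature vectors of the items user i
>> rated (one per row) and a_i the corresponding ratings, the user step
>> solves
>>
>>    min over u_i of || V_i * u_i - a_i ||^2
>>
>> whose normal equations (V_i' * V_i) * u_i = V_i' * a_i form a small
>> k x k system per user (plus a regularization term in practice).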
>>
>> I suggest that you read the papers I referred to in my last mail,
>> which explain the algorithm in detail.
>>
>> Best,
>> Sebastian
>>
>>
>> On 08.01.2013 23:55, Koobas wrote:
>> > I am trying to wrap my head around it.
>> > From the Mahout documentation it looks like it's a standard (dense)
>> > QR with Householder reflectors.
>> > But the user-item matrix is usually extremely sparse.
>> > So, is the sparsity somehow taken into account, or are the sparse
>> > right-hand-side vectors packed into dense storage and hit with
>> > Householder?
>> > The underlying question is the computational complexity, i.e. the
>> > number of floating-point operations involved.
>> >
>> >
>> > On Tue, Jan 8, 2013 at 4:03 PM, Sebastian Schelter <[email protected]> wrote:
>> >
>> >> Hi Koobas,
>> >>
>> >> We have two classes that implement the solutions described in the
>> >> ALS-related papers:
>> >>
>> >> For explicit feedback data [1] we have AlternatingLeastSquaresSolver,
>> >> for implicit feedback data [2] we have
>> >> ImplicitFeedbackAlternatingLeastSquaresSolver. Both can be found in
>> >> the org.apache.mahout.math.als package.
>> >>
>> >> Internally they use Mahout's QRDecomposition to solve the linear
>> >> systems associated with ALS.
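>> >>
>> >> For intuition, solving one such k x k system M * x = b via a QR
>> >> decomposition looks like this in Octave notation (illustrative,
>> >> not Mahout's actual code):
>> >>
>> >>    [Q, R] = qr(M);     % M = Q * R, Q orthogonal, R upper triangular
>> >>    x = R \ (Q' * b);   % back-substitute on the triangular system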
>> >>
>> >> Best,
>> >> Sebastian
>> >>
>> >> [1] http://www.hpl.hp.com/personal/Robert_Schreiber/papers/2008%20AAIM%20Netflix/netflix_aaim08(submitted).pdf
>> >> [2] http://research.yahoo.com/pub/2433
>> >>
>> >>
>> >> On 08.01.2013 21:53, Koobas wrote:
>> >>> Perhaps somebody can shed some light on the topic.
>> >>> What algorithm is used to solve the least-squares problem
>> >>> when computing a low-rank approximation using Alternating Least Squares?
>> >>>
>> >>
>> >>
>> >
>>
>>
>
>
> --
> Mohit
>
> "When you want success as badly as you want the air, then you will get it.
> There is no other secret of success."
> -Socrates
