[ https://issues.apache.org/jira/browse/MAHOUT-1365?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dmitriy Lyubimov updated MAHOUT-1365:
-------------------------------------

    Attachment: distributed-als-with-confidence.pdf

Oh, the confidence matrix C is not sparse per se, but if there is a base 
confidence c_0 such that subtracting it from each element of C turns C into a 
sparse matrix C', then we can use that matrix as the input (along with the c_0 
parameter). This is further clarified in the attachment (which is basically 
just a synopsis of both papers, for my own sake). See attached.
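
To spell out the algebra (just a sketch of the Hu/Koren/Volinsky-style normal 
equations with the c_0 split; the notation here is mine and may differ from the 
attachment): writing the per-user confidence as

    C^u = c_0 I + C'^u,    with C'^u sparse,

the user-factor update becomes

    x_u = (Y^T C^u Y + \lambda I)^{-1} Y^T C^u p_u
        = (c_0 Y^T Y + Y^T C'^u Y + \lambda I)^{-1} (c_0 Y^T p_u + Y^T C'^u p_u)

(or \lambda n_u I if the ALS-WR-style weighted regularization is used). The 
point is that c_0 Y^T Y is computed once per sweep and shared by all users, 
while Y^T C'^u Y and the right-hand side only touch the nonzero entries of row 
u of C' and p, so the per-user cost scales with the number of observed 
interactions rather than with the total number of items. A rough Scala sketch 
of the corresponding per-user solve is below the quoted description.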

> Weighted ALS-WR iterator for Spark
> ----------------------------------
>
>                 Key: MAHOUT-1365
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1365
>             Project: Mahout
>          Issue Type: Task
>            Reporter: Dmitriy Lyubimov
>            Assignee: Dmitriy Lyubimov
>             Fix For: Backlog
>
>         Attachments: distributed-als-with-confidence.pdf
>
>
> Given preference P and confidence C distributed sparse matrices, compute the 
> ALS-WR solution for implicit feedback (Spark Bagel version), 
> following the Hu-Koren-Volinsky method (stripping off any concrete methodology 
> for building the C matrix), with a parameterized convergence test.
> The computational scheme follows the ALS-WR method (which should be 
> slightly more efficient for sparser inputs). 
> The best performance will be achieved if non-sparse anomalies are prefiltered 
> (eliminated), such as an anomalously active user that does not represent a 
> typical user anyway.
> The work is going on here: 
> https://github.com/dlyubimov/mahout-commits/tree/dev-0.9.x-scala. I am 
> porting away our (A1) implementation, so there are a few issues associated 
> with that.
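
For concreteness, here is a rough sketch of the per-user solve described above, 
exploiting the c_0 + C' split so that only observed entries are touched. This 
is illustrative only: it uses Breeze for the dense k x k solve, and the method 
and parameter names are made up, not taken from the dev-0.9.x-scala branch.

import breeze.linalg.{DenseMatrix, DenseVector}

// One half-iteration step for a single user (the item side is symmetric):
//   Y        - n x k item-factor matrix (row i = y_i)
//   yty      - precomputed Y.t * Y (k x k), shared by every user in the sweep
//   observed - (itemIndex, p_ui, c'_ui) triples for this user's nonzero entries
//   c0       - base confidence, lambda - regularization
def solveUser(Y: DenseMatrix[Double],
              yty: DenseMatrix[Double],
              observed: Seq[(Int, Double, Double)],
              c0: Double,
              lambda: Double): DenseVector[Double] = {
  val k = Y.cols
  val a = yty * c0                        // c0 * Y'Y, dense k x k
  val b = DenseVector.zeros[Double](k)
  for ((i, p, cPrime) <- observed) {
    val yi = Y(i, ::).t.copy              // item factor as a column vector
    a += (yi * yi.t) * cPrime             // + c'_ui * y_i y_i'
    b += yi * ((c0 + cPrime) * p)         // + (c0 + c'_ui) * p_ui * y_i
  }
  for (d <- 0 until k) a(d, d) += lambda  // + lambda * I (or lambda * n_u)
  a \ b                                   // dense k x k solve for x_u
}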



--
This message was sent by Atlassian JIRA
(v6.1#6144)
