[ https://issues.apache.org/jira/browse/SPARK-7008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14504369#comment-14504369 ]

Xiangrui Meng commented on SPARK-7008:
--------------------------------------

[~podongfeng] Your implementation assumes that the model can be stored locally, 
which is not true for big models. [~gq]'s GraphX-based implementation should 
have better scalability, but it is slower on small datasets. We need more time to 
understand the algorithm and decide whether to include it in MLlib. As Sean 
suggested, it would be nice if you could submit both packages to 
spark-packages.org. 

[~podongfeng] and [~gq], I like the simplicity and the expressiveness of FM. I 
have a few questions to understand FM better. FM uses SGD on a non-convex 
objective. What convergence rate have you observed in practice? Is it 
sensitive to local minima (run FM multiple times and see whether there is 
large variance in the objective values)? Is it sensitive to the learning rate?


> An Implementation of Factorization Machine (LibFM)
> --------------------------------------------------
>
>                 Key: SPARK-7008
>                 URL: https://issues.apache.org/jira/browse/SPARK-7008
>             Project: Spark
>          Issue Type: New Feature
>          Components: MLlib
>    Affects Versions: 1.3.0, 1.3.1, 1.3.2
>            Reporter: zhengruifeng
>              Labels: features, patch
>
> An implementation of Factorization Machines based on Scala and Spark MLlib.
> Factorization Machines are a class of machine learning model for 
> multi-linear regression and are widely used for recommendation.
> Factorization Machines have performed well in recent years' recommendation 
> competitions.
> Ref:
> http://libfm.org/
> http://doi.acm.org/10.1145/2168752.2168771
> http://www.inf.uni-konstanz.de/~rendle/pdf/Rendle2010FM.pdf



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
