What about putting a custom ALS implementation into Spark's namespace?
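
A minimal sketch of that idea, using made-up names (CustomALS, trainWithCustomObjective) that are not part of Spark: compile the modified code as a small separate jar, but declare it inside Spark's own package so that helpers marked private[recommendation] or private[spark] in ALS.scala become visible to it. Exactly which pieces of the fit -> train -> computeFactors path can be reached this way depends on their declared visibility in the Spark version you compile against; anything that is plain private still has to be copied rather than called.

// Hypothetical sketch, not Spark API: a modified ALS placed in Spark's
// namespace so it can reuse package-private internals without a full
// Spark rebuild.
package org.apache.spark.ml.recommendation

import org.apache.spark.rdd.RDD

object CustomALS {
  // Adapted from ALS.train / computeFactors in ALS.scala, with the
  // modified objective plugged into the factor-update step; the body is
  // omitted here.
  def trainWithCustomObjective[ID](ratings: RDD[ALS.Rating[ID]]): Unit = {
    ??? // modified fit -> train -> computeFactors logic would go here
  }
}

The obvious caveat is that this couples the jar to Spark internals, so it can break between Spark releases and should be pinned to the exact version it was compiled against.
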
harini <philly.har...@gmail.com> wrote on Thu, Dec 8, 2016, at 00:01:

> Hi all, I am trying to implement ALS with a slightly modified objective
> function, which will require minor changes to fit -> train -> computeFactors
> within ALS.scala
> <https://github.com/apache/spark/blob/v1.6.2/mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala>.
> Is there a way to do this without having to build Spark in its entirety?
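
On the "without having to build Spark in its entirety" part of the question above, a minimal build sketch, assuming sbt and illustrative names and versions: only the custom class is compiled, against spark-mllib as a provided dependency, and the resulting jar is shipped at submit time.

// build.sbt for the custom-ALS jar; name and versions are illustrative.
// Spark itself is "provided": compiled against, never rebuilt or bundled.
name := "custom-als"
version := "0.1.0"
scalaVersion := "2.10.6"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.6.2" % "provided"

The jar produced by sbt package (e.g. target/scala-2.10/custom-als_2.10-0.1.0.jar) can then be passed to spark-submit via --jars, or bundled into the application assembly, without touching the Spark build itself.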
