You can write your own code, e.g. a custom estimator or transformer, in
Spark's namespace.
http://stackoverflow.com/a/40785438/2587904 might help you get started.
Be aware that anything relying on private, i.e. Spark-internal, APIs may
change from release to release.
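To make that concrete, here is a minimal sketch of a custom transformer
declared under the org.apache.spark.ml.recommendation package, assuming
Spark 2.x. The class name and pass-through logic are placeholders; sitting
inside Spark's namespace only matters if you want to reach package-private
internals such as those used by ALS.

  package org.apache.spark.ml.recommendation

  import org.apache.spark.ml.Transformer
  import org.apache.spark.ml.param.ParamMap
  import org.apache.spark.ml.util.Identifiable
  import org.apache.spark.sql.{DataFrame, Dataset}
  import org.apache.spark.sql.types.StructType

  // Hypothetical custom transformer; replace the body with your own logic.
  class MyALSPostprocessor(override val uid: String) extends Transformer {

    def this() = this(Identifiable.randomUID("myALSPostprocessor"))

    override def transform(dataset: Dataset[_]): DataFrame = {
      // Placeholder: pass the data through unchanged.
      dataset.toDF()
    }

    override def transformSchema(schema: StructType): StructType = schema

    override def copy(extra: ParamMap): Transformer = defaultCopy(extra)
  }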

You will definitely need the spark-mllib dependency.
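With sbt, for example, the dependency could be declared as below; the
version and the "provided" scope are assumptions, so match them to the
Spark version on your cluster.

  libraryDependencies ++= Seq(
    // The Spark runtime usually supplies these jars, hence "provided".
    "org.apache.spark" %% "spark-sql"   % "2.0.2" % "provided",
    "org.apache.spark" %% "spark-mllib" % "2.0.2" % "provided"
  )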

For my own use case, I have not needed to build a separate version of
MLlib.
harini <philly.har...@gmail.com> wrote on Thu, 8 Dec 2016 at 00:23:

> I am new to development with Spark; how do I do that? Can I write a
> custom implementation under the package org.apache.spark.ml.recommendation,
> and specify "spark-mllib" along with others as a library dependency?
