You can extend Gradient (see the built-in implementations, e.g., https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/optimization/Gradient.scala#L266) and plug it into GradientDescent via setGradient: https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/optimization/GradientDescent.scala#L149. setGradient takes a Gradient object that computes both the loss and its gradient, so the loss function itself is defined inside your Gradient subclass. Please note that this is a developer API. -Xiangrui
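A minimal sketch of what that can look like, assuming by "x^2" you mean a squared-error loss on the prediction error (this is essentially what Spark's built-in LeastSquaresGradient does, up to a factor of 1/2; the class name SquaredErrorGradient here is hypothetical):

```scala
import org.apache.spark.mllib.linalg.{DenseVector, Vector, Vectors}
import org.apache.spark.mllib.optimization.Gradient

// Hypothetical custom gradient for the loss L(w) = (w.x - y)^2,
// whose gradient w.r.t. w is 2 * (w.x - y) * x.
class SquaredErrorGradient extends Gradient {

  private def dot(a: Vector, b: Vector): Double =
    a.toArray.zip(b.toArray).map { case (u, v) => u * v }.sum

  // Returns (gradient, loss) for a single example.
  override def compute(data: Vector, label: Double, weights: Vector): (Vector, Double) = {
    val diff = dot(data, weights) - label
    val gradient = Vectors.dense(data.toArray.map(_ * 2.0 * diff))
    (gradient, diff * diff)
  }

  // In-place variant: accumulates this example's gradient into
  // cumGradient and returns the loss (avoids allocating a new vector).
  override def compute(
      data: Vector,
      label: Double,
      weights: Vector,
      cumGradient: Vector): Double = {
    val diff = dot(data, weights) - label
    val cum = cumGradient.asInstanceOf[DenseVector].values
    val x = data.toArray
    var i = 0
    while (i < x.length) {
      cum(i) += 2.0 * diff * x(i)
      i += 1
    }
    diff * diff
  }
}
```

Since the GradientDescent constructor is private to Spark, you would typically set this on the optimizer of an existing algorithm, e.g. `new LinearRegressionWithSGD().optimizer.setGradient(new SquaredErrorGradient())`, or call the developer API `GradientDescent.runMiniBatchSGD(...)` directly, passing your Gradient together with an Updater.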
On Fri, Mar 27, 2015 at 7:11 PM, shmoanne <jls...@eng.ucsd.edu> wrote:
> I am working with the mllib.optimization.GradientDescent class and I'm
> confused about how to set a custom loss function with setGradient.
>
> For instance, if I wanted my loss function to be x^2, how would I go
> about setting it using setGradient?
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Setting-a-custom-loss-function-for-GradientDescent-tp22263.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> ---------------------------------------------------------------------