Hi Spark Community,

I want to call the Spark optimizer from PySpark with a self-defined loss function.
Do you know whether PySpark will get an optimization module in the future,
like org.apache.spark.mllib.optimization in Scala?

Thank you,
Chunpeng