huaxingao commented on a change in pull request #28960:
URL: https://github.com/apache/spark/pull/28960#discussion_r448023875



##########
File path: mllib/src/main/scala/org/apache/spark/ml/regression/FMRegressor.scala
##########
@@ -47,7 +47,7 @@ import org.apache.spark.storage.StorageLevel
  */
 private[ml] trait FactorizationMachinesParams extends PredictorParams
   with HasMaxIter with HasStepSize with HasTol with HasSolver with HasSeed
-  with HasFitIntercept with HasRegParam {
+  with HasFitIntercept with HasRegParam with HasWeightCol {

Review comment:
       Add ```with HasWeightCol``` because ```ClassificationSummary``` uses weightCol. However, FM does not really support instance weights yet, so all weights default to 1.0.
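
For context, a minimal sketch (with hypothetical names, not the actual FMRegressor code path) of the usual Spark ML convention this relies on: when no weight column is configured, every instance falls back to weight 1.0.

```scala
import org.apache.spark.sql.{Column, DataFrame}
import org.apache.spark.sql.functions.{col, lit}

// Hypothetical helper, not the FMRegressor code path: resolve the per-instance weight.
// If no weight column is configured (or it is empty), fall back to a constant 1.0.
def instanceWeight(df: DataFrame, weightCol: Option[String]): Column =
  weightCol.filter(c => c.nonEmpty && df.columns.contains(c)) match {
    case Some(c) => col(c)   // user-supplied weight column
    case None    => lit(1.0) // default: uniform instance weight of 1.0
  }
```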

##########
File path: mllib/src/main/scala/org/apache/spark/mllib/optimization/GradientDescent.scala
##########
@@ -195,7 +195,7 @@ object GradientDescent extends Logging {
         s"numIterations=$numIterations and 
miniBatchFraction=$miniBatchFraction")
     }
 
-    val stochasticLossHistory = new ArrayBuffer[Double](numIterations)
+    val stochasticLossHistory = new ArrayBuffer[Double](numIterations + 1)

Review comment:
       Make this stochasticLossHistory contain initial state + the state for 
each iteration, so it is consistent with the objectiveHistory in 
LogisticRegression and LinearRegression
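
To illustrate the intended sizing, a minimal sketch (hypothetical names, not the GradientDescent implementation): the buffer records the loss once before the first update and once after each of the numIterations updates, so it ends up holding numIterations + 1 values.

```scala
import scala.collection.mutable.ArrayBuffer

// Sketch only: record the initial loss plus one entry per iteration,
// mirroring how objectiveHistory includes the initial objective value.
def lossHistorySketch(numIterations: Int, initialLoss: Double, step: Double => Double): Array[Double] = {
  val stochasticLossHistory = new ArrayBuffer[Double](numIterations + 1)
  var loss = initialLoss
  stochasticLossHistory += loss        // initial state, before any update
  for (_ <- 1 to numIterations) {
    loss = step(loss)                  // one gradient-descent update
    stochasticLossHistory += loss
  }
  stochasticLossHistory.toArray        // length == numIterations + 1
}
```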




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


