Github user holdenk commented on a diff in the pull request:
https://github.com/apache/spark/pull/20629#discussion_r195153337
--- Diff:
mllib/src/main/scala/org/apache/spark/ml/evaluation/ClusteringEvaluator.scala
---
@@ -64,12 +65,12 @@ class ClusteringEvaluator @Since("2.3.0")
(@Since("2.3.0") override val uid: Str
/**
* param for metric name in evaluation
- * (supports `"silhouette"` (default))
+ * (supports `"silhouette"` (default), `"kmeansCost"`)
--- End diff ---
Generally speaking, I think it would make sense to maintain the fall-back
metric until at least Spark 3.0, at which point we could ask on the user and
dev lists to see whether anyone has a hard dependency on it or whether it is
safe to remove.
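For context, Spark ML params like `metricName` are usually restricted to a
fixed set of allowed values (in Spark this is done with
`ParamValidators.inArray`). A minimal standalone sketch of that validation,
with the Spark classes omitted and the helper name `validateMetric`
hypothetical, assuming the allowed set from the diff above:

```scala
// Allowed metric names, per the doc change in this diff.
val supportedMetrics = Set("silhouette", "kmeansCost")

// Hypothetical standalone stand-in for ParamValidators.inArray:
// reject any metric name outside the supported set.
def validateMetric(name: String): String = {
  require(supportedMetrics.contains(name), s"Unsupported metric: $name")
  name
}
```

Keeping `"kmeansCost"` in this set through the deprecation window would let
existing callers pass it without error until the 3.0 discussion settles.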
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]