GitHub user holdenk commented on a diff in the pull request:
https://github.com/apache/spark/pull/20058#discussion_r159020312
--- Diff: python/pyspark/ml/base.py ---
@@ -47,6 +86,28 @@ def _fit(self, dataset):
"""
raise NotImplementedError()
+ @since("2.3.0")
+ def fitMultiple(self, dataset, params):
--- End diff ---
So in Scala Spark we use the `fit` function rather than separate functions.
Also the `params` name is different from the Scala one. Any reason for the
difference?
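
For context, the pattern under review can be sketched with toy classes. This is
a hedged illustration only, not PySpark's actual implementation: `ToyEstimator`
and `ToyModel` are invented names, and the real `fitMultiple` in the PR may
differ in signature and threading behavior. The idea is to fit one dataset
against several param maps and yield `(index, model)` pairs:

```python
class ToyModel:
    """Stand-in for a fitted model; just scales its inputs."""
    def __init__(self, scale):
        self.scale = scale

    def transform(self, xs):
        return [x * self.scale for x in xs]


class ToyEstimator:
    """Stand-in for an Estimator with both fit and fitMultiple."""
    def fit(self, dataset, param_map):
        # "Training" here is trivial: read the scale from the param map.
        return ToyModel(param_map["scale"])

    def fitMultiple(self, dataset, paramMaps):
        # Yield (index, model) pairs. A real implementation could fit the
        # param maps concurrently and yield in completion order, which is
        # one motivation for a separate method rather than overloading fit.
        for index, param_map in enumerate(paramMaps):
            yield index, self.fit(dataset, param_map)


est = ToyEstimator()
models = dict(est.fitMultiple([1, 2, 3], [{"scale": 2}, {"scale": 10}]))
print(models[1].transform([1, 2, 3]))  # -> [10, 20, 30]
```

By contrast, Scala's `Estimator.fit(dataset, paramMaps)` overload returns the
fitted models directly, which is the discrepancy the comment above asks about.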
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]