Github user yanboliang commented on the issue:
https://github.com/apache/spark/pull/12790
@BryanCutler @MechCoder The current fix of removing the default value for
the ```stages``` param is OK with me. But we should also discuss the behavior of
```stages=[]```, which is inconsistent between Scala and Python. On the Scala
side, if we set ```stages``` with ```Array.empty()``` or ```Array()```, it
throws an exception rather than acting as an identity transformer:
```
java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to
[Lorg.apache.spark.ml.PipelineStage;
```
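To illustrate the inconsistency, here is a minimal sketch of the Scala side (the ```EmptyStagesRepro``` wrapper is only for illustration, and the exact point where the cast fails may depend on the Spark/Scala version):

```scala
import org.apache.spark.ml.Pipeline

// Illustrative sketch only. PySpark's Pipeline(stages=[]) fits successfully and
// its model returns the input DataFrame unchanged (identity transformer);
// the equivalent Scala call below fails instead.
object EmptyStagesRepro {
  def main(args: Array[String]): Unit = {
    // Throws the ClassCastException shown above, presumably because the empty
    // array reaches the JVM as Object[] and the internal cast to
    // Array[PipelineStage] fails.
    val pipeline = new Pipeline().setStages(Array.empty)
    println(pipeline.getStages.length)
  }
}
```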
I think we should make this behavior consistent between Scala and Python as well.
Meanwhile, would you mind updating the title and description of the JIRA and PR
to reflect the current fix of this bug? Thanks!