Github user MLnick commented on the issue:
https://github.com/apache/spark/pull/19993
Well yes it would - but the method checks inputCols/inputCol first, so it will
always fail for that reason here, i.e. we aren't actually testing the full
code path.
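
For example, something along these lines would hit the splits/splitsArray
check directly (a rough sketch only, not proposing exact code; it reuses the
two-column df and the new testExclusiveParams helper from the diff below,
with illustrative values):

    // Hypothetical extra case, for illustration only: inputCol and outputCol
    // are set but their multi-column counterparts are not, so the only
    // conflicting pair left is splits vs. splitsArray, and the exception can
    // only come from that check.
    ParamsSuite.testExclusiveParams(new Bucketizer, df,
      ("inputCol", "feature1"),
      ("outputCol", "result1"),
      ("splits", Array(-0.5, 0.0, 0.5)),
      ("splitsArray", Array(Array(-0.5, 0.0, 0.5), Array(0.0, 1.0))))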
On Mon, 22 Jan 2018 at 16:43, Marco Gaido <[email protected]> wrote:
> *@mgaido91* commented on this pull request.
> ------------------------------
>
> In mllib/src/test/scala/org/apache/spark/ml/feature/BucketizerSuite.scala
> <https://github.com/apache/spark/pull/19993#discussion_r162955686>:
>
> > -  test("Both inputCol and inputCols are set") {
> > -    val bucket = new Bucketizer()
> > -      .setInputCol("feature1")
> > -      .setOutputCol("result")
> > -      .setSplits(Array(-0.5, 0.0, 0.5))
> > -      .setInputCols(Array("feature1", "feature2"))
> > -
> > -    // When both are set, we ignore `inputCols` and just map the column specified by `inputCol`.
> > -    assert(bucket.isBucketizeMultipleColumns() == false)
> > +  test("assert exception is thrown if both multi-column and single-column params are set") {
> > +    val df = Seq((0.5, 0.3), (0.5, -0.4)).toDF("feature1", "feature2")
> > +    ParamsSuite.testExclusiveParams(new Bucketizer, df, ("inputCol", "feature1"),
> > +      ("inputCols", Array("feature1", "feature2")))
> > +    ParamsSuite.testExclusiveParams(new Bucketizer, df, ("outputCol", "result1"),
> > +      ("outputCols", Array("result1", "result2")))
> > +    ParamsSuite.testExclusiveParams(new Bucketizer, df, ("splits", Array(-0.5, 0.0, 0.5)),
>
> @MLnick <https://github.com/mlnick> actually it will fail for both
> reasons. We can add more test cases to check each of these two cases if
> you think it is needed.
>