Ratan Rai Sur created SPARK-21685:
-------------------------------------

             Summary: Params isSet in scala Transformer triggered by _setDefault in pyspark
                 Key: SPARK-21685
                 URL: https://issues.apache.org/jira/browse/SPARK-21685
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.1.0
            Reporter: Ratan Rai Sur
I'm trying to write a PySpark wrapper for a Transformer whose transform method includes the line:

{code:scala}
require(!(isSet(outputNodeName) && isSet(outputNodeIndex)), "Can't set both outputNodeName and outputNodeIndex")
{code}

This should only throw an exception when both of these parameters are explicitly set. However, the PySpark wrapper for the Transformer has this line in __init__:

{code:python}
self._setDefault(outputNodeIndex=0)
{code}

Here is the line in the main Python script showing how the model is configured:

{code:python}
cntkModel = CNTKModel().setInputCol("images").setOutputCol("output").setModelLocation(spark, model.uri).setOutputNodeName("z")
{code}

As you can see, only setOutputNodeName is explicitly set, yet the exception is still thrown: the default set via _setDefault in PySpark apparently reaches the Scala Transformer as an explicitly set value, so isSet(outputNodeIndex) returns true on the JVM side.

If you need more context, https://github.com/RatanRSur/mmlspark/tree/default-cntkmodel-output is the branch with the code; the files I'm referring to here are CNTKModel.scala and _CNTKModel.py.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
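To make the expected behavior concrete, here is a minimal, self-contained sketch (plain Python, no Spark dependency; the class and method names merely mimic Spark ML's Params API and are illustrative, not the actual implementation) of the default-vs-explicitly-set distinction that the require() guard relies on — a default for outputNodeIndex should not count as "set":

{code:python}
class ParamsSketch:
    """Hypothetical mimic of Spark ML Param semantics: defaults live in a
    separate map from explicitly set values."""

    def __init__(self):
        self._defaultParamMap = {}  # values from _setDefault
        self._paramMap = {}         # values the user explicitly set

    def _setDefault(self, **kwargs):
        # Defaults should NOT count as "set".
        self._defaultParamMap.update(kwargs)
        return self

    def set(self, **kwargs):
        self._paramMap.update(kwargs)
        return self

    def isSet(self, name):
        # True only for explicitly set params.
        return name in self._paramMap

    def isDefined(self, name):
        # True if explicitly set OR carrying a default.
        return name in self._paramMap or name in self._defaultParamMap


model = ParamsSketch()._setDefault(outputNodeIndex=0).set(outputNodeName="z")

# Under these semantics the require() guard would not fire:
assert model.isSet("outputNodeName")
assert not model.isSet("outputNodeIndex")   # default only, never set
assert model.isDefined("outputNodeIndex")   # but it is defined
{code}

The bug reported above is consistent with the Python-side default being pushed to the JVM through an explicit set call rather than a set-default call, which would flip the Scala side's isSet(outputNodeIndex) to true even though the user never set it.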