[ https://issues.apache.org/jira/browse/SPARK-6290?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14511540#comment-14511540 ]
Glenn Weidner commented on SPARK-6290:
--------------------------------------

After synchronizing with the latest from master, I observed (by comparing usage in org.apache.spark.ml.impl.estimator.Predictor.validateAndTransformSchema) that spark.ml.param.Params.checkInputColumn has been replaced with org.apache.spark.ml.util.SchemaUtils.checkColumnType. In addition, the call to Params.getParam has been removed from the require statement.

New version (checkColumnType):
{code}
require(actualDataType.equals(dataType),
  s"Column $colName must be of type $dataType but was actually $actualDataType.")
{code}

Previous version (checkInputColumn):
{code}
require(actualDataType.equals(dataType),
  s"Input column $colName must be of type $dataType" +
    s" but was actually $actualDataType. Column param description: ${getParam(colName)}")
{code}

Since the call to getParam that was causing the issue has been removed, the error can no longer occur. Can this issue be marked resolved, or would it be helpful if I still added a test case?

> spark.ml.param.Params.checkInputColumn bug upon error
> -----------------------------------------------------
>
>                 Key: SPARK-6290
>                 URL: https://issues.apache.org/jira/browse/SPARK-6290
>             Project: Spark
>          Issue Type: Bug
>          Components: ML
>    Affects Versions: 1.3.0
>            Reporter: Joseph K. Bradley
>            Priority: Minor
>
> In checkInputColumn, if the data types do not match, it tries to print an error message containing:
> {code}
> Column param description: ${getParam(colName)}
> {code}
> However, getParam cannot be called on the string colName; it needs the parameter name, which this method is not given. This causes a confusing error which users may find hard to understand.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org