[ https://issues.apache.org/jira/browse/SPARK-7675?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Joseph K. Bradley resolved SPARK-7675.
--------------------------------------
    Resolution: Fixed
 Fix Version/s: 2.0.0

Issue resolved by pull request 9581
[https://github.com/apache/spark/pull/9581]

> PySpark spark.ml Params type conversions
> ----------------------------------------
>
>                 Key: SPARK-7675
>                 URL: https://issues.apache.org/jira/browse/SPARK-7675
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML, PySpark
>            Reporter: Joseph K. Bradley
>            Assignee: holdenk
>            Priority: Minor
>             Fix For: 2.0.0
>
> Currently, the PySpark wrappers for spark.ml Scala classes are brittle when accepting Param types. E.g., Normalizer's "p" param cannot be set to "2" (an integer); it must be set to "2.0" (a float). Fixing this is not trivial, since there does not appear to be a natural place to insert the conversion before the Python wrappers call Java's Params setter method.
> A possible fix would be to add a "_checkType" method to PySpark's Param class which checks the type, raises an error if needed, and converts types when relevant (e.g., int to float, or a scipy matrix to an array). The Java wrapper method which copies params to Scala could call this method when available.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
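To illustrate the proposed fix, here is a minimal, self-contained sketch of what a `_checkType`-style conversion could look like. This is a hypothetical illustration, not the actual PySpark implementation: the `Param`, `Normalizer`, `_check_type`, and `setP` names here are stand-ins, and the real wrapper would hand the converted value to the JVM rather than store it in a dict.

```python
class Param:
    """Hypothetical sketch of a Param that knows its expected type."""

    def __init__(self, name, expected_type):
        self.name = name
        self.expected_type = expected_type

    def _check_type(self, value):
        """Return `value` coerced to the expected type, or raise TypeError."""
        if isinstance(value, self.expected_type):
            return value
        # Widen plain ints to float when a float is expected (excluding bool,
        # which is an int subclass in Python and should not be coerced).
        if (self.expected_type is float
                and isinstance(value, int)
                and not isinstance(value, bool)):
            return float(value)
        raise TypeError(
            "Param %s expected %s, got %s"
            % (self.name, self.expected_type.__name__, type(value).__name__))


class Normalizer:
    """Stand-in for the PySpark wrapper; "p" requires a Double on the Scala side."""

    p = Param("p", float)

    def __init__(self):
        self._params = {}

    def setP(self, value):
        # Conversion happens here, before the value would cross into the JVM.
        self._params["p"] = Normalizer.p._check_type(value)
        return self
```

With this in place, `Normalizer().setP(2)` succeeds and stores `2.0`, while a clearly wrong value such as a string raises a `TypeError` with an informative message instead of failing opaquely inside Py4J.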