[jira] [Commented] (SPARK-14059) Define R wrappers under org.apache.spark.ml.r
[ https://issues.apache.org/jira/browse/SPARK-14059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15238358#comment-15238358 ]

Joseph K. Bradley commented on SPARK-14059:
-------------------------------------------

This task looks complete. Can I resolve it?

> Define R wrappers under org.apache.spark.ml.r
> ---------------------------------------------
>
>                 Key: SPARK-14059
>                 URL: https://issues.apache.org/jira/browse/SPARK-14059
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, SparkR
>    Affects Versions: 1.6.1
>            Reporter: Xiangrui Meng
>            Priority: Minor
>
> Currently, the wrapper files are under .../ml/r but the wrapper classes are
> defined under ...ml.api.r, which doesn't follow package convention. We should
> move all wrappers under ml.r.
> This should happen after we merge the other MLlib/R wrappers, to avoid merge
> conflicts.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-14059) Define R wrappers under org.apache.spark.ml.r
[ https://issues.apache.org/jira/browse/SPARK-14059?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15216212#comment-15216212 ]

Apache Spark commented on SPARK-14059:
--------------------------------------

User 'yanboliang' has created a pull request for this issue:
https://github.com/apache/spark/pull/12039