[ https://issues.apache.org/jira/browse/SPARK-19827?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16717964#comment-16717964 ]
ASF GitHub Bot commented on SPARK-19827:
----------------------------------------

huaxingao commented on a change in pull request #23292: [SPARK-19827][R][FOLLOWUP] spark.ml R API for PIC
URL: https://github.com/apache/spark/pull/23292#discussion_r240782240

##########
File path: R/pkg/R/mllib_fpm.R
##########

@@ -183,8 +183,8 @@ setMethod("write.ml", signature(object = "FPGrowthModel", path = "character"),
 #' @return A complete set of frequent sequential patterns in the input sequences of itemsets.
 #'         The returned \code{SparkDataFrame} contains columns of sequence and corresponding
 #'         frequency. The schema of it will be:
-#'         \code{sequence: ArrayType(ArrayType(T))} (T is the item type)
-#'         \code{freq: Long}
+#'         \code{sequence: ArrayType(ArrayType(T))}, \code{freq: integer}
+#'         where T is the item type

Review comment:
I know there is a problem here too, but the one that needs to be fixed is on line 637 of R/pkg/R/mllib_clustering.R. I will open a separate follow-up PR to fix the mllib_fpm.R problem. Sorry for the mess.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at: us...@infra.apache.org

> spark.ml R API for PIC
> ----------------------
>
>                 Key: SPARK-19827
>                 URL: https://issues.apache.org/jira/browse/SPARK-19827
>             Project: Spark
>          Issue Type: Sub-task
>          Components: ML, SparkR
>    Affects Versions: 2.1.0
>            Reporter: Felix Cheung
>            Assignee: Huaxin Gao
>            Priority: Major
>             Fix For: 3.0.0

--
This message was sent by Atlassian JIRA (v7.6.3#76005)
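For context, the roxygen block under discussion documents SparkR's PrefixSpan wrapper, `spark.findFrequentSequentialPatterns`. A minimal usage sketch, assuming a running Spark session (3.0 or later); the toy data and parameter values here are illustrative, not taken from the thread, and the exact SQL type of the `freq` column is precisely what the doc fix above is debating, so it is left unstated:

```r
library(SparkR)
sparkR.session()

# Toy input: each row holds one sequence, i.e. a list of itemsets
# (each itemset is a list of integer items).
df <- createDataFrame(list(list(list(list(1L, 2L), list(3L))),
                           list(list(list(1L), list(3L, 2L), list(1L, 2L))),
                           list(list(list(1L, 2L), list(5L))),
                           list(list(list(6L)))),
                      schema = c("sequence"))

# Mine frequent sequential patterns; the result schema is the one the
# roxygen @return block in the diff tries to describe:
# a sequence column (ArrayType(ArrayType(T))) and a freq column.
patterns <- spark.findFrequentSequentialPatterns(df, minSupport = 0.5,
                                                 maxPatternLength = 5L,
                                                 maxLocalProjDBSize = 32000000L)
head(patterns)
```

Running `printSchema(patterns)` would show the actual Spark SQL type of `freq`, which is the quickest way to settle which wording the R docs should carry.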