[ https://issues.apache.org/jira/browse/SPARK-13905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15200938#comment-15200938 ]
Sun Rui commented on SPARK-13905:
---------------------------------
This issue is not related to as.data.frame() in SparkR; it seems to be due to the
DataFrame naming conflict between the S4Vectors package and SparkR. However, it is
still valuable to change the signature of as.data.frame() to be consistent with
the one in the R base package. I am renaming the title of this JIRA issue
accordingly.
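For example, the SparkR method could take the same formal arguments as
base::as.data.frame(x, row.names = NULL, optional = FALSE, ...). A rough sketch
of that shape (illustrative only, not an actual patch; assumes SparkR, which
defines the DataFrame class and collect(), is attached):
{code}
# Sketch only: reuse the formals of base::as.data.frame() so the SparkR
# method stays consistent with how the base function is called.
setGeneric("as.data.frame")   # implicit generic keeps base's argument list
setMethod("as.data.frame", signature(x = "DataFrame"),
          function(x, row.names = NULL, optional = FALSE, ...) {
            # collect() already returns a local data.frame; the extra
            # arguments are accepted here only for signature compatibility
            collect(x)
          })
{code}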
> Change implementation of as.data.frame() to avoid conflict with the ones in
> the R base package
> ----------------------------------------------------------------------------------------------
>
> Key: SPARK-13905
> URL: https://issues.apache.org/jira/browse/SPARK-13905
> Project: Spark
> Issue Type: Improvement
> Components: SparkR
> Affects Versions: 1.6.1
> Reporter: Sun Rui
>
> SparkR provides a method as.data.frame() to collect a SparkR DataFrame into a
> local data.frame, but it conflicts with the method of the same name in the R
> base package.
> For example,
> {code}
> countData <- matrix(1:100, ncol = 4)
> condition <- factor(c("A", "A", "B", "B"))
> dds <- DESeqDataSetFromMatrix(countData, DataFrame(condition), ~ condition)
> {code}
> This works if I don't initialize the SparkR environment. If I run
> library(SparkR) and sqlContext <- sparkRSQL.init(sc), it gives the following
> error:
> {code}
> > dds <- DESeqDataSetFromMatrix(countData, as.data.frame(condition), ~ condition)
> Error in DataFrame(colData, row.names = rownames(colData)) :
>   cannot coerce class "data.frame" to a DataFrame
> {code}
> The implementation of as.data.frame() in SparkR can be improved to avoid this
> conflict with the one in the R base package.
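A quick way to confirm the masking once SparkR is attached is to check which
packages on the search path define the name; the output below is illustrative,
not captured from the reporter's session:
{code}
library(SparkR)
find("as.data.frame")
# e.g. [1] "package:SparkR" "package:base"
{code}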