[ 
https://issues.apache.org/jira/browse/SPARK-12148?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15208272#comment-15208272
 ] 

Sun Rui edited comment on SPARK-12148 at 3/23/16 11:25 AM:
-----------------------------------------------------------

Since R is a dynamically typed language (there is no need to declare a variable's type), and SparkR users do not instantiate DataFrame directly but obtain one through factory methods such as createDataFrame() and read.df(), renaming DataFrame to SparkDataFrame is largely an internal change and should have little impact on existing SparkR client code. The exception is client code that checks the class name "DataFrame" somewhere; there is generally no need to do so, but any code that does would be broken by this change.
[~shivaram], [~rxin], do you agree that we should make this change for Spark 2.0?
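To illustrate the one breaking case, here is a plain-R sketch (no Spark required) of why a hard-coded class-name check is fragile under the rename; the setClass() call merely stands in for the S4 class that SparkR would define after the change:

```r
library(methods)

# Stand-in for the renamed SparkR S4 class (illustration only,
# not the real SparkR class definition).
setClass("SparkDataFrame", slots = c(sdf = "character"))
df <- new("SparkDataFrame", sdf = "dummy")

# Fragile client code: a string comparison against the old class
# name silently becomes FALSE after the rename.
class(df) == "DataFrame"

# Robust alternative: test class membership with is(), which keeps
# working as long as the code uses the current class name.
is(df, "SparkDataFrame")
```

Client code that relies on method dispatch or is() rather than comparing class(df) against a string literal would be unaffected by the rename.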


was (Author: sunrui):
Since R is a dynamically typed language (there is no need to declare a variable's type), renaming DataFrame to SparkDataFrame is largely an internal change and should have little impact on existing SparkR client code. The exception is client code that checks the class name "DataFrame" somewhere; there is generally no need to do so, but any code that does would be broken by this change.
[~shivaram], [~rxin], do you agree that we should make this change for Spark 2.0?

> SparkR: rename DataFrame to SparkDataFrame
> ------------------------------------------
>
>                 Key: SPARK-12148
>                 URL: https://issues.apache.org/jira/browse/SPARK-12148
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SparkR
>            Reporter: Michael Lawrence
>
> The SparkR package represents a Spark DataFrame with the class "DataFrame". 
> That conflicts with the more general DataFrame class defined in the S4Vectors 
> package. Would it not be more appropriate to use the name "SparkDataFrame" 
> instead?



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
