[ https://issues.apache.org/jira/browse/SPARK-18823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15810634#comment-15810634 ]
Felix Cheung commented on SPARK-18823:
--------------------------------------
I think, to Shivaram's point, this is a bit tricky since we are making the
assumption that the column data can fit in the memory of a single node (the one
where the R client is running). Even then, we would need to handle a potentially
large amount of data to serialize and distribute, and so on.
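To make the single-node assumption concrete, here is a rough sketch of a
hypothetical workaround (not a proposed implementation): attaching an arbitrary
local R vector as a column today effectively means round-tripping the data
through the driver, which only works when it fits in driver memory. The column
name "row_id" and the session setup below are illustrative assumptions.

    library(SparkR)
    sparkR.session()                      # assumes a local Spark session

    sdf <- createDataFrame(faithful)

    # Collect to the driver, assign the column with base-R [[<-, then
    # re-distribute. The whole dataset must fit in the memory of the node
    # running the R client.
    local_df <- collect(sdf)
    myname <- "row_id"                    # hypothetical new column name
    local_df[[myname]] <- seq_len(nrow(local_df))
    sdf <- createDataFrame(local_df)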
> Assignation by column name variable not available or bug?
> ---------------------------------------------------------
>
> Key: SPARK-18823
> URL: https://issues.apache.org/jira/browse/SPARK-18823
> Project: Spark
> Issue Type: Question
> Components: SparkR
> Affects Versions: 2.0.2
> Environment: RStudio Server on EC2 instances (AWS EMR service, EMR 4), or
> Databricks (community.cloud.databricks.com).
> Reporter: Vicente Masip
> Original Estimate: 24h
> Remaining Estimate: 24h
>
> I really don't know if this is a bug or whether it can be done with some function:
> Sometimes it is very important to assign something to a column whose name has to
> be accessed through a variable. Outside of SparkR, I have always done this with
> double brackets, like so:
> # df could be the faithful dataset as a regular data frame or data table.
> # accessing by variable name:
> myname = "waiting"
> df[[myname]] <- c(1:nrow(df))
> # or even column number
> df[[2]] <- df$eruptions
> The error is not caused by the right-hand side of the "<-" assignment operator.
> The problem is that I can't assign to a column using a name held in a variable, or
> a column number, as I do in these examples outside of Spark. It doesn't matter
> whether I am modifying or creating the column; same problem.
> I have also tried this, with no results:
> df2 <- withColumn(df, "tmp", df$eruptions)
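For reference, a minimal SparkR sketch of the string-based route; it assumes a
running Spark session and the built-in faithful dataset, and note that the
right-hand side has to be a Column expression of the same SparkDataFrame (or a
literal via lit()), not an arbitrary local R vector. The names "tmp" and "flag"
are illustrative.

    library(SparkR)
    sparkR.session()                      # assumes a local Spark session

    sdf <- createDataFrame(faithful)

    # withColumn takes the target column name as a plain string, so a variable
    # holding the name can be passed directly:
    myname <- "tmp"
    sdf2 <- withColumn(sdf, myname, sdf$eruptions)

    # A constant column via lit(); whether withColumn replaces an existing
    # column of the same name may depend on the Spark version.
    sdf3 <- withColumn(sdf, "flag", lit(1))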