[ https://issues.apache.org/jira/browse/SPARK-6837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-6837:
-----------------------------------

    Assignee: Apache Spark

> SparkR failure in processClosure
> --------------------------------
>
>                 Key: SPARK-6837
>                 URL: https://issues.apache.org/jira/browse/SPARK-6837
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.4.0
>            Reporter: Shivaram Venkataraman
>            Assignee: Apache Spark
>
> Sorry, another one I can't reproduce in straight SparkR.
> This is a typical plyrmr example:
> as.data.frame(gapply(input(mtcars), identity))
> Error in get(nodeChar, envir = func.env, inherits = FALSE) :
>   argument "..." is missing, with no default
> Stack trace below. This may have appeared after the introduction of the new
> way of serializing closures. Using ... in a function alone doesn't reproduce
> the error, so I thought I'd file this to hear what you guys think while I try
> to isolate it better.
> > traceback()
> 28: get(nodeChar, envir = func.env, inherits = FALSE) at utils.R#363
> 27: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 26: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 25: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 24: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#324
> 23: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#312
> 22: processClosure(func.body, oldEnv, defVars, checkedFuncs, newEnv) at utils.R#417
> 21: cleanClosure(obj, checkedFuncs) at utils.R#381
> 20: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 19: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 18: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 17: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#312
> 16: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 15: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#312
> 14: processClosure(func.body, oldEnv, defVars, checkedFuncs, newEnv) at utils.R#417
> 13: cleanClosure(obj, checkedFuncs) at utils.R#381
> 12: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#339
> 11: processClosure(node[[i]], oldEnv, defVars, checkedFuncs, newEnv) at utils.R#312
> 10: processClosure(func.body, oldEnv, defVars, checkedFuncs, newEnv) at utils.R#417
> 9: cleanClosure(FUN) at RDD.R#532
> 8: lapplyPartitionsWithIndex(X, function(s, part) { FUN(part) }) at generics.R#76
> 7: lapplyPartitionsWithIndex(X, function(s, part) { FUN(part) }) at RDD.R#499
> 6: SparkR::lapplyPartition(rdd, f) at generics.R#66
> 5: SparkR::lapplyPartition(rdd, f)
> 4: as.pipespark(if (is.grouped(.data)) {
>      if (is.mergeable(.f)) SparkR::lapplyPartition(SparkR::reduceByKey(rdd, f.reduce, 2L), f)
>      else SparkR::lapplyPartition(SparkR::groupByKey(rdd, 2L), f)
>    } else SparkR::lapplyPartition(rdd, f), grouped = is.grouped(.data)) at pipespark.R#119
> 3: gapply.pipespark(input(mtcars), identity) at pipe.R#107
> 2: gapply(input(mtcars), identity)
> 1: as.data.frame(gapply(input(mtcars), identity))
> >
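> The error at frame 28 is consistent with how base R's get() behaves when it
> reaches a "..." binding that holds no arguments. A purely illustrative sketch
> in plain R (outer/f/env are made-up names, not the plyrmr code path):
> # outer() has "..." in its formals but is called with no extra arguments, so
> # the closure it returns captures an environment where "..." is bound but empty.
> outer <- function(...) function(x) x
> f <- outer()
> env <- environment(f)
> exists("...", envir = env, inherits = FALSE)
> # [1] TRUE
> get("...", envir = env, inherits = FALSE)
> # Error in get("...", envir = env, inherits = FALSE) :
> #   argument "..." is missing, with no default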
> There is a proposed fix for this at 
> https://github.com/amplab-extras/SparkR-pkg/pull/229
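> I haven't verified what that PR changes; as a purely hypothetical sketch of a
> defensive shape (captureBinding is a made-up helper, not SparkR code), the
> traversal could avoid calling get() on the "..." symbol and tolerate missing
> bindings instead of erroring:
> # Hypothetical helper: skip "..." entirely and fall back to NULL when a
> # binding exists but turns out to be a missing argument.
> captureBinding <- function(nodeChar, func.env) {
>   if (identical(nodeChar, "...") ||
>       !exists(nodeChar, envir = func.env, inherits = FALSE)) {
>     return(NULL)
>   }
>   tryCatch(get(nodeChar, envir = func.env, inherits = FALSE),
>            error = function(e) NULL)
> }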


