falaki commented on a change in pull request #26429: [SPARK-29777][SparkR]
SparkR::cleanClosure aggressively removes a function required by user function
URL: https://github.com/apache/spark/pull/26429#discussion_r346133031
##########
File path: R/pkg/R/utils.R
##########
@@ -546,8 +546,11 @@ processClosure <- function(node, oldEnv, defVars, checkedFuncs, newEnv) {
ifelse(identical(func, obj), TRUE, FALSE)
})
if (sum(found) > 0) {
- # If function has been examined, ignore.
- break
+ # If function has been examined
+        if (identical(parent.env(environment(funcList[found][[1]])), func.env)) {
Review comment:
Yes, but note that such a function will hit a `stack overflow` during
execution anyway, so the code is not runnable in practice. Also, SparkR does
not expose `cleanClosure` as public API. If a user passes such a function,
`a()`, to `dapply()`, `gapply()`, or `spark.lapply()`, they will get
`Error: node stack overflow` on the workers anyway.
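To make the scenario concrete, here is a minimal sketch (my own illustrative example, not code from the PR) of the kind of unbounded mutual recursion being discussed; with no base case, evaluating `a()` exhausts R's call stack regardless of how `cleanClosure` serializes it:

```r
# Hypothetical user functions (assumed for illustration): each calls the
# other with no terminating condition, so any invocation recurses forever.
a <- function() b()
b <- function() a()

# Calling a() directly fails locally with "Error: ... stack overflow";
# passing it to dapply()/gapply()/spark.lapply() fails the same way on
# the workers, e.g.:
#   spark.lapply(1:2, function(x) a())
```

The point being made is that capturing such a closure correctly does not make the user's program work; the recursion itself is the bug.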