[ https://issues.apache.org/jira/browse/SPARK-29777?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-29777:
----------------------------------
    Affects Version/s: 2.3.4

> SparkR::cleanClosure aggressively removes a function required by user function
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-29777
>                 URL: https://issues.apache.org/jira/browse/SPARK-29777
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.3.4, 2.4.4
>            Reporter: Hossein Falaki
>            Assignee: Hossein Falaki
>            Priority: Major
>             Fix For: 3.0.0
>
>
> The following code block reproduces the issue:
> {code}
> df <- createDataFrame(data.frame(x=1))
> f1 <- function(x) x + 1
> f2 <- function(x) f1(x) + 2
> dapplyCollect(df, function(x) { f1(x); f2(x) })
> {code}
> We get the following error message:
> {code}
> org.apache.spark.SparkException: R computation failed with
> Error in f1(x) : could not find function "f1"
> Calls: compute -> computeFunc -> f2
> {code}
> Compare that to this code block, which succeeds:
> {code}
> dapplyCollect(df, function(x) { f2(x) })
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
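A note on the reproduction above: the failure occurs because the helpers `f1` and `f2` are free variables of the user function, and the closure-capture step can prune a helper that is only reached transitively. One workaround sketch (an assumption about usage, not the official fix shipped in 3.0.0) is to define the helpers inside the function passed to `dapplyCollect`, so they are local variables and are serialized with the closure unconditionally:

{code}
# Workaround sketch: nest the helper definitions inside the user function
# so closure capture sees them as locals, not prunable free variables.
f <- function(x) {
  f1 <- function(x) x + 1        # defined locally, always serialized
  f2 <- function(x) f1(x) + 2    # can call f1 via lexical scoping
  f1(x)
  f2(x)
}

# Plain-R check of the intended semantics (no Spark session needed):
f(1)   # f2(1) = (1 + 1) + 2 = 4
{code}

With this shape, `dapplyCollect(df, f)` does not depend on `cleanClosure` retaining a transitively referenced global.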