Github user yanboliang commented on a diff in the pull request:
https://github.com/apache/spark/pull/17274#discussion_r105818725
--- Diff: R/pkg/inst/tests/testthat/test_context.R ---
@@ -177,6 +177,13 @@ test_that("add and get file to be downloaded with Spark job on every node", {
spark.addFile(path)
download_path <- spark.getSparkFiles(filename)
expect_equal(readLines(download_path), words)
+
+ # Test spark.getSparkFiles works well on executors.
+ seq <- seq(from = 1, to = 10, length.out = 5)
+ f <- function(seq) { readLines(spark.getSparkFiles(filename)) }
+ results <- spark.lapply(seq, f)
+ for (i in 1:5) { expect_equal(results[[i]], words) }
+
--- End diff ---
```
Failed -------------------------------------------------------------------------
1. Error: add and get file to be downloaded with Spark job on every node (@test_context.R#184)
org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 1 times, most recent failure: Lost task 2.0 in stage 0.0 (TID 2, localhost, executor driver): org.apache.spark.SparkException: R computation failed with
[1] 3
[1] 2
[1] 3
[1][1] 1 1
[1] 2
[1] 2
[1] 2
[1] 2
[1] 2
[1] 2
[1] 2
cannot open the connection
In addition: Warning message:
In file(con, "r") :
cannot open file '/tmp/spark-82bf379c-0f31-47c0-8ac6-6b764c3cfc90/userFiles-1327d8ac-889e-4861-8263-4f270697fb85/hello221b332ef87f.txt': No such file or directory
```
The error is weird, since the same code passes if I paste it into the SparkR console. It also passes if I put it in a separate script and submit it with ```bin/spark-submit``` (a sketch of that script is below). Any thoughts? cc @felixcheung @shivaram
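Roughly the standalone variant I mean, as a sketch (the temp file, its contents, and the ```local[*]``` master are placeholder assumptions on my side, not code from this PR):
```
library(SparkR)
sparkR.session(master = "local[*]")

# Create a small local file and register it with the session.
# The file name and contents here are placeholders.
path <- tempfile(pattern = "hello", fileext = ".txt")
words <- c("Hello", "Spark")
writeLines(words, path)
filename <- basename(path)
spark.addFile(path)

# Resolve the file on the driver.
stopifnot(identical(readLines(spark.getSparkFiles(filename)), words))

# Resolve the file on executors via spark.lapply.
results <- spark.lapply(1:5, function(i) readLines(spark.getSparkFiles(filename)))
for (res in results) stopifnot(identical(res, words))

sparkR.session.stop()
```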