Thanks a lot for the answer! It solved my problem.
Yes, the script needs to be present on all the executor nodes.
You can ship it via spark-submit (e.g. --files script.sh) and then
refer to it by a relative path (e.g. "./script.sh") in rdd.pipe.
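To see why the relative path works: --files copies the script into each task's working directory, and rdd.pipe launches the command from that directory, streaming each partition element through the process's stdin/stdout. Here is a minimal plain-Python sketch (no Spark required) that simulates this mechanism with a temporary working directory and a hypothetical script.sh; the names are illustrative, not Spark internals.

```python
import os
import stat
import subprocess
import tempfile

# Simulate what --files does: place script.sh in the task's
# working directory on an executor.
workdir = tempfile.mkdtemp()
script = os.path.join(workdir, "script.sh")
with open(script, "w") as f:
    f.write('#!/bin/sh\nwhile read line; do echo "got:$line"; done\n')
os.chmod(script, os.stat(script).st_mode | stat.S_IEXEC)

# rdd.pipe("./script.sh") resolves the command relative to that
# working directory and pipes partition elements through stdin/stdout.
out = subprocess.run(
    ["./script.sh"], cwd=workdir,
    input="a\nb\n", capture_output=True, text=True,
)
print(out.stdout)  # each input line comes back prefixed with "got:"
```

If the script is not in that working directory (i.e. it was never shipped to the executor), the launch fails with exactly the "error=2, No such file or directory" you are seeing.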
- Arun
On Thu, 17 Jan 2019 at 14:18, Mkal wrote:
Hi, I'm trying to run an external script on Spark using rdd.pipe(), and
although it runs successfully in standalone mode, it throws an error on the
cluster. The error comes from the executors: "Cannot run program
"path/to/program": error=2, No such file or directory".
Does the external script need to be present on all the worker nodes?