How to get the working directory in executor

2016-01-13 Thread Byron Wang
I am using the following command to submit a Spark job. I would like to ship the jar and config files to each executor and load them there:

  spark-submit --verbose \
    --files=/tmp/metrics.properties \
    --jars /tmp/datainsights-metrics-source-assembly-1.0.jar \
    --total-executor-cores 4 \
    --conf …
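(Not from the thread, but for context on the executor side: files shipped with --files are copied into each executor's working directory, and Spark's SparkFiles.get resolves their absolute path on that node. A minimal Scala sketch of that, with an arbitrary app name and partition count:

  import java.io.FileInputStream
  import java.util.Properties
  import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

  object WorkingDirExample {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("working-dir-example"))
      // Run one task per partition; each task resolves the file in its own
      // executor's working directory.
      val paths = sc.parallelize(1 to 4, 4).map { _ =>
        val propsPath = SparkFiles.get("metrics.properties") // absolute path on that node
        val props = new Properties()
        props.load(new FileInputStream(propsPath))
        propsPath
      }.collect()
      paths.foreach(println)
      sc.stop()
    }
  }
)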

Re: How to get the working directory in executor

2016-01-13 Thread Ted Yu
Can you place metrics.properties and datainsights-metrics-source-assembly-1.0.jar on HDFS? Cheers

Re: How to get the working directory in executor

2016-01-13 Thread Ted Yu
In a bit more detail: upload the files with the 'hdfs dfs -copyFromLocal' command, then point spark-submit at the HDFS location of the files. Cheers
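(As a rough illustration of those two steps, assuming the thread's file names; the /user/byron destination directory is just a placeholder:

  # 1. copy the local files into HDFS
  hdfs dfs -copyFromLocal /tmp/metrics.properties /user/byron/
  hdfs dfs -copyFromLocal /tmp/datainsights-metrics-source-assembly-1.0.jar /user/byron/

  # 2. reference the HDFS locations when submitting
  spark-submit --verbose \
    --files hdfs:///user/byron/metrics.properties \
    --jars hdfs:///user/byron/datainsights-metrics-source-assembly-1.0.jar \
    --total-executor-cores 4 \
    ...
)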