Hi users,
I'm a bit of a newbie to Spark infrastructure, and I have a small question.
I have a Maven project with plugins generated separately into a folder, and
the normal java command to run it is as follows:
`java -Dpf4j.pluginsDir=./plugins -jar /path/to/jar`

Now, when I run this program locally with spark-submit against the standalone
cluster (client mode, not cluster mode), the plugins live in a "plugins"
folder under $SPARK_HOME and they are recognised.
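For reference, the working client-mode invocation looks roughly like this
(the master URL and main class below are placeholders, not my real values):

    spark-submit \
      --master spark://<master-host>:7077 \
      --deploy-mode client \
      --driver-java-options "-Dpf4j.pluginsDir=./plugins" \
      --class com.example.Main \
      /path/to/jar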
The same is not the case in cluster mode: it fails saying the extension point
is not loaded. Please advise on how I can create a "plugins" folder that can
be shared among the workers.
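In case it helps, this is roughly what I was planning to try: zip up the
plugins, ship the archive with the job, and point pf4j at the unpacked folder
in the driver/executor working directory. The archive name is my own, and I'm
not sure whether --archives is honoured by the standalone master the way it
is on YARN, so please correct me if this is the wrong direction:

    # package the plugins folder next to the application jar
    zip -r plugins.zip plugins

    spark-submit \
      --master spark://<master-host>:7077 \
      --deploy-mode cluster \
      --archives plugins.zip#plugins \
      --conf "spark.driver.extraJavaOptions=-Dpf4j.pluginsDir=./plugins" \
      --class com.example.Main \
      /path/to/jar

Does `--archives` (or `spark.archives`) work on a standalone cluster, or is
copying the folder to the same absolute path on every worker the only option?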

PS: HDFS is not an option, as we don't have a separate setup for it.

Thanks.


*Regards*
  Shashanka Balakuntala Srinivasa
