Hi, 

This is about Spark 0.9.
I have a 3-node Spark cluster. I want to add a locally available jar file
(present on all nodes) to the SPARK_CLASSPATH variable in
/etc/spark/conf/spark-env.sh so that all nodes can access it.

My question is:
should I edit spark-env.sh on all nodes to add the jar?
Or is it enough to add it only on the master node, from where I am
submitting jobs?
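
For context, the line I plan to add looks roughly like this (the jar path
below is just a placeholder for my actual library):

    # in /etc/spark/conf/spark-env.sh
    # append the locally available jar to the classpath used by Spark daemons
    export SPARK_CLASSPATH="$SPARK_CLASSPATH:/path/to/mylib.jar"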

thanks
Santhosh



