Thanks. I hope this problem will go away once I upgrade to Spark 1.0, where we
can ship cluster-wide classpaths using the spark-submit command.
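To illustrate what that upgrade buys, here is a hedged sketch of a Spark 1.0 spark-submit invocation that distributes a jar to the cluster with --jars instead of per-node SPARK_CLASSPATH edits. The application jar, main class, and master URL below are placeholders, not from this thread:

```shell
# Sketch only: --jars takes a comma-separated list of local jars, which
# spark-submit copies to the driver and every executor for this job,
# so no spark-env.sh edit is needed on the worker nodes.
# myapp.jar, com.example.Main and the master URL are assumed placeholders.
spark-submit \
  --class com.example.Main \
  --master spark://master:7077 \
  --jars /opt/libs/extra-lib.jar \
  myapp.jar
```

The key difference from the Spark 0.9 approach below is that the classpath travels with the job rather than living in each node's configuration.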
By the way, any idea how to sync the Spark config dir with the other nodes in
the cluster?
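One common way to keep the config dir in sync, sketched here under the assumption of passwordless SSH from the master and the hostnames being placeholders:

```shell
# Sketch only: push the master's Spark conf dir to each worker with rsync.
# worker1/worker2 are assumed hostnames; adjust the list and paths to taste.
for host in worker1 worker2; do
  # -a preserves permissions/timestamps, -v lists the files copied;
  # the trailing slash on the source copies the dir's contents, not the dir.
  rsync -av /etc/spark/conf/ "$host":/etc/spark/conf/
done
```

Tools like pdsh, Ansible, or a shared NFS mount for the conf dir are other common options for the same problem.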
~santhosh
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/question-about-setting-SPARK-CLASSPATH-IN-spark-env-sh-tp7809p7853.html
Sent from the Apache Spark User List
Hi,
This is about Spark 0.9.
I have a 3-node Spark cluster. I want to add a locally available jar file
(present on all nodes) to the SPARK_CLASSPATH variable in
/etc/spark/conf/spark-env.sh so that all nodes can access it.
The question is: should I edit spark-env.sh on all nodes to add the jar?
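For reference, a hedged sketch of the spark-env.sh edit being asked about; the jar path is an assumed placeholder:

```shell
# Sketch only: in Spark 0.9, SPARK_CLASSPATH in spark-env.sh is read locally
# by each daemon when it starts, so this same line must be present in
# spark-env.sh on every node, and the jar must exist at the same path there.
export SPARK_CLASSPATH="$SPARK_CLASSPATH:/opt/libs/extra-lib.jar"
```

In other words: yes, with this mechanism the edit has to be made (or synced) on all nodes, which is exactly the pain point spark-submit in 1.0 addresses.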