We solved this by modifying the spark-class script. At the bottom, just
before the exec statement, we intercepted the command array that had been
constructed and injected our additional classpath entries:
for ((i=0; i<${#CMD[@]}; i++)); do
  if [[ ${CMD[$i]} == *"$SPARK_ASSEMBLY_JAR"* ]]; then
    # Append our extra entries to the classpath element that carries the
    # assembly jar; EXTRA_CLASSPATH is our own variable, not a Spark one.
    CMD[$i]="${CMD[$i]}:$EXTRA_CLASSPATH"
  fi
done
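For reference, a minimal sketch of how the extra entries can be supplied:
spark-class sources conf/spark-env.sh (via load-spark-env.sh), so exporting
the variable there is enough. The jar path below is a hypothetical example.

# conf/spark-env.sh -- example path only; point this at your EmrFS jars
export EXTRA_CLASSPATH="/usr/share/aws/emr/emrfs/lib/*"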
Hi,
Thanks for the suggestion -- but those classpath config options only affect
the driver and executor processes, not the standalone-mode daemons (master
and slave). Incidentally, we already have the extra jars we need set there.
I went through the docs but couldn't find a place to set extra classpath
entries for the daemons.
Have you tried using spark.driver.extraClassPath and
spark.executor.extraClassPath?
AFAICT these config options replace SPARK_CLASSPATH. Further info in the
docs. I've had good luck with these options, and for ease of use I just set
them in the spark defaults config.
https://spark.apache.org/do
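For reference, a minimal sketch of what I put in conf/spark-defaults.conf
(the jar path is a placeholder):

spark.driver.extraClassPath    /path/to/extra.jar
spark.executor.extraClassPath  /path/to/extra.jar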
Hi,
We are running a Spark Standalone cluster on EMR (note: not using YARN) and
are trying to use S3 with EmrFS as our event-logging directory.
We are having difficulties with a ClassNotFoundException on EmrFileSystem
when we navigate to the event log screen. This is to be expected, as the
EmrFS jar is not on the classpath of the standalone daemons.
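For reference, our event-logging setup in conf/spark-defaults.conf looks
roughly like this (the bucket name is a placeholder):

spark.eventLog.enabled  true
spark.eventLog.dir      s3://our-bucket/spark-events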