Hello everyone,

We're having a serious issue: we get a ClassNotFoundException because,
apparently, the class cannot be found on Spark's classpath, on both the
driver and the executors.

First, I checked with jar tf whether the class was actually inside the
jar, and it is there (the check is shown further below). Then, I enabled
the following options to see which classes are actually loaded:

--conf 'spark.driver.extraJavaOptions=-verbose:class'
--conf 'spark.executor.extraJavaOptions=-verbose:class'

From the YARN stdout logs I can see that some classes, including the one
triggering the exception, are never loaded, while other classes from the
same jar are.
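For reference, here is roughly how I checked both points; the jar name,
class name, and application id below are placeholders, and the 'Loaded'
lines are what -verbose:class prints on a HotSpot JVM:

jar tf myapp.jar | grep MyMissingClass
yarn logs -applicationId application_1234567890123_0001 | grep 'Loaded com.example.MyMissingClass'
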
I then tried passing the jar containing the missing classes with --jars,
and also calling addJar() on the SparkContext, to no avail.
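For concreteness, the two attempts looked roughly like this (paths and
class names are placeholders):

spark-submit --master yarn \
  --jars /path/to/extra-classes.jar \
  --class com.example.Main \
  myapp.jar

and, from the driver code:

sc.addJar("/path/to/extra-classes.jar")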

This looks like an issue with Spark's class loader.

Any idea what's happening here? I'm using Spark 2.3.1.3.0.0.0-1634
(HDP 3.0).

Thank you for your help,
Federico
