Maybe that class is already loaded as part of one of Spark's core libraries?

Do you have concrete class names?
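
For a concrete class name, a quick check in spark-shell shows which jar (if any) the class is currently being loaded from; the class name below is just a placeholder:

    // run in spark-shell on the cluster; replace with the class from your stack trace
    val cls = Class.forName("com.example.SomeMissingClass")
    // getCodeSource can be null for classes loaded by the bootstrap class loader
    println(Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation))

If it resolves to a jar shipped with Spark or HDP rather than your own, you are looking at a version clash rather than a genuinely missing class.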

If in doubt, create a fat jar and shade the dependencies in question.
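
If you use sbt, a minimal sketch of shading with the sbt-assembly plugin could look like the following (package names and the plugin version are only examples, not taken from your project):

    // project/plugins.sbt (pick a current sbt-assembly version)
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.2.0")

    // build.sbt: relocate the conflicting packages inside the fat jar
    assembly / assemblyShadeRules := Seq(
      ShadeRule.rename("com.example.conflicting.**" -> "shaded.com.example.conflicting.@1").inAll
    )

Then spark-submit the assembled jar so the shaded copies travel with your application, instead of relying on --jars.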

> On 18.03.2019, at 12:34, Federico D'Ambrosio <fedex...@gmail.com> wrote:
> 
> Hello everyone,
> 
> We're having a serious issue where we get a ClassNotFoundException because, 
> apparently, the class is not found on Spark's classpath, on both the 
> driver and the executors.
> 
> First, I checked with jar tf whether the class was actually inside the jar, 
> and it is. Then, I enabled the following options to see which classes are 
> actually being loaded:
> 
> --conf 'spark.driver.extraJavaOptions=-verbose:class' --conf 
> 'spark.executor.extraJavaOptions=-verbose:class' 
> 
> and I can see from the YARN stdout logs that some classes, including the one 
> throwing the exception, are not actually being loaded, while other classes are.
> I then tried using --jars to pass the jar containing the missing classes, 
> and also calling addJar() on the SparkContext, to no avail.
> 
> This looks like an issue with Spark's class loader.
> 
> Any idea about what's happening here? I'm using Spark 2.3.1.3.0.0.0-1634 (HDP 
> 3.0).
> 
> Thank you for your help,
> Federico
