On 16 May 2015, at 04:39, Anton Brazhnyk 
<anton.brazh...@genesys.com> wrote:

For me it wouldn’t help, I guess, because those newer classes would still be
loaded by a different classloader.
What did work for me with 1.3.1 was removing those classes from Spark’s jar
completely, so they get loaded from the external Guava (the version I prefer) and
by the classloader I expect.
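
The approach above (a jar is just a zip archive, so the bundled com/google/common/
entries can be stripped out and the external Guava takes over) can be sketched as
follows. This is a minimal illustration on a throwaway stand-in jar, not the real
Spark assembly; the file names are hypothetical.

```python
# Sketch (hypothetical paths): strip the bundled Guava entries from an
# assembly jar so the external Guava on the classpath wins. A jar is a
# zip archive, so we rewrite it without the com/google/common/ entries.
import zipfile

def strip_guava(src_jar, dst_jar):
    with zipfile.ZipFile(src_jar) as src, \
         zipfile.ZipFile(dst_jar, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            # Skip every class that lives under Guava's package root.
            if not info.filename.startswith("com/google/common/"):
                dst.writestr(info, src.read(info.filename))

# Build a throwaway stand-in "assembly" jar for demonstration only.
with zipfile.ZipFile("assembly.jar", "w") as z:
    z.writestr("com/google/common/base/Joiner.class", b"stub")
    z.writestr("org/apache/spark/SparkContext.class", b"stub")

strip_guava("assembly.jar", "assembly-noguava.jar")
with zipfile.ZipFile("assembly-noguava.jar") as z:
    print(z.namelist())  # Spark class kept, Guava entries gone
```

In a real deployment you would do the equivalent with `zip -d` on the Spark
assembly jar and put your preferred Guava jar on the classpath ahead of it.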


Note that Hadoop <= 2.6.0 won't work with Guava >= 17.0; see: HADOOP-11032

FWIW, Guava is a versioning nightmare across the Hadoop stack; almost as bad as
protobuf.jar. With Hadoop 2.7+, Hadoop will run on later Guava versions; it'll just
continue to ship an older one to avoid breaking apps that expect it.
