Could you share which cluster manager you're using and exactly
where the error shows up (driver or executor)?

A quick look reveals that Standalone and YARN use different options to
control this, for example. (That alone may already be worth filing as a bug.)
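For reference, a sketch of the two settings as they exist in the Spark 1.x line (option names are experimental and may change between releases, so check the docs for your version):

```properties
# Standalone/Mesos: experimental executor-side flag in spark-defaults.conf
# (it affects executors only, not the driver's classpath)
spark.files.userClassPathFirst    true

# YARN: a separate, YARN-specific flag controls the same behavior
spark.yarn.user.classpath.first   true
```

If you're on YARN, setting only the first option would explain why the conflicting commons-pool classes still win.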

On Mon, Aug 11, 2014 at 12:24 PM, DNoteboom <dan...@wibidata.com> wrote:
> Currently my code uses commons-pool version 1.6, but Spark uses commons-pool
> version 1.5.4. This causes an error when I try to access a method that is
> visible in 1.6 but not in 1.5.4. I tried to fix this by setting
> userClassPathFirst=true (and I verified that this was set correctly at
> http://<driver>:4040 and that my jars are listed in the user jars). The
> problem did not go away, which suggests that Spark is not honoring the
> setting. I added commons-pool 1.6 in front of CLASS_PATH in the
> bin/compute-classpath.sh file and this got rid of my problem, but this is a
> hacky fix and not a long-term solution.
>
> I have looked through the Spark source code and it appears to check the
> URLClassLoader for the user classpath first. I have tried to determine how
> the user jars are being added to the classloader's list of URLs, with
> little success. At this point I was trying to debug by building spark-core
> locally so I could edit the source code, and then injecting the modified
> .class files into the Spark assembly jar, also with little success.
> Does anyone know why this doesn't work, or have any solutions for this?
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/spark-files-userClassPathFirst-true-Not-Working-Correctly-tp11917.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>



-- 
Marcelo
