[ https://issues.apache.org/jira/browse/SPARK-4877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14306104#comment-14306104 ]
Stephen Haberman commented on SPARK-4877:
-----------------------------------------

Hi Matt, I know about the caching/LinkageError issue, because I saw it while running a job; my patch has a test that reproduces it, and the patch fixes it. I'm still willing to believe loadClass might be preferable, but AFAICT it's not required, and the current findClass approach is working fine (with this patch) in our production jobs. (Note that I'm very open to refactoring this approach further in the future, especially to introduce the Jetty/Hadoop concept of system classes, but our jobs have not run into any issues.) Sketches of the failing setup, the child-first fallback, and the system-class whitelist follow below.

> userClassPathFirst doesn't handle user classes inheriting from parent
> ---------------------------------------------------------------------
>
>                 Key: SPARK-4877
>                 URL: https://issues.apache.org/jira/browse/SPARK-4877
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Stephen Haberman
>
> We're trying out userClassPathFirst.
> To do so, we make an uberjar that does not contain Spark or Scala classes
> (because we want those to load from the parent classloader; otherwise we'd
> get errors like scala.Function0 != scala.Function0, since they'd load from
> different classloaders).
> (Tangentially, some isolating classloaders, like Jetty's, whitelist certain
> packages, like spark/* and scala/*, to come only from the parent classloader,
> so that even if the user messes up and leaks the Scala/Spark jars into their
> uberjar, it won't blow up; this would be a good enhancement, I think.)
> Anyway, we have a custom Kryo registrar, which ships in our uberjar, but
> since it "extends spark.KryoRegistrator", which is not in our uberjar, we get
> a ClassNotFoundException.
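To make the ClassNotFoundException above concrete, here is a minimal sketch of the kind of registrar the description is about. KryoRegistrator (in org.apache.spark.serializer) and the spark.kryo.registrator setting are Spark's real API; Person and MyKryoRegistrator are hypothetical names:

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.serializer.KryoRegistrator

    // Hypothetical user class, shipped in the uberjar.
    case class Person(name: String, age: Int)

    // Also ships in the uberjar, but its superclass lives only on the
    // parent (Spark) classpath. A child-first loader that never consults
    // the parent cannot define this class, and throws
    // ClassNotFoundException: org.apache.spark.serializer.KryoRegistrator.
    class MyKryoRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        kryo.register(classOf[Person])
      }
    }

It would be wired up by setting spark.serializer to org.apache.spark.serializer.KryoSerializer and spark.kryo.registrator to the registrar's fully-qualified name.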
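For reference, a minimal sketch of the findClass-based, child-first approach discussed in the comment (an illustration under my reading of it, not the actual patch): classes resolve from the user's jars first, and only on a miss do we fall back to the parent, which is what lets user classes inherit from Spark/Scala classes that aren't in the uberjar.

    import java.net.{URL, URLClassLoader}

    // Child-first: try the uberjar, fall back to the parent on a miss.
    // Passing null as the super constructor's parent means loadClass won't
    // auto-delegate upward (beyond the bootstrap loader), but it still
    // checks findLoadedClass first, so the same class is never defined
    // twice (the caching/LinkageError case the patch's test covers).
    class ChildFirstClassLoader(urls: Array[URL], parentLoader: ClassLoader)
        extends URLClassLoader(urls, null) {

      override def findClass(name: String): Class[_] =
        try {
          super.findClass(name)  // user's jars first
        } catch {
          case _: ClassNotFoundException =>
            // Not in the uberjar: ask the parent, so that e.g.
            // "extends KryoRegistrator" resolves against Spark's copy.
            parentLoader.loadClass(name)
        }
    }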
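And the Jetty/Hadoop-style system-classes idea, sketched as a hypothetical layer on top of the loader above: whitelisted package prefixes always resolve from the parent, so a scala/* or spark/* class leaked into the uberjar can't produce two copies of the same class from different classloaders.

    // Hypothetical "system classes" whitelist; the prefixes are examples.
    class WhitelistingClassLoader(urls: Array[URL], parentLoader: ClassLoader)
        extends ChildFirstClassLoader(urls, parentLoader) {

      // Packages that must only ever come from the parent classloader.
      private val systemClassPrefixes =
        Seq("java.", "scala.", "org.apache.spark.")

      override def findClass(name: String): Class[_] =
        if (systemClassPrefixes.exists(p => name.startsWith(p)))
          parentLoader.loadClass(name)  // skip the uberjar entirely
        else
          super.findClass(name)
    }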