Thanks Sandy. It turns out I had three versions of the ASM lib on my classpath: one in its own standalone jar, one in mockito, and one in Spark. I moved the standalone jar out of the way and downloaded the mockito-core version of the jar (instead of mockito-all), and things seem to be working now. For future reference: the mockito-all and asm jars ship with the Hadoop/Hive libraries, so anyone who drops all the jars into a single folder and runs in a mixed Hadoop/Spark environment can run into this issue.
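
In case it helps anyone debugging a similar conflict, a quick way to see which jars are supplying ASM is to ask the classloader directly. This is just a rough sketch (the object and method names below are made up for illustration, not anything from Spark); run it with the same classpath as the failing app and any class listed more than once is coming from multiple jars:

object ClasspathCheck {
  // Print every classpath location that provides the given class.
  // More than one line of output means duplicate (possibly conflicting) copies.
  def locate(className: String): Unit = {
    val resource = className.replace('.', '/') + ".class"
    val urls = getClass.getClassLoader.getResources(resource)
    while (urls.hasMoreElements) {
      println(s"$className -> ${urls.nextElement()}")
    }
  }

  def main(args: Array[String]): Unit = {
    // The class from the stack trace that changed between ASM versions.
    locate("org.objectweb.asm.ClassVisitor")
  }
}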
On Thu, Feb 27, 2014 at 5:31 PM, Sandy Ryza <[email protected]> wrote:

> Hi Usman,
>
> This is a known issue that stems from Spark dependencies using two
> different versions of ASM -
> https://spark-project.atlassian.net/browse/SPARK-782. How are you
> setting up the classpath for your app?
>
> -Sandy
>
>
> On Thu, Feb 27, 2014 at 11:59 AM, Usman Ghani <[email protected]> wrote:
>
>> Exception in thread "main" java.lang.IncompatibleClassChangeError: class
>> org.apache.spark.util.InnerClosureFinder has interface
>> org.objectweb.asm.ClassVisitor as super class
>> at java.lang.ClassLoader.defineClass1(Native Method)
>> at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> at org.apache.spark.util.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:87)
>> at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:107)
>> at org.apache.spark.SparkContext.clean(SparkContext.scala:982)
>> at org.apache.spark.rdd.RDD.map(RDD.scala:249)
>> at org.apache.spark.SparkContext.textFile(SparkContext.scala:344)
>> at com.platfora.SparkApp$.main(sparkApp.scala:66)
>> at com.platfora.SparkApp.main(sparkApp.scala)
