This is how I used to do it:

    import java.util.List;

    import com.google.common.collect.Lists;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    // Create a list of jars to ship to the workers
    List<String> jars = Lists.newArrayList(
        "/home/akhld/mobi/localcluster/x/spark-0.9.1-bin-hadoop2/assembly/target/scala-2.10/spark-assembly-0.9.1-hadoop2.2.0.jar",
        "ADD-All-The-Jars-Here");

    // Create a SparkConf
    SparkConf spconf = new SparkConf();
    spconf.setMaster("local");
    spconf.setAppName("YourApp");
    spconf.setSparkHome("/home/akhld/mobi/localcluster/x/spark-0.9.1-bin-hadoop2");
    spconf.setJars(jars.toArray(new String[jars.size()]));
    spconf.set("spark.executor.memory", "1g");

    // Now create the context
    JavaStreamingContext jsc = new JavaStreamingContext(spconf, new Duration(10000));
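If you would rather not hard-code the jar paths, Spark can also look up the
jar a given class was loaded from, which is what the quoted message below
does. This is only a sketch of that variant (MyApp stands in for your own
main class; imports are as above):

    // Ask Spark for the path of the jar containing MyApp and ship that
    // jar to the workers. MyApp is a placeholder for your driver class.
    SparkConf spconf2 = new SparkConf()
        .setMaster("local")
        .setAppName("YourApp");
    spconf2.setJars(JavaStreamingContext.jarOfClass(MyApp.class));
    JavaStreamingContext jsc2 = new JavaStreamingContext(spconf2, new Duration(10000));

Note that this only helps if the path it returns points at the uber jar
that actually contains your classes.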

Thanks
Best Regards


On Mon, Aug 11, 2014 at 9:36 PM, lbustelo <g...@bustelos.com> wrote:

> Not sure if this problem reached the Spark guys because it shows in Nabble
> that "This post has NOT been accepted by the mailing list yet".
>
>
> http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFound-for-user-class-in-uber-jar-td10613.html#a11902
>
> I'm resubmitting.
>
> Greetings,
>
> I'm currently building a "fat" or "uber" jar with dependencies using Maven.
> A docker-ized Spark cluster (1 master, 3 workers, version 1.0.0, Scala
> 2.10.4) points to it locally on the same VM. Sometimes a particular class
> is found and things are fine; other times it is not. Listing the jar's
> contents confirms that the class is actually in there (roughly the check
> sketched below). I call `setJars` with
> JavaStreamingContext.jarOfClass(the main class).
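>
>     // A sketch of the sanity check (jar path and class name are
>     // placeholders for my actual uber jar and the missing class):
>     import java.util.jar.JarFile;
>
>     public class JarCheck {
>         public static void main(String[] args) throws Exception {
>             try (JarFile jar = new JarFile("my-jar-with-dependencies.jar")) {
>                 // com.cjm5325.MyProject.MyClass is stored under this entry name
>                 System.out.println("class in jar: "
>                     + (jar.getEntry("com/cjm5325/MyProject/MyClass.class") != null));
>             }
>         }
>     }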
>
> I cannot say I know much about how Spark's classpath mechanisms work, so I
> would appreciate any and all suggestions for finding out what exactly is
> happening.
>
> The exception is as follows:
>
> 14/07/24 18:48:52 INFO Executor: Sending result for 139 directly to driver
> 14/07/24 18:48:52 INFO Executor: Finished task ID 139
> 14/07/24 18:48:56 WARN BlockManager: Putting block input-0-1406227713800 failed
> 14/07/24 18:48:56 ERROR BlockManagerWorker: Exception handling buffer message
> java.lang.ClassNotFoundException: com.cjm5325.MyProject.MyClass
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:270)
>     at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:60)
>     at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
>     at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
>     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>     at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
>     at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:125)
>     at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>     at org.apache.spark.util.NextIterator.foreach(NextIterator.scala:21)
>     at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
>     at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
>     at org.apache.spark.storage.MemoryStore.putBytes(MemoryStore.scala:59)
>     at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:666)
>     at org.apache.spark.storage.BlockManager.putBytes(BlockManager.scala:587)
>     at org.apache.spark.storage.BlockManagerWorker.putBlock(BlockManagerWorker.scala:82)
>     at org.apache.spark.storage.BlockManagerWorker.processBlockMessage(BlockManagerWorker.scala:63)
>     at org.apache.spark.storage.BlockManagerWorker$$anonfun$2.apply(BlockManagerWorker.scala:44)
>     at org.apache.spark.storage.BlockManagerWorker$$anonfun$2.apply(BlockManagerWorker.scala:44)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>     at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>     at org.apache.spark.storage.BlockMessageArray.foreach(BlockMessageArray.scala:28)
>     at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>     at org.apache.spark.storage.BlockMessageArray.map(BlockMessageArray.scala:28)
>     at org.apache.spark.storage.BlockManagerWorker.onBlockMessageReceive(BlockManagerWorker.scala:44)
>     at org.apache.spark.storage.BlockManagerWorker$$anonfun$1.apply(BlockManagerWorker.scala:34)
>     at org.apache.spark.storage.BlockManagerWorker$$anonfun$1.apply(BlockManagerWorker.scala:34)
>     at org.apache.spark.network.ConnectionManager.org$apache$spark$network$ConnectionManager$$handleMessage(ConnectionManager.scala:661)
>     at org.apache.spark.network.ConnectionManager$$anon$9.run(ConnectionManager.scala:503)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:744)
>
>
> Some classpath information:
>
> Classpath Entries (Resource / Source):
>
> /opt/spark-1.0.0/conf  (System Classpath)
> /opt/spark-1.0.0/lib/spark-assembly-1.0.0-hadoop1.0.4.jar  (System Classpath)
> /opt/spark-1.0.0/work/driver-20140724160201-0002/my-jar-with-dependencies.jar  (System Classpath)
> http://spark.filesystem.uri/jars/my-jar-with-dependencies.jar  (Added by user)
>
>
> If more information is needed, please let me know and I will be glad to
> provide it. I have searched for similar issues with mild success, but I
> cannot work out why the class is sometimes loaded and sometimes not.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFound-exception-on-class-in-uber-jar-tp11903.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
