[ https://issues.apache.org/jira/browse/SPARK-4660?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14260544#comment-14260544 ]

Matei Zaharia commented on SPARK-4660:
--------------------------------------

[~pkolaczk] mind sending a pull request against http://github.com/apache/spark 
for this? It will allow us to run it through the automated tests. It looks like 
a good fix but this stuff can be tricky.

> JavaSerializer uses wrong classloader
> -------------------------------------
>
>                 Key: SPARK-4660
>                 URL: https://issues.apache.org/jira/browse/SPARK-4660
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0, 1.1.1
>            Reporter: Piotr Kołaczkowski
>            Priority: Critical
>         Attachments: spark-serializer-classloader.patch
>
>
> During testing we found failures when trying to load some classes of the user 
> application:
> {noformat}
> ERROR 2014-11-29 20:01:56 org.apache.spark.storage.BlockManagerWorker: Exception handling buffer message
> java.lang.ClassNotFoundException: org.apache.spark.demo.HttpReceiverCases$HttpRequest
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:270)
>       at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>       at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>       at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>       at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:133)
>       at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
>       at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:235)
>       at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:126)
>       at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:104)
>       at org.apache.spark.storage.MemoryStore.putBytes(MemoryStore.scala:76)
>       at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:748)
>       at org.apache.spark.storage.BlockManager.putBytes(BlockManager.scala:639)
>       at org.apache.spark.storage.BlockManagerWorker.putBlock(BlockManagerWorker.scala:92)
>       at org.apache.spark.storage.BlockManagerWorker.processBlockMessage(BlockManagerWorker.scala:73)
>       at org.apache.spark.storage.BlockManagerWorker$$anonfun$2.apply(BlockManagerWorker.scala:48)
>       at org.apache.spark.storage.BlockManagerWorker$$anonfun$2.apply(BlockManagerWorker.scala:48)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>       at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>       at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>       at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>       at org.apache.spark.storage.BlockMessageArray.foreach(BlockMessageArray.scala:28)
>       at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>       at org.apache.spark.storage.BlockMessageArray.map(BlockMessageArray.scala:28)
>       at org.apache.spark.storage.BlockManagerWorker.onBlockMessageReceive(BlockManagerWorker.scala:48)
>       at org.apache.spark.storage.BlockManagerWorker$$anonfun$1.apply(BlockManagerWorker.scala:38)
>       at org.apache.spark.storage.BlockManagerWorker$$anonfun$1.apply(BlockManagerWorker.scala:38)
>       at org.apache.spark.network.ConnectionManager.org$apache$spark$network$ConnectionManager$$handleMessage(ConnectionManager.scala:682)
>       at org.apache.spark.network.ConnectionManager$$anon$10.run(ConnectionManager.scala:520)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:744)
> {noformat}
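
The frame at JavaSerializer.scala:59 shows resolveClass delegating to Class.forName, which by default consults the loader that defined Spark's own classes rather than the loader that holds the user application's jars, hence the ClassNotFoundException for the user class. A minimal sketch of the general remedy (not the attached patch or Spark's actual code, just the standard JDK technique): an ObjectInputStream subclass that resolves class descriptors through an explicitly supplied ClassLoader.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;

// Sketch only: deserialization stream that resolves classes through a
// caller-supplied ClassLoader (e.g. the one that loaded the user's jars)
// instead of the default lookup, which uses the serializer's own loader.
class LoaderAwareObjectInputStream extends ObjectInputStream {
    private final ClassLoader loader;

    LoaderAwareObjectInputStream(InputStream in, ClassLoader loader) throws IOException {
        super(in);
        this.loader = loader;
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws IOException, ClassNotFoundException {
        // Explicit-loader variant of Class.forName; 'false' skips static init.
        return Class.forName(desc.getName(), false, loader);
    }
}
```

On an executor, the loader passed in would typically be the thread context classloader or whichever loader Spark set up for the application's jars, rather than the loader that defined the serializer class itself.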



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
