Davies Liu created SPARK-9943:
---------------------------------

             Summary: Failed to serialize a deserialized UnsafeHashedRelation
                 Key: SPARK-9943
                 URL: https://issues.apache.org/jira/browse/SPARK-9943
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.5.0
            Reporter: Davies Liu
            Assignee: Davies Liu
            Priority: Critical


When free memory on an executor runs low, cached broadcast objects need to be
serialized to disk. A deserialized UnsafeHashedRelation cannot be serialized
again; it fails with an NPE:

{code}
15/08/13 11:13:35 WARN TaskSetManager: Lost task 1.0 in stage 26.0 (TID 41, 192.168.1.236): java.io.IOException: java.lang.NullPointerException
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1163)
        at org.apache.spark.sql.execution.joins.UnsafeHashedRelation.writeExternal(HashedRelation.scala:249)
        at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1458)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1429)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
        at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:44)
        at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:153)
        at org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1190)
        at org.apache.spark.storage.DiskStore$$anonfun$putIterator$1.apply$mcV$sp(DiskStore.scala:81)
        at org.apache.spark.storage.DiskStore$$anonfun$putIterator$1.apply(DiskStore.scala:81)
        at org.apache.spark.storage.DiskStore$$anonfun$putIterator$1.apply(DiskStore.scala:81)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1206)
        at org.apache.spark.storage.DiskStore.putIterator(DiskStore.scala:82)
        at org.apache.spark.storage.DiskStore.putArray(DiskStore.scala:66)
        at org.apache.spark.storage.BlockManager.dropFromMemory(BlockManager.scala:1041)
        at org.apache.spark.storage.BlockManager.dropFromMemory(BlockManager.scala:1002)
        at org.apache.spark.storage.MemoryStore$$anonfun$ensureFreeSpace$4.apply(MemoryStore.scala:468)
        at org.apache.spark.storage.MemoryStore$$anonfun$ensureFreeSpace$4.apply(MemoryStore.scala:457)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.storage.MemoryStore.ensureFreeSpace(MemoryStore.scala:457)
        at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:292)
        at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:165)
        at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:143)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:791)
        at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:638)
        at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:996)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:182)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1175)
        at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:165)
        at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
        at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
        at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:88)
        at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
        at org.apache.spark.sql.execution.joins.BroadcastHashJoin$$anonfun$2.apply(BroadcastHashJoin.scala:113)
        at org.apache.spark.sql.execution.joins.BroadcastHashJoin$$anonfun$2.apply(BroadcastHashJoin.scala:112)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:706)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:706)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
        at org.apache.spark.rdd.MapPartitionsWithPreparationRDD.compute(MapPartitionsWithPreparationRDD.scala:46)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
        at org.apache.spark.sql.execution.joins.UnsafeHashedRelation$$anonfun$writeExternal$1.apply$mcV$sp(HashedRelation.scala:250)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1160)
        ... 57 more

{code}
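
The trace is consistent with a one-way Externalizable implementation: the relation is built around one backing store, but readExternal reconstructs the rows into a different one, so a second writeExternal dereferences a field that is only set on the originally built instance. Below is a minimal, self-contained Scala sketch of that failure pattern; the names (TwoStateRelation, hashTable, binaryMap) are illustrative stand-ins, not Spark's actual internals.

{code}
import java.io._

// Hypothetical stand-in for UnsafeHashedRelation: an Externalizable class
// with two alternative backing stores. writeExternal only knows how to walk
// the store set at construction time, while readExternal rebuilds the data
// into the other one -- so serialize -> deserialize -> serialize hits null.
class TwoStateRelation extends Externalizable {
  // Set when the relation is built directly (e.g. on the driver).
  var hashTable: java.util.HashMap[String, String] = _
  // Set only by readExternal (e.g. after the broadcast is received).
  var binaryMap: java.util.HashMap[String, String] = _

  def this(data: java.util.HashMap[String, String]) = {
    this()
    hashTable = data
  }

  override def writeExternal(out: ObjectOutput): Unit = {
    // BUG: assumes hashTable is always populated. After one round trip only
    // binaryMap is set, so this line throws NullPointerException -- the same
    // shape as the NPE at HashedRelation.scala:250 in the trace above.
    out.writeInt(hashTable.size())
    val it = hashTable.entrySet().iterator()
    while (it.hasNext) {
      val e = it.next()
      out.writeUTF(e.getKey)
      out.writeUTF(e.getValue)
    }
  }

  override def readExternal(in: ObjectInput): Unit = {
    // Rebuilds into the alternate store; hashTable stays null.
    binaryMap = new java.util.HashMap[String, String]()
    var n = in.readInt()
    while (n > 0) {
      binaryMap.put(in.readUTF(), in.readUTF())
      n -= 1
    }
  }
}

object Repro {
  def roundTrip(r: TwoStateRelation): TwoStateRelation = {
    val buf = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(buf)
    oos.writeObject(r)
    oos.close()
    val ois = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    ois.readObject().asInstanceOf[TwoStateRelation]
  }

  def main(args: Array[String]): Unit = {
    val data = new java.util.HashMap[String, String]()
    data.put("key", "value")
    val once = roundTrip(new TwoStateRelation(data)) // fine: hashTable is set
    roundTrip(once) // NPE: hashTable is null on the deserialized copy
  }
}
{code}

The first round trip succeeds and the second throws the NPE, matching the trace: the block manager hits writeExternal again when it drops the already-deserialized broadcast from memory to disk. One hedged fix along these lines would be to have writeExternal check which store is populated and serialize from it, rather than assuming the construction-time store.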



