Repository: spark
Updated Branches:
  refs/heads/branch-1.0 1c6c8b5bd -> 6cbe2a37c


[SPARK-1877] ClassNotFoundException when loading RDD with serialized objects

Updated version of #821

Author: Tathagata Das <tathagata.das1...@gmail.com>
Author: Ghidireac <bogd...@u448a5b0a73d45358d94a.ant.amazon.com>

Closes #835 from tdas/SPARK-1877 and squashes the following commits:

f346f71 [Tathagata Das] Addressed Patrick's comments.
fee0c5d [Ghidireac] SPARK-1877: ClassNotFoundException when loading RDD with serialized objects

(cherry picked from commit 52eb54d02403a3c37d84b9da7cc1cdb261048cf8)
Signed-off-by: Tathagata Das <tathagata.das1...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6cbe2a37
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/6cbe2a37
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/6cbe2a37

Branch: refs/heads/branch-1.0
Commit: 6cbe2a37ccb14f65b6d6b813a585adbbc43684c4
Parents: 1c6c8b5
Author: Tathagata Das <tathagata.das1...@gmail.com>
Authored: Mon May 19 22:36:24 2014 -0700
Committer: Tathagata Das <tathagata.das1...@gmail.com>
Committed: Mon May 19 22:36:37 2014 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/SparkContext.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/6cbe2a37/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 634c10c..49737fa 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -718,7 +718,7 @@ class SparkContext(config: SparkConf) extends Logging {
       minPartitions: Int = defaultMinPartitions
       ): RDD[T] = {
     sequenceFile(path, classOf[NullWritable], classOf[BytesWritable], minPartitions)
-      .flatMap(x => Utils.deserialize[Array[T]](x._2.getBytes))
+      .flatMap(x => Utils.deserialize[Array[T]](x._2.getBytes, Utils.getContextOrSparkClassLoader))
   }
 
   protected[spark] def checkpointFile[T: ClassTag](
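The one-line change passes an explicit ClassLoader to `Utils.deserialize` so that user-defined classes, which may be visible only to the thread's context classloader (e.g. classes shipped in application jars), can be resolved during deserialization. A minimal standalone sketch of this pattern, assuming a hypothetical `deserialize` helper that mirrors the idea of resolving classes against a supplied loader (not Spark's actual `Utils` implementation):

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream, ObjectStreamClass}

object DeserializeSketch {
  // Serialize any object to bytes via plain Java serialization.
  def serialize[T](obj: T): Array[Byte] = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(obj)
    oos.close()
    bos.toByteArray
  }

  // Deserialize with an explicit ClassLoader: resolveClass is overridden so
  // class lookup goes through `loader` rather than ObjectInputStream's own
  // defining loader, which is what causes ClassNotFoundException when the
  // target class is only on the context classloader.
  def deserialize[T](bytes: Array[Byte], loader: ClassLoader): T = {
    val ois = new ObjectInputStream(new ByteArrayInputStream(bytes)) {
      override def resolveClass(desc: ObjectStreamClass): Class[_] =
        Class.forName(desc.getName, false, loader)
    }
    val result = ois.readObject().asInstanceOf[T]
    ois.close()
    result
  }

  def main(args: Array[String]): Unit = {
    // Prefer the thread context classloader, falling back to this class's
    // loader -- analogous in spirit to getContextOrSparkClassLoader.
    val loader = Option(Thread.currentThread.getContextClassLoader)
      .getOrElse(getClass.getClassLoader)
    val roundTrip = deserialize[Array[Int]](serialize(Array(1, 2, 3)), loader)
    println(roundTrip.mkString(","))
  }
}
```

With the default resolveClass behavior, deserializing a class that lives only in an application jar (loaded by the context classloader) would fail, which is the failure mode SPARK-1877 reports for `sc.objectFile`.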
