Repository: spark
Updated Branches:
  refs/heads/master 73b0cbcc2 -> 3292e2a71


SPARK-1721: Reset the thread classLoader in the Mesos Executor

This appears to happen because Mesos invokes the executor from a thread with a
different context classloader; as a result the Spark jar is not visible to that
loader and classes cannot be loaded.

This fixes 
http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-ClassNotFoundException-spark-on-mesos-td3510.html

I have no idea whether this is the right fix; I can only confirm that it fixes
the issue for us.

The `registered` method is called from Mesos's native JNI code
(https://github.com/apache/mesos/blob/765ff9bc2ac5a12d4362f8235b572a37d646390a/src/java/jni/org_apache_mesos_MesosExecutorDriver.cpp)
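
The failure mode can be sketched outside Mesos: when a thread's context classloader is an isolated loader that does not include the application (Spark) jar, `Class.forName` against it fails, while the loader that defined our own classes still resolves the class. `ContextLoaderDemo`, `AppClass`, and `canLoad` below are hypothetical names for illustration, not part of the patch:

```scala
import java.net.{URL, URLClassLoader}

object ContextLoaderDemo {
  class AppClass // stands in for a Spark class on the application classpath

  // Returns true if `loader` can resolve the named class.
  def canLoad(name: String, loader: ClassLoader): Boolean =
    try { Class.forName(name, true, loader); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    val name = classOf[AppClass].getName
    // An isolated loader whose parent is the bootstrap loader only, mimicking
    // a callback thread whose context classloader lacks the Spark jar.
    val isolated = new URLClassLoader(Array.empty[URL], null)
    println(canLoad(name, isolated))                // false
    println(canLoad(name, getClass.getClassLoader)) // true: the fallback the fix uses
  }
}
```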

I am unsure which commit introduced this regression.

Author: Bouke van der Bijl <boukevanderb...@gmail.com>

Closes #620 from bouk/mesos-classloader-fix and squashes the following commits:

c13eae0 [Bouke van der Bijl] Use getContextOrSparkClassLoader in SparkEnv and CompressionCodec


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/3292e2a7
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/3292e2a7
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/3292e2a7

Branch: refs/heads/master
Commit: 3292e2a71bfb5df5ba156cf7557747d164d12291
Parents: 73b0cbc
Author: Bouke van der Bijl <boukevanderb...@gmail.com>
Authored: Mon May 5 11:19:35 2014 -0700
Committer: Patrick Wendell <pwend...@gmail.com>
Committed: Mon May 5 11:19:36 2014 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/SparkEnv.scala            | 4 +---
 core/src/main/scala/org/apache/spark/io/CompressionCodec.scala | 3 ++-
 2 files changed, 3 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/3292e2a7/core/src/main/scala/org/apache/spark/SparkEnv.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkEnv.scala b/core/src/main/scala/org/apache/spark/SparkEnv.scala
index bea435e..d40ed27 100644
--- a/core/src/main/scala/org/apache/spark/SparkEnv.scala
+++ b/core/src/main/scala/org/apache/spark/SparkEnv.scala
@@ -156,13 +156,11 @@ object SparkEnv extends Logging {
       conf.set("spark.driver.port",  boundPort.toString)
     }
 
-    val classLoader = Thread.currentThread.getContextClassLoader
-
      // Create an instance of the class named by the given Java system property, or by
      // defaultClassName if the property is not set, and return it as a T
    def instantiateClass[T](propertyName: String, defaultClassName: String): T = {
       val name = conf.get(propertyName,  defaultClassName)
-      val cls = Class.forName(name, true, classLoader)
+      val cls = Class.forName(name, true, Utils.getContextOrSparkClassLoader)
      // First try with the constructor that takes SparkConf. If we can't find one,
       // use a no-arg constructor instead.
       try {

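For context, the surrounding `instantiateClass` helper resolves a class by name through a classloader and prefers a `SparkConf`-taking constructor, falling back to a no-arg one. A standalone sketch of that pattern, with hypothetical stand-ins (`Config` for `SparkConf`, `DemoSerializer` for a pluggable class):

```scala
case class Config(settings: Map[String, String])

// A pluggable class with both a Config constructor and a no-arg fallback.
class DemoSerializer(val conf: Config) {
  def this() = this(Config(Map.empty))
}

object InstantiateDemo {
  // Resolve the class through the context classloader when present, otherwise
  // through the loader that defined this class (the fallback used by the fix).
  def instantiateClass[T](className: String, conf: Config): T = {
    val loader = Option(Thread.currentThread.getContextClassLoader)
      .getOrElse(getClass.getClassLoader)
    val cls = Class.forName(className, true, loader)
    // First try the constructor that takes a Config; fall back to no-arg.
    try cls.getConstructor(classOf[Config]).newInstance(conf).asInstanceOf[T]
    catch {
      case _: NoSuchMethodException =>
        cls.getConstructor().newInstance().asInstanceOf[T]
    }
  }

  def main(args: Array[String]): Unit = {
    val s = instantiateClass[DemoSerializer]("DemoSerializer", Config(Map("k" -> "v")))
    println(s.conf.settings("k")) // v
  }
}
```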
http://git-wip-us.apache.org/repos/asf/spark/blob/3292e2a7/core/src/main/scala/org/apache/spark/io/CompressionCodec.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/io/CompressionCodec.scala b/core/src/main/scala/org/apache/spark/io/CompressionCodec.scala
index e1a5ee3..4b0fe1a 100644
--- a/core/src/main/scala/org/apache/spark/io/CompressionCodec.scala
+++ b/core/src/main/scala/org/apache/spark/io/CompressionCodec.scala
@@ -24,6 +24,7 @@ import org.xerial.snappy.{SnappyInputStream, SnappyOutputStream}
 
 import org.apache.spark.SparkConf
 import org.apache.spark.annotation.DeveloperApi
+import org.apache.spark.util.Utils
 
 /**
  * :: DeveloperApi ::
@@ -49,7 +50,7 @@ private[spark] object CompressionCodec {
   }
 
   def createCodec(conf: SparkConf, codecName: String): CompressionCodec = {
-    val ctor = Class.forName(codecName, true, Thread.currentThread.getContextClassLoader)
+    val ctor = Class.forName(codecName, true, Utils.getContextOrSparkClassLoader)
       .getConstructor(classOf[SparkConf])
     ctor.newInstance(conf).asInstanceOf[CompressionCodec]
   }
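
The commit title speaks of resetting the thread classloader; the merged change instead falls back at lookup time. An explicit reset would look roughly like the sketch below (a hypothetical `withContextClassLoader` helper, not part of the patch):

```scala
object ResetLoaderDemo {
  // Install `loader` as the current thread's context classloader for the
  // duration of `body`, restoring the previous loader afterwards.
  def withContextClassLoader[T](loader: ClassLoader)(body: => T): T = {
    val previous = Thread.currentThread.getContextClassLoader
    Thread.currentThread.setContextClassLoader(loader)
    try body
    finally Thread.currentThread.setContextClassLoader(previous)
  }

  def main(args: Array[String]): Unit = {
    val sparkLoader = getClass.getClassLoader
    val observed = withContextClassLoader(sparkLoader) {
      Thread.currentThread.getContextClassLoader eq sparkLoader
    }
    println(observed) // true: the body saw the installed loader
  }
}
```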
