Repository: spark
Updated Branches:
  refs/heads/master c2d160fbe -> ca11919e6


[SPARK-1403] Move the class loader creation back to where it was in 0.9.0

[SPARK-1403] I investigated why Spark 0.9.0 loads fine on Mesos while Spark 
1.0.0 fails. What I found was that in SparkEnv.scala, while creating the 
SparkEnv object, the current thread's classloader is null. But in 0.9.0, at the 
same place, it is set to org.apache.spark.repl.ExecutorClassLoader. I saw that 
https://github.com/apache/spark/commit/7edbea41b43e0dc11a2de156be220db8b7952d01 
moved it to its current place. I moved it back and saw that 1.0.0 started 
working fine on Mesos.

I just created a minimal patch that allows me to run Spark on Mesos correctly. 
It seems that SecurityManager's creation needs to be taken into account for a 
correct fix, and moving the creation of the serializer out of SparkEnv might 
also be part of the right solution. PTAL.
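
The patch below wraps the executor registration in a save/set/restore of the 
thread's context classloader. As a minimal sketch (not the actual Spark code; 
the helper name `withContextClassLoader` and the demo object are hypothetical), 
the pattern looks like this:

```scala
object ContextClassLoaderDemo {
  // Run `body` with the given context classloader installed, restoring the
  // caller's classloader afterwards -- even if `body` throws.
  def withContextClassLoader[T](loader: ClassLoader)(body: => T): T = {
    val saved = Thread.currentThread.getContextClassLoader
    try {
      // Code that reads the context classloader (e.g. during SparkEnv
      // creation) now sees a non-null loader while `body` runs.
      Thread.currentThread.setContextClassLoader(loader)
      body
    } finally {
      // Always restore the caller's classloader.
      Thread.currentThread.setContextClassLoader(saved)
    }
  }

  def main(args: Array[String]): Unit = {
    val before = Thread.currentThread.getContextClassLoader
    val seen = withContextClassLoader(getClass.getClassLoader) {
      Thread.currentThread.getContextClassLoader
    }
    assert(seen eq getClass.getClassLoader)
    assert(Thread.currentThread.getContextClassLoader eq before)
    println("context classloader restored")
  }
}
```

The `finally` block is what makes the workaround safe: registration can fail 
partway through without leaving a foreign classloader installed on the Mesos 
driver thread.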

Author: Bharath Bhushan <manku.ti...@outlook.com>

Closes #322 from manku-timma/spark-1403 and squashes the following commits:

606c2b9 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
ec8f870 [Bharath Bhushan] revert the logger change for java 6 compatibility as PR 334 is doing it
728beca [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
044027d [Bharath Bhushan] fix compile error
6f260a4 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
b3a053f [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
04b9662 [Bharath Bhushan] add missing line
4803c19 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
f3c9a14 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
42d3d6a [Bharath Bhushan] used code fragment from @ueshin to fix the problem in a better way
89109d7 [Bharath Bhushan] move the class loader creation back to where it was in 0.9.0


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/ca11919e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/ca11919e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/ca11919e

Branch: refs/heads/master
Commit: ca11919e6e97a62eb3e3ce882ffa29eae36f50f7
Parents: c2d160f
Author: Bharath Bhushan <manku.ti...@outlook.com>
Authored: Sat Apr 12 20:52:29 2014 -0700
Committer: Patrick Wendell <pwend...@gmail.com>
Committed: Sat Apr 12 20:53:44 2014 -0700

----------------------------------------------------------------------
 .../spark/executor/MesosExecutorBackend.scala   | 22 +++++++++++++-------
 1 file changed, 15 insertions(+), 7 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/ca11919e/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala b/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
index 6fc702f..df36a06 100644
--- a/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
+++ b/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
@@ -50,13 +50,21 @@ private[spark] class MesosExecutorBackend
       executorInfo: ExecutorInfo,
       frameworkInfo: FrameworkInfo,
       slaveInfo: SlaveInfo) {
-    logInfo("Registered with Mesos as executor ID " + executorInfo.getExecutorId.getValue)
-    this.driver = driver
-    val properties = Utils.deserialize[Array[(String, String)]](executorInfo.getData.toByteArray)
-    executor = new Executor(
-      executorInfo.getExecutorId.getValue,
-      slaveInfo.getHostname,
-      properties)
+    val cl = Thread.currentThread.getContextClassLoader
+    try {
+      // Work around for SPARK-1480
+      Thread.currentThread.setContextClassLoader(getClass.getClassLoader)
+      logInfo("Registered with Mesos as executor ID " + executorInfo.getExecutorId.getValue)
+      this.driver = driver
+      val properties = Utils.deserialize[Array[(String, String)]](executorInfo.getData.toByteArray)
+      executor = new Executor(
+        executorInfo.getExecutorId.getValue,
+        slaveInfo.getHostname,
+        properties)
+    } finally {
+      // Work around for SPARK-1480
+      Thread.currentThread.setContextClassLoader(cl)
+    }
   }
 
   override def launchTask(d: ExecutorDriver, taskInfo: TaskInfo) {
