Repository: spark
Updated Branches:
  refs/heads/master 06dc4b520 -> 0a597276d


[Minor] Fix the value represented by spark.executor.id for consistency.

The property `spark.executor.id` can be set to either `driver` or `<driver>`
for the same driver, which is inconsistent. This change makes local mode use
`SparkContext.DRIVER_IDENTIFIER` instead of the hard-coded literal `"driver"`.

This issue is minor, so I didn't file it in JIRA.

Author: Kousuke Saruta <[email protected]>

Closes #3812 from sarutak/fix-driver-identifier and squashes the following 
commits:

d885498 [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark 
into fix-driver-identifier
4275663 [Kousuke Saruta] Fixed the value represented by spark.executor.id of 
local mode


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0a597276
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0a597276
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0a597276

Branch: refs/heads/master
Commit: 0a597276dbfd921665a9f720d96e119058186aa4
Parents: 06dc4b5
Author: Kousuke Saruta <[email protected]>
Authored: Thu Jan 8 11:35:56 2015 -0800
Committer: Andrew Or <[email protected]>
Committed: Thu Jan 8 11:35:56 2015 -0800

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/SparkContext.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/0a597276/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 4c25d5d..3bf3acd 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -229,7 +229,7 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
   // An asynchronous listener bus for Spark events
   private[spark] val listenerBus = new LiveListenerBus
 
-  conf.set("spark.executor.id", "driver")
+  conf.set("spark.executor.id", SparkContext.DRIVER_IDENTIFIER)
 
   // Create the Spark execution environment (cache, map output tracker, etc)
   private[spark] val env = SparkEnv.createDriverEnv(conf, isLocal, listenerBus)
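The change above replaces a hard-coded literal with a shared constant, so that
every place that reads or writes the driver's executor ID agrees on one value.
A minimal standalone sketch of the idea (with a stand-in `DRIVER_IDENTIFIER`
constant for illustration; in Spark 1.x the real constant on the `SparkContext`
companion object is `"<driver>"`):

```scala
// Sketch: one shared constant for the driver's executor ID, so producers
// and consumers of "spark.executor.id" can never drift apart.
object DriverId {
  // Stand-in for SparkContext.DRIVER_IDENTIFIER (hypothetical value shown).
  val DRIVER_IDENTIFIER: String = "<driver>"
}

object Demo {
  // A consumer that checks whether an executor ID belongs to the driver.
  def isDriver(executorId: String): Boolean =
    executorId == DriverId.DRIVER_IDENTIFIER

  def main(args: Array[String]): Unit = {
    // A producer sets the property from the same constant, not a literal.
    val conf = scala.collection.mutable.Map[String, String]()
    conf("spark.executor.id") = DriverId.DRIVER_IDENTIFIER

    println(isDriver(conf("spark.executor.id")))  // consistent: true
    println(isDriver("driver"))                   // a stray literal: false
  }
}
```

Before this patch, local mode wrote the literal `"driver"` while other code
paths used `"<driver>"`, so a check like `isDriver` above would disagree
depending on which code path set the property.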


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
