Repository: spark
Updated Branches:
  refs/heads/branch-1.0 a56621622 -> 23cc40e39


[SPARK-1897] Respect spark.jars (and --jars) in spark-shell

Spark shell currently overwrites `spark.jars` with `ADD_JARS`. In all modes
except yarn-cluster, this means the `--jars` flag passed to `bin/spark-shell`
is also discarded. However, in the
[docs](http://people.apache.org/~pwendell/spark-1.0.0-rc7-docs/scala-programming-guide.html#initializing-spark),
we explicitly tell users to add their jars this way.
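The precedence this patch introduces can be sketched as a standalone snippet (a hypothetical reproduction of the logic, not the actual SparkILoop code): a non-empty `spark.jars` property wins, and `ADD_JARS` is only a fallback rather than an override.

```scala
// Minimal sketch of the jar-resolution precedence, assuming the two inputs
// are already read from sys.props("spark.jars") and sys.env("ADD_JARS").
object JarResolution {
  def resolveJars(propJars: Option[String], envJars: Option[String]): Array[String] = {
    // Treat an empty spark.jars (the value SparkSubmit used to write) as unset.
    val nonEmptyProp = propJars.filter(_.nonEmpty)
    // Prefer spark.jars/--jars; fall back to ADD_JARS; default to no jars.
    nonEmptyProp.orElse(envJars).map(_.split(",")).getOrElse(Array.empty)
  }

  def main(args: Array[String]): Unit = {
    // --jars (via spark.jars) takes precedence over ADD_JARS
    assert(resolveJars(Some("a.jar,b.jar"), Some("c.jar"))
      .sameElements(Array("a.jar", "b.jar")))
    // Empty spark.jars falls through to ADD_JARS instead of masking it
    assert(resolveJars(Some(""), Some("c.jar")).sameElements(Array("c.jar")))
    // Neither set: empty array, matching the old getOrElse default
    assert(resolveJars(None, None).isEmpty)
    println("ok")
  }
}
```

Reading `spark.jars` first is what lets `--jars` survive: SparkSubmit translates `--jars` into that property, so checking the environment variable first (as the old code did) silently dropped it.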

Author: Andrew Or <andrewo...@gmail.com>

Closes #849 from andrewor14/shell-jars and squashes the following commits:

928a7e6 [Andrew Or] ',' -> "," (minor)
afc357c [Andrew Or] Handle spark.jars == "" in SparkILoop, not SparkSubmit
c6da113 [Andrew Or] Do not set spark.jars to ""
d8549f7 [Andrew Or] Respect spark.jars and --jars in spark-shell

(cherry picked from commit 8edbee7d1b4afc192d97ba192a5526affc464205)
Signed-off-by: Tathagata Das <tathagata.das1...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/23cc40e3
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/23cc40e3
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/23cc40e3

Branch: refs/heads/branch-1.0
Commit: 23cc40e39acc598816ac46b36463e13587a0dd60
Parents: a566216
Author: Andrew Or <andrewo...@gmail.com>
Authored: Thu May 22 20:25:41 2014 -0700
Committer: Tathagata Das <tathagata.das1...@gmail.com>
Committed: Thu May 22 20:25:53 2014 -0700

----------------------------------------------------------------------
 repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/23cc40e3/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
----------------------------------------------------------------------
diff --git a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
index 296da74..55684e9 100644
--- a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
+++ b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
@@ -993,7 +993,13 @@ object SparkILoop {
   implicit def loopToInterpreter(repl: SparkILoop): SparkIMain = repl.intp
   private def echo(msg: String) = Console println msg
 
-  def getAddedJars: Array[String] = Option(System.getenv("ADD_JARS")).map(_.split(',')).getOrElse(new Array[String](0))
+  def getAddedJars: Array[String] = {
+    val envJars = sys.env.get("ADD_JARS")
+    val propJars = sys.props.get("spark.jars").flatMap { p =>
+      if (p == "") None else Some(p)
+    }
+    propJars.orElse(envJars).map(_.split(",")).getOrElse(Array.empty)
+  }
 
   // Designed primarily for use by test code: take a String with a
   // bunch of code, and prints out a transcript of what it would look
