Repository: spark
Updated Branches:
  refs/heads/master 957a8ab37 -> 7c27d075c


[SPARK-16812] Open up SparkILoop.getAddedJars

## What changes were proposed in this pull request?
This patch makes SparkILoop.getAddedJars a public developer API. It is a useful
function for retrieving the list of jars that have been added.

## How was this patch tested?
N/A - this is a simple visibility change.

Author: Reynold Xin <r...@databricks.com>

Closes #14417 from rxin/SPARK-16812.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7c27d075
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7c27d075
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7c27d075

Branch: refs/heads/master
Commit: 7c27d075c39ebaf3e762284e2536fe7be0e3da87
Parents: 957a8ab
Author: Reynold Xin <r...@databricks.com>
Authored: Sat Jul 30 23:05:03 2016 -0700
Committer: Reynold Xin <r...@databricks.com>
Committed: Sat Jul 30 23:05:03 2016 -0700

----------------------------------------------------------------------
 .../src/main/scala/org/apache/spark/repl/SparkILoop.scala         | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/7c27d075/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala
----------------------------------------------------------------------
diff --git a/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala b/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala
index 16f330a..e017aa4 100644
--- a/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala
+++ b/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala
@@ -1059,7 +1059,8 @@ class SparkILoop(
   @deprecated("Use `process` instead", "2.9.0")
   private def main(settings: Settings): Unit = process(settings)
 
-  private[repl] def getAddedJars(): Array[String] = {
+  @DeveloperApi
+  def getAddedJars(): Array[String] = {
     val conf = new SparkConf().setMaster(getMaster())
     val envJars = sys.env.get("ADD_JARS")
     if (envJars.isDefined) {


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org