spark git commit: [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url

2018-01-10 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 9b33dfc40 -> a6647ffbf


[SPARK-22587] Spark job fails if fs.defaultFS and application jar are different 
url

## What changes were proposed in this pull request?

The comparison of two filesystems does not consider the authority of the URI.
This matters for the WASB file storage system, where userInfo is honored to
differentiate filesystems. For example, wasbs://user1xyz.net and
wasbs://user2xyz.net should be considered two different filesystems.
Therefore, the authority has to be included in the comparison, and two
filesystems with different authorities cannot be the same FS.
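
For illustration, here is a minimal standalone sketch of the authority-aware
check. It is a simplification of Client.compareUri (the real method does more,
e.g. host canonicalization), and the helper name sameFileSystem is made up for
this example:

```
import java.net.URI

// Simplified sketch: schemes must match, and if an authority (userInfo@host)
// is present it must match case-insensitively; otherwise the URIs are treated
// as belonging to different filesystems.
def sameFileSystem(src: URI, dst: URI): Boolean = {
  if (src.getScheme == null || src.getScheme != dst.getScheme) {
    false
  } else {
    val srcAuth = src.getAuthority
    val dstAuth = dst.getAuthority
    if (srcAuth != null && !srcAuth.equalsIgnoreCase(dstAuth)) {
      false
    } else {
      src.getHost == dst.getHost && src.getPort == dst.getPort
    }
  }
}

sameFileSystem(new URI("wasb://bucket1@user1xyz.net"), new URI("wasb://bucket1@user2xyz.net")) // false
sameFileSystem(new URI("wasb://bucket1@user"), new URI("wasb://bucket1@user/"))                // true
```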

Please review http://spark.apache.org/contributing.html before opening a pull 
request.

Author: Mingjie Tang 

Closes #19885 from merlintang/EAR-7377.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/a6647ffb
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/a6647ffb
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/a6647ffb

Branch: refs/heads/master
Commit: a6647ffbf7a312a3e119a9beef90880cc915aa60
Parents: 9b33dfc
Author: Mingjie Tang 
Authored: Thu Jan 11 11:51:03 2018 +0800
Committer: jerryshao 
Committed: Thu Jan 11 11:51:03 2018 +0800

--
 .../org/apache/spark/deploy/yarn/Client.scala   | 24 +++---
 .../apache/spark/deploy/yarn/ClientSuite.scala  | 33 
 2 files changed, 53 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/a6647ffb/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
--
diff --git 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
index 15328d0..8cd3cd9 100644
--- 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
+++ 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
@@ -1421,15 +1421,20 @@ private object Client extends Logging {
   }
 
   /**
-   * Return whether the two file systems are the same.
+   * Return whether two URI represent file system are the same
*/
-  private def compareFs(srcFs: FileSystem, destFs: FileSystem): Boolean = {
-val srcUri = srcFs.getUri()
-val dstUri = destFs.getUri()
+  private[spark] def compareUri(srcUri: URI, dstUri: URI): Boolean = {
+
 if (srcUri.getScheme() == null || srcUri.getScheme() != 
dstUri.getScheme()) {
   return false
 }
 
+val srcAuthority = srcUri.getAuthority()
+val dstAuthority = dstUri.getAuthority()
+if (srcAuthority != null && !srcAuthority.equalsIgnoreCase(dstAuthority)) {
+  return false
+}
+
 var srcHost = srcUri.getHost()
 var dstHost = dstUri.getHost()
 
@@ -1447,6 +1452,17 @@ private object Client extends Logging {
 }
 
 Objects.equal(srcHost, dstHost) && srcUri.getPort() == dstUri.getPort()
+
+  }
+
+  /**
+   * Return whether the two file systems are the same.
+   */
+  protected def compareFs(srcFs: FileSystem, destFs: FileSystem): Boolean = {
+val srcUri = srcFs.getUri()
+val dstUri = destFs.getUri()
+
+compareUri(srcUri, dstUri)
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/spark/blob/a6647ffb/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
--
diff --git 
a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
 
b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
index 9d5f5eb..7fa5971 100644
--- 
a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
+++ 
b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
@@ -357,6 +357,39 @@ class ClientSuite extends SparkFunSuite with Matchers {
 sparkConf.get(SECONDARY_JARS) should be (Some(Seq(new 
File(jar2.toURI).getName)))
   }
 
+  private val matching = Seq(
+("files URI match test1", "file:///file1", "file:///file2"),
+("files URI match test2", "file:///c:file1", "file://c:file2"),
+("files URI match test3", "file://host/file1", "file://host/file2"),
+("wasb URI match test", "wasb://bucket1@user", "wasb://bucket1@user/"),
+("hdfs URI match test", "hdfs:/path1", "hdfs:/path1")
+  )
+
+  matching.foreach { t =>
+  test(t._1) {
+assert(Client.compareUri(new URI(t._2), new URI(t._3)),
+  s"No match between ${t._2} and ${t._3}")
+  }
+  }
+
+  private val unmatching = Seq(
+("files URI unmatch test1", "file:///file1", "file://host/file2"),
+("files URI unmatch test2", "file://host/file1", "file:///file2"),
+("files URI unmatch test3", "file://host/file1

spark git commit: [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url

2018-01-10 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/branch-2.3 551ccfba5 -> 317b0aaed


[SPARK-22587] Spark job fails if fs.defaultFS and application jar are different 
url

## What changes were proposed in this pull request?

The comparison of two filesystems does not consider the authority of the URI.
This matters for the WASB file storage system, where userInfo is honored to
differentiate filesystems. For example, wasbs://user1xyz.net and
wasbs://user2xyz.net should be considered two different filesystems.
Therefore, the authority has to be included in the comparison, and two
filesystems with different authorities cannot be the same FS.

Please review http://spark.apache.org/contributing.html before opening a pull 
request.

Author: Mingjie Tang 

Closes #19885 from merlintang/EAR-7377.

(cherry picked from commit a6647ffbf7a312a3e119a9beef90880cc915aa60)
Signed-off-by: jerryshao 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/317b0aae
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/317b0aae
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/317b0aae

Branch: refs/heads/branch-2.3
Commit: 317b0aaed83e4bbf66f63ddc0d618da9f1f85085
Parents: 551ccfb
Author: Mingjie Tang 
Authored: Thu Jan 11 11:51:03 2018 +0800
Committer: jerryshao 
Committed: Thu Jan 11 11:51:34 2018 +0800

--
 .../org/apache/spark/deploy/yarn/Client.scala   | 24 +++---
 .../apache/spark/deploy/yarn/ClientSuite.scala  | 33 
 2 files changed, 53 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/317b0aae/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
--
diff --git 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
index 15328d0..8cd3cd9 100644
--- 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
+++ 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
@@ -1421,15 +1421,20 @@ private object Client extends Logging {
   }
 
   /**
-   * Return whether the two file systems are the same.
+   * Return whether two URI represent file system are the same
*/
-  private def compareFs(srcFs: FileSystem, destFs: FileSystem): Boolean = {
-val srcUri = srcFs.getUri()
-val dstUri = destFs.getUri()
+  private[spark] def compareUri(srcUri: URI, dstUri: URI): Boolean = {
+
 if (srcUri.getScheme() == null || srcUri.getScheme() != 
dstUri.getScheme()) {
   return false
 }
 
+val srcAuthority = srcUri.getAuthority()
+val dstAuthority = dstUri.getAuthority()
+if (srcAuthority != null && !srcAuthority.equalsIgnoreCase(dstAuthority)) {
+  return false
+}
+
 var srcHost = srcUri.getHost()
 var dstHost = dstUri.getHost()
 
@@ -1447,6 +1452,17 @@ private object Client extends Logging {
 }
 
 Objects.equal(srcHost, dstHost) && srcUri.getPort() == dstUri.getPort()
+
+  }
+
+  /**
+   * Return whether the two file systems are the same.
+   */
+  protected def compareFs(srcFs: FileSystem, destFs: FileSystem): Boolean = {
+val srcUri = srcFs.getUri()
+val dstUri = destFs.getUri()
+
+compareUri(srcUri, dstUri)
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/spark/blob/317b0aae/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
--
diff --git 
a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
 
b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
index 9d5f5eb..7fa5971 100644
--- 
a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
+++ 
b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala
@@ -357,6 +357,39 @@ class ClientSuite extends SparkFunSuite with Matchers {
 sparkConf.get(SECONDARY_JARS) should be (Some(Seq(new 
File(jar2.toURI).getName)))
   }
 
+  private val matching = Seq(
+("files URI match test1", "file:///file1", "file:///file2"),
+("files URI match test2", "file:///c:file1", "file://c:file2"),
+("files URI match test3", "file://host/file1", "file://host/file2"),
+("wasb URI match test", "wasb://bucket1@user", "wasb://bucket1@user/"),
+("hdfs URI match test", "hdfs:/path1", "hdfs:/path1")
+  )
+
+  matching.foreach { t =>
+  test(t._1) {
+assert(Client.compareUri(new URI(t._2), new URI(t._3)),
+  s"No match between ${t._2} and ${t._3}")
+  }
+  }
+
+  private val unmatching = Seq(
+("files URI unmatch test1", "file:///file1", "file://host/file2"),
+("files URI un

spark git commit: [SPARK-22976][CORE] Cluster mode driver dir removed while running

2018-01-21 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 602c6d82d -> 11daeb833


[SPARK-22976][CORE] Cluster mode driver dir removed while running

## What changes were proposed in this pull request?

The cleanup logic on the worker previously determined the liveness of a
particular application based on whether or not it had running executors.
This would fail when a directory was made for a driver running in cluster
mode and that driver had no running executors on the same machine. To
preserve driver directories we consider both executors and running drivers
when checking directory liveness.
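
As a hedged sketch of the idea only (the names below are illustrative, not the
actual Worker.scala fields):

```
// Illustrative sketch: the real cleanup code in Worker.scala consults its own
// internal state; appIdsFromExecutors and appIdsFromDrivers are made-up names.
def isAppDirStillLive(
    dirAppId: String,
    appIdsFromExecutors: Set[String],
    appIdsFromDrivers: Set[String]): Boolean = {
  // Before the fix only executors were consulted, so a cluster-mode driver
  // with no local executors looked dead and its directory was cleaned up.
  appIdsFromExecutors.contains(dirAppId) || appIdsFromDrivers.contains(dirAppId)
}
```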

## How was this patch tested?

Manually started up a two-node cluster with a single core on each node. Turned
on worker directory cleanup and set the interval to 1 second and liveness to
one second. Without the patch the driver directory is removed immediately after
the app is launched. With the patch it is not.

### Without Patch
```
INFO  2018-01-05 23:48:24,693 Logging.scala:54 - Asked to launch driver 
driver-20180105234824-
INFO  2018-01-05 23:48:25,293 Logging.scala:54 - Changing view acls to: 
cassandra
INFO  2018-01-05 23:48:25,293 Logging.scala:54 - Changing modify acls to: 
cassandra
INFO  2018-01-05 23:48:25,294 Logging.scala:54 - Changing view acls groups to:
INFO  2018-01-05 23:48:25,294 Logging.scala:54 - Changing modify acls groups to:
INFO  2018-01-05 23:48:25,294 Logging.scala:54 - SecurityManager: 
authentication disabled; ui acls disabled; users  with view permissions: 
Set(cassandra); groups with view permissions: Set(); users  with modify 
permissions: Set(cassandra); groups with modify permissions: Set()
INFO  2018-01-05 23:48:25,330 Logging.scala:54 - Copying user jar 
file:/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180105234824-/writeRead-0.1.jar
INFO  2018-01-05 23:48:25,332 Logging.scala:54 - Copying 
/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180105234824-/writeRead-0.1.jar
INFO  2018-01-05 23:48:25,361 Logging.scala:54 - Launch Command: 
"/usr/lib/jvm/jdk1.8.0_40//bin/java" 

INFO  2018-01-05 23:48:56,577 Logging.scala:54 - Removing directory: 
/var/lib/spark/worker/driver-20180105234824-  ### << Cleaned up

--
One minute passes while app runs (app has 1 minute sleep built in)
--

WARN  2018-01-05 23:49:58,080 ShuffleSecretManager.java:73 - Attempted to 
unregister application app-20180105234831- when it is not registered
INFO  2018-01-05 23:49:58,081 ExternalShuffleBlockResolver.java:163 - 
Application app-20180105234831- removed, cleanupLocalDirs = false
INFO  2018-01-05 23:49:58,081 ExternalShuffleBlockResolver.java:163 - 
Application app-20180105234831- removed, cleanupLocalDirs = false
INFO  2018-01-05 23:49:58,082 ExternalShuffleBlockResolver.java:163 - 
Application app-20180105234831- removed, cleanupLocalDirs = true
INFO  2018-01-05 23:50:00,999 Logging.scala:54 - Driver 
driver-20180105234824- exited successfully
```

### With Patch
```
INFO  2018-01-08 23:19:54,603 Logging.scala:54 - Asked to launch driver 
driver-20180108231954-0002
INFO  2018-01-08 23:19:54,975 Logging.scala:54 - Changing view acls to: 
automaton
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - Changing modify acls to: 
automaton
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - Changing view acls groups to:
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - Changing modify acls groups to:
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - SecurityManager: 
authentication disabled; ui acls disabled; users  with view permissions: 
Set(automaton); groups with view permissions: Set(); users  with modify 
permissions: Set(automaton); groups with modify permissions: Set()
INFO  2018-01-08 23:19:55,029 Logging.scala:54 - Copying user jar 
file:/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180108231954-0002/writeRead-0.1.jar
INFO  2018-01-08 23:19:55,031 Logging.scala:54 - Copying 
/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180108231954-0002/writeRead-0.1.jar
INFO  2018-01-08 23:19:55,038 Logging.scala:54 - Launch Command: ..
INFO  2018-01-08 23:21:28,674 ShuffleSecretManager.java:69 - Unregistered 
shuffle secret for application app-20180108232000-
INFO  2018-01-08 23:21:28,675 ExternalShuffleBlockResolver.java:163 - 
Application app-20180108232000- removed, cleanupLocalDirs = false
INFO  2018-01-08 23:21:28,675 ExternalShuffleBlockResolver.java:163 - 
Application app-20180108232000- removed, cleanupLocalDirs = false
INFO  2018-01-08 23:21:28,681 ExternalShuffleBlockResolver.java:163 - 
Application app-20180108232000- removed, cleanupLocalDirs = true
INFO  2018-01-08 23:21:31,703 Logging.scala:54 - Driver 
driver-20180108231954-0002 exited successfully
*
INFO  2018-01-08 23:21:32,346 Logging.scala:54 - Removing directory: 
/var/lib/spark/worker/driver-20180108231954-0002 ### < Happening AFTER the Run 
completes rather than during it
*
```

A

spark git commit: [SPARK-22976][CORE] Cluster mode driver dir removed while running

2018-01-21 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/branch-2.3 7520491bf -> 5781fa79e


[SPARK-22976][CORE] Cluster mode driver dir removed while running

## What changes were proposed in this pull request?

The cleanup logic on the worker previously determined the liveness of a
particular application based on whether or not it had running executors.
This would fail when a directory was made for a driver running in cluster
mode and that driver had no running executors on the same machine. To
preserve driver directories we consider both executors and running drivers
when checking directory liveness.

## How was this patch tested?

Manually started up a two-node cluster with a single core on each node. Turned
on worker directory cleanup and set the interval to 1 second and liveness to
one second. Without the patch the driver directory is removed immediately after
the app is launched. With the patch it is not.

### Without Patch
```
INFO  2018-01-05 23:48:24,693 Logging.scala:54 - Asked to launch driver 
driver-20180105234824-
INFO  2018-01-05 23:48:25,293 Logging.scala:54 - Changing view acls to: 
cassandra
INFO  2018-01-05 23:48:25,293 Logging.scala:54 - Changing modify acls to: 
cassandra
INFO  2018-01-05 23:48:25,294 Logging.scala:54 - Changing view acls groups to:
INFO  2018-01-05 23:48:25,294 Logging.scala:54 - Changing modify acls groups to:
INFO  2018-01-05 23:48:25,294 Logging.scala:54 - SecurityManager: 
authentication disabled; ui acls disabled; users  with view permissions: 
Set(cassandra); groups with view permissions: Set(); users  with modify 
permissions: Set(cassandra); groups with modify permissions: Set()
INFO  2018-01-05 23:48:25,330 Logging.scala:54 - Copying user jar 
file:/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180105234824-/writeRead-0.1.jar
INFO  2018-01-05 23:48:25,332 Logging.scala:54 - Copying 
/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180105234824-/writeRead-0.1.jar
INFO  2018-01-05 23:48:25,361 Logging.scala:54 - Launch Command: 
"/usr/lib/jvm/jdk1.8.0_40//bin/java" 

INFO  2018-01-05 23:48:56,577 Logging.scala:54 - Removing directory: 
/var/lib/spark/worker/driver-20180105234824-  ### << Cleaned up

--
One minute passes while app runs (app has 1 minute sleep built in)
--

WARN  2018-01-05 23:49:58,080 ShuffleSecretManager.java:73 - Attempted to 
unregister application app-20180105234831- when it is not registered
INFO  2018-01-05 23:49:58,081 ExternalShuffleBlockResolver.java:163 - 
Application app-20180105234831- removed, cleanupLocalDirs = false
INFO  2018-01-05 23:49:58,081 ExternalShuffleBlockResolver.java:163 - 
Application app-20180105234831- removed, cleanupLocalDirs = false
INFO  2018-01-05 23:49:58,082 ExternalShuffleBlockResolver.java:163 - 
Application app-20180105234831- removed, cleanupLocalDirs = true
INFO  2018-01-05 23:50:00,999 Logging.scala:54 - Driver 
driver-20180105234824- exited successfully
```

### With Patch
```
INFO  2018-01-08 23:19:54,603 Logging.scala:54 - Asked to launch driver 
driver-20180108231954-0002
INFO  2018-01-08 23:19:54,975 Logging.scala:54 - Changing view acls to: 
automaton
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - Changing modify acls to: 
automaton
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - Changing view acls groups to:
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - Changing modify acls groups to:
INFO  2018-01-08 23:19:54,976 Logging.scala:54 - SecurityManager: 
authentication disabled; ui acls disabled; users  with view permissions: 
Set(automaton); groups with view permissions: Set(); users  with modify 
permissions: Set(automaton); groups with modify permissions: Set()
INFO  2018-01-08 23:19:55,029 Logging.scala:54 - Copying user jar 
file:/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180108231954-0002/writeRead-0.1.jar
INFO  2018-01-08 23:19:55,031 Logging.scala:54 - Copying 
/home/automaton/writeRead-0.1.jar to 
/var/lib/spark/worker/driver-20180108231954-0002/writeRead-0.1.jar
INFO  2018-01-08 23:19:55,038 Logging.scala:54 - Launch Command: ..
INFO  2018-01-08 23:21:28,674 ShuffleSecretManager.java:69 - Unregistered 
shuffle secret for application app-20180108232000-
INFO  2018-01-08 23:21:28,675 ExternalShuffleBlockResolver.java:163 - 
Application app-20180108232000- removed, cleanupLocalDirs = false
INFO  2018-01-08 23:21:28,675 ExternalShuffleBlockResolver.java:163 - 
Application app-20180108232000- removed, cleanupLocalDirs = false
INFO  2018-01-08 23:21:28,681 ExternalShuffleBlockResolver.java:163 - 
Application app-20180108232000- removed, cleanupLocalDirs = true
INFO  2018-01-08 23:21:31,703 Logging.scala:54 - Driver 
driver-20180108231954-0002 exited successfully
*
INFO  2018-01-08 23:21:32,346 Logging.scala:54 - Removing directory: 
/var/lib/spark/worker/driver-20180108231954-0002 ### < Happening AFTER the Run 
completes rather than during it
*
```

spark git commit: [MINOR][DOC] Fix the path to the examples jar

2018-01-22 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master ec2289761 -> 60175e959


[MINOR][DOC] Fix the path to the examples jar

## What changes were proposed in this pull request?

The example jar file is now in ./examples/jars directory of Spark distribution.

Author: Arseniy Tashoyan 

Closes #20349 from tashoyan/patch-1.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/60175e95
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/60175e95
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/60175e95

Branch: refs/heads/master
Commit: 60175e959f275d2961798fbc5a9150dac9de51ff
Parents: ec22897
Author: Arseniy Tashoyan 
Authored: Mon Jan 22 20:17:05 2018 +0800
Committer: jerryshao 
Committed: Mon Jan 22 20:17:05 2018 +0800

--
 docs/running-on-yarn.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/60175e95/docs/running-on-yarn.md
--
diff --git a/docs/running-on-yarn.md b/docs/running-on-yarn.md
index e4f5a0c..c010af3 100644
--- a/docs/running-on-yarn.md
+++ b/docs/running-on-yarn.md
@@ -35,7 +35,7 @@ For example:
 --executor-memory 2g \
 --executor-cores 1 \
 --queue thequeue \
-lib/spark-examples*.jar \
+examples/jars/spark-examples*.jar \
 10
 
 The above starts a YARN client program which starts the default Application 
Master. Then SparkPi will be run as a child thread of Application Master. The 
client will periodically poll the Application Master for status updates and 
display them in the console. The client will exit once your application has 
finished running.  Refer to the "Debugging your Application" section below for 
how to see driver and executor logs.


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



spark git commit: [MINOR][DOC] Fix the path to the examples jar

2018-01-22 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/branch-2.3 57c320a0d -> cf078a205


[MINOR][DOC] Fix the path to the examples jar

## What changes were proposed in this pull request?

The example jar file is now in ./examples/jars directory of Spark distribution.

Author: Arseniy Tashoyan 

Closes #20349 from tashoyan/patch-1.

(cherry picked from commit 60175e959f275d2961798fbc5a9150dac9de51ff)
Signed-off-by: jerryshao 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/cf078a20
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/cf078a20
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/cf078a20

Branch: refs/heads/branch-2.3
Commit: cf078a205a14d8709e2c4a9d9f23f6efa20b4fe7
Parents: 57c320a
Author: Arseniy Tashoyan 
Authored: Mon Jan 22 20:17:05 2018 +0800
Committer: jerryshao 
Committed: Mon Jan 22 20:20:45 2018 +0800

--
 docs/running-on-yarn.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/cf078a20/docs/running-on-yarn.md
--
diff --git a/docs/running-on-yarn.md b/docs/running-on-yarn.md
index e4f5a0c..c010af3 100644
--- a/docs/running-on-yarn.md
+++ b/docs/running-on-yarn.md
@@ -35,7 +35,7 @@ For example:
 --executor-memory 2g \
 --executor-cores 1 \
 --queue thequeue \
-lib/spark-examples*.jar \
+examples/jars/spark-examples*.jar \
 10
 
 The above starts a YARN client program which starts the default Application 
Master. Then SparkPi will be run as a child thread of Application Master. The 
client will periodically poll the Application Master for status updates and 
display them in the console. The client will exit once your application has 
finished running.  Refer to the "Debugging your Application" section below for 
how to see driver and executor logs.





spark git commit: [SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore

2018-01-25 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 70a68b328 -> d1721816d


[SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore

## What changes were proposed in this pull request?

When using the Kubernetes cluster manager and spawning a Streaming workload, it
is important to reset many spark.kubernetes.* properties that are generated by
spark-submit but which would get rewritten when restoring a Checkpoint. This is
because the spark-submit codepath creates Kubernetes resources, such as a
ConfigMap and a Secret, which have autogenerated names, and the previous ones
will not resolve anymore.

In short, this change enables checkpoint restoration for streaming workloads, 
and thus enables Spark Streaming workloads in Kubernetes, which were not 
possible to restore from a checkpoint before if the workload went down.
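
A hedged sketch of the reload mechanism (the two property names come from the
diff below; restoreConf and the Map-based signature are simplifications of
what Checkpoint actually does with SparkConf):

```
// Simplified sketch: values for these keys are taken from the freshly loaded
// default configuration (the new spark-submit) instead of the checkpointed
// one, so autogenerated Kubernetes resource names stay resolvable.
val propertiesToReload = Seq(
  "spark.kubernetes.driver.pod.name",
  "spark.kubernetes.executor.podNamePrefix")

def restoreConf(
    checkpointed: Map[String, String],
    freshDefaults: Map[String, String]): Map[String, String] = {
  propertiesToReload.foldLeft(checkpointed) { (conf, key) =>
    freshDefaults.get(key).map(value => conf.updated(key, value)).getOrElse(conf)
  }
}
```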

## How was this patch tested?

This patch was tested with the twitter-streaming example in AWS, using 
checkpoints in s3 with the s3a:// protocol, as supported by Hadoop.

This is similar to the YARN-related code for resetting a Spark Streaming
workload, but for the Kubernetes scheduler. I'm adding the initcontainer
properties because, even though the discussion is not completely settled on the
mailing list, my understanding is that they are going forward for the moment.

For a previous discussion, see the non-rebased work at: 
https://github.com/apache-spark-on-k8s/spark/pull/516

Author: Santiago Saavedra 

Closes #20383 from ssaavedra/fix-k8s-checkpointing.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/d1721816
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/d1721816
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/d1721816

Branch: refs/heads/master
Commit: d1721816d26bedee3c72eeb75db49da500568376
Parents: 70a68b3
Author: Santiago Saavedra 
Authored: Fri Jan 26 15:24:06 2018 +0800
Committer: jerryshao 
Committed: Fri Jan 26 15:24:06 2018 +0800

--
 .../org/apache/spark/streaming/Checkpoint.scala | 16 
 1 file changed, 16 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/d1721816/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
--
diff --git 
a/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala 
b/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
index aed67a5..ed2a896 100644
--- a/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
+++ b/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
@@ -53,6 +53,21 @@ class Checkpoint(ssc: StreamingContext, val checkpointTime: 
Time)
   "spark.driver.host",
   "spark.driver.bindAddress",
   "spark.driver.port",
+  "spark.kubernetes.driver.pod.name",
+  "spark.kubernetes.executor.podNamePrefix",
+  "spark.kubernetes.initcontainer.executor.configmapname",
+  "spark.kubernetes.initcontainer.executor.configmapkey",
+  "spark.kubernetes.initcontainer.downloadJarsResourceIdentifier",
+  "spark.kubernetes.initcontainer.downloadJarsSecretLocation",
+  "spark.kubernetes.initcontainer.downloadFilesResourceIdentifier",
+  "spark.kubernetes.initcontainer.downloadFilesSecretLocation",
+  "spark.kubernetes.initcontainer.remoteJars",
+  "spark.kubernetes.initcontainer.remoteFiles",
+  "spark.kubernetes.mountdependencies.jarsDownloadDir",
+  "spark.kubernetes.mountdependencies.filesDownloadDir",
+  "spark.kubernetes.initcontainer.executor.stagingServerSecret.name",
+  "spark.kubernetes.initcontainer.executor.stagingServerSecret.mountDir",
+  "spark.kubernetes.executor.limit.cores",
   "spark.master",
   "spark.yarn.jars",
   "spark.yarn.keytab",
@@ -66,6 +81,7 @@ class Checkpoint(ssc: StreamingContext, val checkpointTime: 
Time)
 val newSparkConf = new SparkConf(loadDefaults = 
false).setAll(sparkConfPairs)
   .remove("spark.driver.host")
   .remove("spark.driver.bindAddress")
+  .remove("spark.kubernetes.driver.pod.name")
   .remove("spark.driver.port")
 val newReloadConf = new SparkConf(loadDefaults = true)
 propertiesToReload.foreach { prop =>





spark git commit: [SPARK-23088][CORE] History server not showing incomplete/running applications

2018-01-29 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master f235df66a -> 31bd1dab1


[SPARK-23088][CORE] History server not showing incomplete/running applications

## What changes were proposed in this pull request?

The history server does not show incomplete/running applications when the
spark.history.ui.maxApplications property is set to a value smaller than the
total number of applications.

## How was this patch tested?

Verified manually against master and 2.2.2 branch.

Author: Paul Mackles 

Closes #20335 from pmackles/SPARK-23088.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/31bd1dab
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/31bd1dab
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/31bd1dab

Branch: refs/heads/master
Commit: 31bd1dab1301d27a16c9d5d1b0b3301d618b0516
Parents: f235df6
Author: Paul Mackles 
Authored: Tue Jan 30 11:15:27 2018 +0800
Committer: jerryshao 
Committed: Tue Jan 30 11:15:27 2018 +0800

--
 .../main/resources/org/apache/spark/ui/static/historypage.js  | 7 ++-
 1 file changed, 6 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/31bd1dab/core/src/main/resources/org/apache/spark/ui/static/historypage.js
--
diff --git a/core/src/main/resources/org/apache/spark/ui/static/historypage.js 
b/core/src/main/resources/org/apache/spark/ui/static/historypage.js
index 2cde66b..f0b2a5a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/historypage.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/historypage.js
@@ -108,7 +108,12 @@ $(document).ready(function() {
 requestedIncomplete = getParameterByName("showIncomplete", searchString);
 requestedIncomplete = (requestedIncomplete == "true" ? true : false);
 
-$.getJSON("api/v1/applications?limit=" + appLimit, 
function(response,status,jqXHR) {
+appParams = {
+  limit: appLimit,
+  status: (requestedIncomplete ? "running" : "completed")
+};
+
+$.getJSON("api/v1/applications", appParams, 
function(response,status,jqXHR) {
   var array = [];
   var hasMultipleAttempts = false;
   for (i in response) {





spark git commit: [SPARK-23279][SS] Avoid triggering distributed job for Console sink

2018-01-30 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master ca04c3ff2 -> 8c6a9c90a


[SPARK-23279][SS] Avoid triggering distributed job for Console sink

## What changes were proposed in this pull request?

The Console sink redistributes already-collected local data and triggers a
distributed job in each batch. This is not necessary, so change it to a local
job.
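
A standalone sketch of the idea (not the actual ConsoleWriter code; the session
setup and schema below are illustrative):

```
// Sketch: the rows shown per micro-batch are already on the driver, so build
// a local DataFrame from them instead of re-parallelizing them into an RDD,
// which would redistribute data that is already local.
import scala.collection.JavaConverters._
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

val spark = SparkSession.builder().master("local[1]").appName("console-sketch").getOrCreate()
val schema = StructType(Seq(StructField("value", IntegerType)))
val rows: Array[Row] = Array(Row(1), Row(2), Row(3))

// Before: spark.createDataFrame(spark.sparkContext.parallelize(rows), schema)
// After: a DataFrame backed by a local relation.
spark.createDataFrame(rows.toList.asJava, schema).show()
spark.stop()
```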

## How was this patch tested?

Existing UT and manual verification.

Author: jerryshao 

Closes #20447 from jerryshao/console-minor.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8c6a9c90
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8c6a9c90
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8c6a9c90

Branch: refs/heads/master
Commit: 8c6a9c90a36a938372f28ee8be72178192fbc313
Parents: ca04c3f
Author: jerryshao 
Authored: Wed Jan 31 13:59:21 2018 +0800
Committer: jerryshao 
Committed: Wed Jan 31 13:59:21 2018 +0800

--
 .../spark/sql/execution/streaming/sources/ConsoleWriter.scala| 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/8c6a9c90/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
--
diff --git 
a/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
 
b/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
index d46f4d7..c57bdc4 100644
--- 
a/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
+++ 
b/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
@@ -17,6 +17,8 @@
 
 package org.apache.spark.sql.execution.streaming.sources
 
+import scala.collection.JavaConverters._
+
 import org.apache.spark.internal.Logging
 import org.apache.spark.sql.{Row, SparkSession}
 import org.apache.spark.sql.sources.v2.DataSourceOptions
@@ -61,7 +63,7 @@ class ConsoleWriter(schema: StructType, options: 
DataSourceOptions)
 println("---")
 // scalastyle:off println
 spark
-  .createDataFrame(spark.sparkContext.parallelize(rows), schema)
+  .createDataFrame(rows.toList.asJava, schema)
   .show(numRowsToShow, isTruncated)
   }
 





spark git commit: [SPARK-23279][SS] Avoid triggering distributed job for Console sink

2018-01-30 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/branch-2.3 b8778321b -> ab5a51055


[SPARK-23279][SS] Avoid triggering distributed job for Console sink

## What changes were proposed in this pull request?

The Console sink redistributes already-collected local data and triggers a
distributed job in each batch. This is not necessary, so change it to a local
job.

## How was this patch tested?

Existing UT and manual verification.

Author: jerryshao 

Closes #20447 from jerryshao/console-minor.

(cherry picked from commit 8c6a9c90a36a938372f28ee8be72178192fbc313)
Signed-off-by: jerryshao 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/ab5a5105
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/ab5a5105
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/ab5a5105

Branch: refs/heads/branch-2.3
Commit: ab5a5105502c545bed951538f0ce9409cfbde154
Parents: b877832
Author: jerryshao 
Authored: Wed Jan 31 13:59:21 2018 +0800
Committer: jerryshao 
Committed: Wed Jan 31 13:59:36 2018 +0800

--
 .../spark/sql/execution/streaming/sources/ConsoleWriter.scala| 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/ab5a5105/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
--
diff --git 
a/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
 
b/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
index d46f4d7..c57bdc4 100644
--- 
a/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
+++ 
b/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
@@ -17,6 +17,8 @@
 
 package org.apache.spark.sql.execution.streaming.sources
 
+import scala.collection.JavaConverters._
+
 import org.apache.spark.internal.Logging
 import org.apache.spark.sql.{Row, SparkSession}
 import org.apache.spark.sql.sources.v2.DataSourceOptions
@@ -61,7 +63,7 @@ class ConsoleWriter(schema: StructType, options: 
DataSourceOptions)
 println("---")
 // scalastyle:off println
 spark
-  .createDataFrame(spark.sparkContext.parallelize(rows), schema)
+  .createDataFrame(rows.toList.asJava, schema)
   .show(numRowsToShow, isTruncated)
   }
 





spark git commit: Revert "[SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore"

2018-01-31 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master b6b50efc8 -> 4b7cd479a


Revert "[SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore"

This reverts commit d1721816d26bedee3c72eeb75db49da500568376.

The patch is not fully tested and out-of-date. So revert it.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4b7cd479
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/4b7cd479
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/4b7cd479

Branch: refs/heads/master
Commit: 4b7cd479a28b274f5a0802c9b017b3eb15002c21
Parents: b6b50ef
Author: jerryshao 
Authored: Thu Feb 1 13:58:13 2018 +0800
Committer: jerryshao 
Committed: Thu Feb 1 14:00:08 2018 +0800

--
 .../org/apache/spark/streaming/Checkpoint.scala | 16 
 1 file changed, 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/4b7cd479/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
--
diff --git 
a/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala 
b/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
index ed2a896..aed67a5 100644
--- a/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
+++ b/streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala
@@ -53,21 +53,6 @@ class Checkpoint(ssc: StreamingContext, val checkpointTime: 
Time)
   "spark.driver.host",
   "spark.driver.bindAddress",
   "spark.driver.port",
-  "spark.kubernetes.driver.pod.name",
-  "spark.kubernetes.executor.podNamePrefix",
-  "spark.kubernetes.initcontainer.executor.configmapname",
-  "spark.kubernetes.initcontainer.executor.configmapkey",
-  "spark.kubernetes.initcontainer.downloadJarsResourceIdentifier",
-  "spark.kubernetes.initcontainer.downloadJarsSecretLocation",
-  "spark.kubernetes.initcontainer.downloadFilesResourceIdentifier",
-  "spark.kubernetes.initcontainer.downloadFilesSecretLocation",
-  "spark.kubernetes.initcontainer.remoteJars",
-  "spark.kubernetes.initcontainer.remoteFiles",
-  "spark.kubernetes.mountdependencies.jarsDownloadDir",
-  "spark.kubernetes.mountdependencies.filesDownloadDir",
-  "spark.kubernetes.initcontainer.executor.stagingServerSecret.name",
-  "spark.kubernetes.initcontainer.executor.stagingServerSecret.mountDir",
-  "spark.kubernetes.executor.limit.cores",
   "spark.master",
   "spark.yarn.jars",
   "spark.yarn.keytab",
@@ -81,7 +66,6 @@ class Checkpoint(ssc: StreamingContext, val checkpointTime: 
Time)
 val newSparkConf = new SparkConf(loadDefaults = 
false).setAll(sparkConfPairs)
   .remove("spark.driver.host")
   .remove("spark.driver.bindAddress")
-  .remove("spark.kubernetes.driver.pod.name")
   .remove("spark.driver.port")
 val newReloadConf = new SparkConf(loadDefaults = true)
 propertiesToReload.foreach { prop =>





spark git commit: [MINOR][YARN] Add disable yarn.nodemanager.vmem-check-enabled option to memLimitExceededLogMessage

2018-03-07 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 4c587eb48 -> 04e71c316


[MINOR][YARN] Add disable yarn.nodemanager.vmem-check-enabled option to 
memLimitExceededLogMessage

My Spark application sometimes throws `Container killed by YARN for
exceeding memory limits`.
Even after I increased `spark.yarn.executor.memoryOverhead` to 10G, this error
still happens. The latest config (screenshot):
https://user-images.githubusercontent.com/5399861/36975716-f5c548d2-20b5-11e8-95e5-b228d50917b9.png

And error message:
```
ExecutorLostFailure (executor 121 exited caused by one of the running tasks) 
Reason: Container killed by YARN for exceeding memory limits. 30.7 GB of 30 GB 
physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
```

This is because [Linux glibc >= 2.10 (RHEL 6) malloc may show excessive
virtual memory
usage](https://www.ibm.com/developerworks/community/blogs/kevgrig/entry/linux_glibc_2_10_rhel_6_malloc_may_show_excessive_virtual_memory_usage?lang=en).
So disabling `yarn.nodemanager.vmem-check-enabled` looks like a good option,
as [MapR
mentioned](https://mapr.com/blog/best-practices-yarn-resource-management).

This PR adds the suggestion to disable `yarn.nodemanager.vmem-check-enabled` to
memLimitExceededLogMessage.

More details:
https://issues.apache.org/jira/browse/YARN-4714
https://stackoverflow.com/a/31450291
https://stackoverflow.com/a/42091255

After this PR:
https://user-images.githubusercontent.com/5399861/36975949-c8e7bbbe-20b6-11e8-9513-9f903b868d8d.png

N/A

Author: Yuming Wang 
Author: Yuming Wang 

Closes #20735 from wangyum/YARN-4714.

Change-Id: Ie10836e2c07b6384d228c3f9e89f802823bd9f16


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/04e71c31
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/04e71c31
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/04e71c31

Branch: refs/heads/master
Commit: 04e71c31603af3a13bc13300df799f003fe185f7
Parents: 4c587eb
Author: Yuming Wang 
Authored: Wed Mar 7 17:01:29 2018 +0800
Committer: jerryshao 
Committed: Wed Mar 7 17:01:29 2018 +0800

--
 .../main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/04e71c31/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala
--
diff --git 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala
 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala
index 506adb3..a537243 100644
--- 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala
+++ 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala
@@ -736,7 +736,8 @@ private object YarnAllocator {
   def memLimitExceededLogMessage(diagnostics: String, pattern: Pattern): 
String = {
 val matcher = pattern.matcher(diagnostics)
 val diag = if (matcher.find()) " " + matcher.group() + "." else ""
-("Container killed by YARN for exceeding memory limits." + diag
-  + " Consider boosting spark.yarn.executor.memoryOverhead.")
+s"Container killed by YARN for exceeding memory limits. $diag " +
+  "Consider boosting spark.yarn.executor.memoryOverhead or " +
+  "disabling yarn.nodemanager.vmem-check-enabled because of YARN-4714."
   }
 }





spark git commit: [SPARK-23644][CORE][UI] Use absolute path for REST call in SHS

2018-03-16 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master c2632edeb -> ca83526de


[SPARK-23644][CORE][UI] Use absolute path for REST call in SHS

## What changes were proposed in this pull request?

The SHS uses a relative path for the REST API call that gets the list of
applications. When the SHS is consumed through a proxy, this can be an issue
if the path doesn't end with a "/".
Therefore, we should use an absolute path for the REST call as it is done for 
all the other resources.

## How was this patch tested?

manual tests
Before the change:
![screen shot 2018-03-10 at 4 22 02 
pm](https://user-images.githubusercontent.com/8821783/37244190-8ccf9d40-2485-11e8-8fa9-345bc81472fc.png)

After the change:
![screen shot 2018-03-10 at 4 36 34 pm 
1](https://user-images.githubusercontent.com/8821783/37244201-a1922810-2485-11e8-8856-eeab2bf5e180.png)

Author: Marco Gaido 

Closes #20794 from mgaido91/SPARK-23644.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/ca83526d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/ca83526d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/ca83526d

Branch: refs/heads/master
Commit: ca83526de55f0f8784df58cc8b7c0a7cb0c96e23
Parents: c2632ed
Author: Marco Gaido 
Authored: Fri Mar 16 15:12:26 2018 +0800
Committer: jerryshao 
Committed: Fri Mar 16 15:12:26 2018 +0800

--
 .../src/main/resources/org/apache/spark/ui/static/historypage.js | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/ca83526d/core/src/main/resources/org/apache/spark/ui/static/historypage.js
--
diff --git a/core/src/main/resources/org/apache/spark/ui/static/historypage.js 
b/core/src/main/resources/org/apache/spark/ui/static/historypage.js
index f0b2a5a..abc2ec0 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/historypage.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/historypage.js
@@ -113,7 +113,7 @@ $(document).ready(function() {
   status: (requestedIncomplete ? "running" : "completed")
 };
 
-$.getJSON("api/v1/applications", appParams, 
function(response,status,jqXHR) {
+$.getJSON(uiRoot + "/api/v1/applications", appParams, 
function(response,status,jqXHR) {
   var array = [];
   var hasMultipleAttempts = false;
   for (i in response) {
@@ -151,7 +151,7 @@ $(document).ready(function() {
 "showCompletedColumns": !requestedIncomplete,
   }
 
-  $.get("static/historypage-template.html", function(template) {
+  $.get(uiRoot + "/static/historypage-template.html", function(template) {
 var sibling = historySummary.prev();
 historySummary.detach();
 var apps = 
$(Mustache.render($(template).filter("#history-summary-template").html(),data));





spark git commit: [SPARK-23635][YARN] AM env variable should not overwrite same name env variable set through spark.executorEnv.

2018-03-16 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master ca83526de -> c95200048


[SPARK-23635][YARN] AM env variable should not overwrite same name env variable 
set through spark.executorEnv.

## What changes were proposed in this pull request?

In the current Spark on YARN code, the AM always copies its env variables to
executors and overwrites any existing values, so we cannot set different values
for executors.

To reproduce the issue, a user could start spark-shell like:

```
./bin/spark-shell --master yarn-client --conf 
spark.executorEnv.SPARK_ABC=executor_val --conf  
spark.yarn.appMasterEnv.SPARK_ABC=am_val
```

Then check executor env variables by

```
sc.parallelize(1 to 1).flatMap { i => sys.env.toSeq }.collect.foreach(println)
```

We will always get `am_val` instead of `executor_val`. So the AM should not
overwrite explicitly set executor env variables.
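
As a hedged sketch of the precedence the patch establishes (not the actual
ExecutorRunnable code, which also special-cases CLASSPATH; buildExecutorEnv is
a made-up helper):

```
// AM-inherited SPARK* variables are copied first, then spark.executorEnv.*
// entries are applied, so explicitly set executor values win on conflicts.
def buildExecutorEnv(
    amEnv: Map[String, String],
    executorEnvConf: Map[String, String]): Map[String, String] = {
  val inherited = amEnv.filter { case (k, _) => k.startsWith("SPARK") }
  inherited ++ executorEnvConf // entries from the right-hand map win
}

// buildExecutorEnv(Map("SPARK_ABC" -> "am_val"), Map("SPARK_ABC" -> "executor_val"))
// yields Map("SPARK_ABC" -> "executor_val")
```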

## How was this patch tested?

Added UT and tested in local cluster.

Author: jerryshao 

Closes #20799 from jerryshao/SPARK-23635.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c9520004
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c9520004
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c9520004

Branch: refs/heads/master
Commit: c952000487ee003200221b3c4e25dcb06e359f0a
Parents: ca83526
Author: jerryshao 
Authored: Fri Mar 16 16:22:03 2018 +0800
Committer: jerryshao 
Committed: Fri Mar 16 16:22:03 2018 +0800

--
 .../spark/deploy/yarn/ExecutorRunnable.scala| 22 +++-
 .../spark/deploy/yarn/YarnClusterSuite.scala| 36 
 2 files changed, 50 insertions(+), 8 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/c9520004/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnable.scala
--
diff --git 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnable.scala
 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnable.scala
index 3f4d236..ab08698 100644
--- 
a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnable.scala
+++ 
b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnable.scala
@@ -220,12 +220,6 @@ private[yarn] class ExecutorRunnable(
 val env = new HashMap[String, String]()
 Client.populateClasspath(null, conf, sparkConf, env, 
sparkConf.get(EXECUTOR_CLASS_PATH))
 
-sparkConf.getExecutorEnv.foreach { case (key, value) =>
-  // This assumes each executor environment variable set here is a path
-  // This is kept for backward compatibility and consistency with hadoop
-  YarnSparkHadoopUtil.addPathToEnvironment(env, key, value)
-}
-
 // lookup appropriate http scheme for container log urls
 val yarnHttpPolicy = conf.get(
   YarnConfiguration.YARN_HTTP_POLICY_KEY,
@@ -233,6 +227,20 @@ private[yarn] class ExecutorRunnable(
 )
 val httpScheme = if (yarnHttpPolicy == "HTTPS_ONLY") "https://"; else 
"http://";
 
+System.getenv().asScala.filterKeys(_.startsWith("SPARK"))
+  .foreach { case (k, v) => env(k) = v }
+
+sparkConf.getExecutorEnv.foreach { case (key, value) =>
+  if (key == Environment.CLASSPATH.name()) {
+// If the key of env variable is CLASSPATH, we assume it is a path and 
append it.
+// This is kept for backward compatibility and consistency with hadoop
+YarnSparkHadoopUtil.addPathToEnvironment(env, key, value)
+  } else {
+// For other env variables, simply overwrite the value.
+env(key) = value
+  }
+}
+
 // Add log urls
 container.foreach { c =>
   sys.env.get("SPARK_USER").foreach { user =>
@@ -245,8 +253,6 @@ private[yarn] class ExecutorRunnable(
   }
 }
 
-System.getenv().asScala.filterKeys(_.startsWith("SPARK"))
-  .foreach { case (k, v) => env(k) = v }
 env
   }
 }

http://git-wip-us.apache.org/repos/asf/spark/blob/c9520004/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
--
diff --git 
a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
 
b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
index 33d400a..a129be7 100644
--- 
a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
+++ 
b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
@@ -225,6 +225,14 @@ class YarnClusterSuite extends BaseYarnClusterSuite {
 finalState should be (SparkAppHandle.State.FAILED)
   }
 
+  test("executor env overwrite AM env in client mode") {
+testExecutorEnv(true)

spark git commit: [SPARK-23708][CORE] Correct comment for function addShutDownHook in ShutdownHookManager

2018-03-18 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 61487b308 -> 745c8c090


[SPARK-23708][CORE] Correct comment for function addShutDownHook in 
ShutdownHookManager

## What changes were proposed in this pull request?
Minor modification. The comment below is not right.
```
/**
   * Adds a shutdown hook with the given priority. Hooks with lower priority 
values run
   * first.
   *
   * param hook The code to run during shutdown.
   * return A handle that can be used to unregister the shutdown hook.
   */
  def addShutdownHook(priority: Int)(hook: () => Unit): AnyRef = {
shutdownHooks.add(priority, hook)
  }
```
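
To illustrate the corrected semantics (ShutdownHookManager.addShutdownHook is
private[spark], so this toy priority queue only mimics the documented
ordering; it is not the real implementation):

```
import scala.collection.mutable

// Hooks with *higher* priority values run first, matching the fixed comment.
val hooks = mutable.PriorityQueue.empty[(Int, () => Unit)](
  Ordering.by[(Int, () => Unit), Int](_._1))

def addShutdownHook(priority: Int)(hook: () => Unit): Unit =
  hooks.enqueue((priority, hook))

addShutdownHook(10)(() => println("low priority (10): runs last"))
addShutdownHook(100)(() => println("high priority (100): runs first"))

while (hooks.nonEmpty) hooks.dequeue()._2()
```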

## How was this patch tested?

UT

Author: zhoukang 

Closes #20845 from caneGuy/zhoukang/fix-shutdowncomment.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/745c8c09
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/745c8c09
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/745c8c09

Branch: refs/heads/master
Commit: 745c8c0901ac522ba92c1356ca74bd0dd7701496
Parents: 61487b3
Author: zhoukang 
Authored: Mon Mar 19 13:31:21 2018 +0800
Committer: jerryshao 
Committed: Mon Mar 19 13:31:21 2018 +0800

--
 .../src/main/scala/org/apache/spark/util/ShutdownHookManager.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/745c8c09/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
--
diff --git 
a/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala 
b/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
index 4001fac..b702838 100644
--- a/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
+++ b/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
@@ -143,7 +143,7 @@ private[spark] object ShutdownHookManager extends Logging {
   }
 
   /**
-   * Adds a shutdown hook with the given priority. Hooks with lower priority 
values run
+   * Adds a shutdown hook with the given priority. Hooks with higher priority 
values run
* first.
*
* @param hook The code to run during shutdown.





spark git commit: [SPARK-23644][CORE][UI][BACKPORT-2.3] Use absolute path for REST call in SHS

2018-03-19 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/branch-2.3 5c1c03d08 -> 2f82c037d


[SPARK-23644][CORE][UI][BACKPORT-2.3] Use absolute path for REST call in SHS

## What changes were proposed in this pull request?

The SHS uses a relative path for the REST API call that gets the list of
applications. When the SHS is consumed through a proxy, this can be an issue
if the path doesn't end with a "/".

Therefore, we should use an absolute path for the REST call as it is done for 
all the other resources.

## How was this patch tested?

manual tests
Before the change:
![screen shot 2018-03-10 at 4 22 02 
pm](https://user-images.githubusercontent.com/8821783/37244190-8ccf9d40-2485-11e8-8fa9-345bc81472fc.png)

After the change:
![screen shot 2018-03-10 at 4 36 34 pm 
1](https://user-images.githubusercontent.com/8821783/37244201-a1922810-2485-11e8-8856-eeab2bf5e180.png)

Author: Marco Gaido 

Closes #20847 from mgaido91/SPARK-23644_2.3.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/2f82c037
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/2f82c037
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/2f82c037

Branch: refs/heads/branch-2.3
Commit: 2f82c037d90114705c0d0bd0bd7f82215aecfe3b
Parents: 5c1c03d
Author: Marco Gaido 
Authored: Tue Mar 20 10:07:27 2018 +0800
Committer: jerryshao 
Committed: Tue Mar 20 10:07:27 2018 +0800

--
 .../src/main/resources/org/apache/spark/ui/static/historypage.js | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/2f82c037/core/src/main/resources/org/apache/spark/ui/static/historypage.js
--
diff --git a/core/src/main/resources/org/apache/spark/ui/static/historypage.js 
b/core/src/main/resources/org/apache/spark/ui/static/historypage.js
index 2cde66b..16d59be 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/historypage.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/historypage.js
@@ -108,7 +108,7 @@ $(document).ready(function() {
 requestedIncomplete = getParameterByName("showIncomplete", searchString);
 requestedIncomplete = (requestedIncomplete == "true" ? true : false);
 
-$.getJSON("api/v1/applications?limit=" + appLimit, 
function(response,status,jqXHR) {
+$.getJSON(uiRoot + "/api/v1/applications?limit=" + appLimit, 
function(response,status,jqXHR) {
   var array = [];
   var hasMultipleAttempts = false;
   for (i in response) {
@@ -146,7 +146,7 @@ $(document).ready(function() {
 "showCompletedColumns": !requestedIncomplete,
   }
 
-  $.get("static/historypage-template.html", function(template) {
+  $.get(uiRoot + "/static/historypage-template.html", function(template) {
 var sibling = historySummary.prev();
 historySummary.detach();
 var apps = 
$(Mustache.render($(template).filter("#history-summary-template").html(),data));





spark git commit: [SPARK-23361][YARN] Allow AM to restart after initial tokens expire.

2018-03-22 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master b2edc30db -> 5fa438471


[SPARK-23361][YARN] Allow AM to restart after initial tokens expire.

Currently, the Spark AM relies on the initial set of tokens created by
the submission client to be able to talk to HDFS and other services that
require delegation tokens. This means that after those tokens expire, a
new AM will fail to start (e.g. when there is an application failure and
re-attempts are enabled).

This PR makes it so that the first thing the AM does when the user provides
a principal and keytab is to create new delegation tokens for use. This
makes sure that the AM can be started irrespective of how old the original
token set is. It also allows all of the token management to be done by the
AM - there is no need for the submission client to set configuration values
to tell the AM when to renew tokens.

Note that even though in this case the AM will not be using the delegation
tokens created by the submission client, those tokens still need to be provided
to YARN, since they are used to do log aggregation.

To be able to re-use the code in AMCredentialRenewer for the above
purposes, I refactored that class a bit so that it can fetch tokens into
a pre-defined UGI, instead of always logging in.

Another issue with re-attempts is that, after the fix that allows the AM
to restart correctly, new executors would get confused about when to
update credentials, because the credential updater used the update time
initially set up by the submission code. This could make the executor
fail to update credentials in time, since that value would be very out
of date in the situation described in the bug.

To fix that, I changed the YARN code to use the new RPC-based mechanism
for distributing tokens to executors. This allowed the old credential
updater code to be removed, and a lot of code in the renewer to be
simplified.

I also made two currently hardcoded values (the renewal time ratio, and
the retry wait) configurable; while this probably never needs to be set
by anyone in a production environment, it helps with testing; that's also
why they're not documented.
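
As a rough illustration of the renewal loop this enables (a standalone sketch, not
the actual AMCredentialRenewer code; the ratio and retry values below are made up),
tokens are re-obtained on a schedule derived from their remaining lifetime:

```scala
import java.util.concurrent.{Executors, TimeUnit}

object RenewalSketch {
  // Illustrative values; in the real code these come from Spark configuration.
  val renewalRatio = 0.75    // renew once 75% of the token lifetime has passed
  val retryWaitMs  = 60000L  // wait a minute before retrying a failed renewal

  private val scheduler = Executors.newSingleThreadScheduledExecutor()

  /** Stand-in for the token fetch: returns the interval (ms) until the new tokens expire. */
  def obtainTokensIntoUgi(): Long = 24 * 60 * 60 * 1000L

  def scheduleRenewal(): Unit = {
    val delayMs =
      try {
        val lifetimeMs = obtainTokensIntoUgi()   // fetch fresh tokens into the AM's UGI
        (lifetimeMs * renewalRatio).toLong       // renew well before they expire
      } catch {
        case e: Exception =>
          println(s"Token renewal failed, retrying in $retryWaitMs ms: $e")
          retryWaitMs
      }
    scheduler.schedule(new Runnable { def run(): Unit = scheduleRenewal() },
      delayMs, TimeUnit.MILLISECONDS)
  }

  def main(args: Array[String]): Unit = scheduleRenewal()
}
```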

Tested on real cluster with a specially crafted application to test this
functionality: checked proper access to HDFS, Hive and HBase in cluster
mode with token renewal on and AM restarts. Tested things still work in
client mode too.

Author: Marcelo Vanzin 

Closes #20657 from vanzin/SPARK-23361.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5fa43847
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5fa43847
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5fa43847

Branch: refs/heads/master
Commit: 5fa438471110afbf4e2174df449ac79e292501f8
Parents: b2edc30
Author: Marcelo Vanzin 
Authored: Fri Mar 23 13:59:21 2018 +0800
Committer: jerryshao 
Committed: Fri Mar 23 13:59:21 2018 +0800

--
 .../main/scala/org/apache/spark/SparkConf.scala |  12 +-
 .../apache/spark/deploy/SparkHadoopUtil.scala   |  32 +-
 .../executor/CoarseGrainedExecutorBackend.scala |  12 -
 .../apache/spark/internal/config/package.scala  |  12 +
 .../MesosHadoopDelegationTokenManager.scala |  11 +-
 .../spark/deploy/yarn/ApplicationMaster.scala   | 117 +++-
 .../org/apache/spark/deploy/yarn/Client.scala   | 102 +++
 .../spark/deploy/yarn/YarnSparkHadoopUtil.scala |  20 --
 .../org/apache/spark/deploy/yarn/config.scala   |  25 --
 .../yarn/security/AMCredentialRenewer.scala | 291 ---
 .../yarn/security/CredentialUpdater.scala   | 131 -
 .../YARNHadoopDelegationTokenManager.scala  |   9 +-
 .../cluster/YarnClientSchedulerBackend.scala|   9 +-
 .../cluster/YarnSchedulerBackend.scala  |  10 +-
 .../YARNHadoopDelegationTokenManagerSuite.scala |   7 +-
 .../org/apache/spark/streaming/Checkpoint.scala |   3 -
 16 files changed, 238 insertions(+), 565 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/5fa43847/core/src/main/scala/org/apache/spark/SparkConf.scala
--
diff --git a/core/src/main/scala/org/apache/spark/SparkConf.scala 
b/core/src/main/scala/org/apache/spark/SparkConf.scala
index f53b2be..129956e 100644
--- a/core/src/main/scala/org/apache/spark/SparkConf.scala
+++ b/core/src/main/scala/org/apache/spark/SparkConf.scala
@@ -603,13 +603,15 @@ private[spark] object SparkConf extends Logging {
 "Please use spark.kryoserializer.buffer instead. The default value for 
" +
   "spark.kryoserializer.buffer.mb was previously specified as '0.064'. 
Fractional values " +
   "are no longer accepted. To specify the equivalent now, one may use 
'64k'."),
-  DeprecatedConfig("spark.rpc", "2.0", "Not used any more."),
+  DeprecatedConfig("spark.rpc",

spark git commit: [SPARK-23787][TESTS] Fix file download test in SparkSubmitSuite for Hadoop 2.9.

2018-03-25 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 087fb3142 -> eb48edf9c


[SPARK-23787][TESTS] Fix file download test in SparkSubmitSuite for Hadoop 2.9.

This particular test assumed that Hadoop libraries did not support
http as a file system. Hadoop 2.9 does, so the test failed. The test
now forces a non-existent implementation for the http fs, which
forces the expected error.
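
A minimal sketch of the trick, assuming Hadoop's Configuration and FileSystem classes
are on the classpath: mapping `fs.http.impl` to a class that does not exist makes any
attempt to resolve an `http://` URI through the FileSystem API fail, regardless of what
the Hadoop version supports natively.

```scala
import java.net.URI

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem

object ForceHttpFsFailure {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Point the http scheme at a class that cannot be loaded.
    conf.set("fs.http.impl", getClass.getName + ".DoesNotExist")
    conf.set("fs.http.impl.disable.cache", "true")

    try {
      FileSystem.get(new URI("http://example.com/some.jar"), conf)
    } catch {
      // The exact exception type depends on the Hadoop version; what matters is
      // that loading the bogus implementation class always fails.
      case e: Exception => println(s"Expected failure: $e")
    }
  }
}
```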

There were also a couple of other issues in the same test: SparkSubmit
arguments in the wrong order, and the wrong check later when asserting,
which was being masked by the previous issues.

Author: Marcelo Vanzin 

Closes #20895 from vanzin/SPARK-23787.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/eb48edf9
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/eb48edf9
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/eb48edf9

Branch: refs/heads/master
Commit: eb48edf9ca4f4b42c63f145718696472cb6a31ba
Parents: 087fb31
Author: Marcelo Vanzin 
Authored: Mon Mar 26 14:01:04 2018 +0800
Committer: jerryshao 
Committed: Mon Mar 26 14:01:04 2018 +0800

--
 .../apache/spark/deploy/SparkSubmitSuite.scala  | 36 +++-
 1 file changed, 19 insertions(+), 17 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/eb48edf9/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
--
diff --git a/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala 
b/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
index 2d0c192..d86ef90 100644
--- a/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
+++ b/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
@@ -959,25 +959,28 @@ class SparkSubmitSuite
   }
 
   test("download remote resource if it is not supported by yarn service") {
-testRemoteResources(isHttpSchemeBlacklisted = false, supportMockHttpFs = false)
+testRemoteResources(enableHttpFs = false, blacklistHttpFs = false)
   }
 
   test("avoid downloading remote resource if it is supported by yarn service") 
{
-testRemoteResources(isHttpSchemeBlacklisted = false, supportMockHttpFs = 
true)
+testRemoteResources(enableHttpFs = true, blacklistHttpFs = false)
   }
 
   test("force download from blacklisted schemes") {
-testRemoteResources(isHttpSchemeBlacklisted = true, supportMockHttpFs = true)
+testRemoteResources(enableHttpFs = true, blacklistHttpFs = true)
   }
 
-  private def testRemoteResources(isHttpSchemeBlacklisted: Boolean,
-  supportMockHttpFs: Boolean): Unit = {
+  private def testRemoteResources(
+  enableHttpFs: Boolean,
+  blacklistHttpFs: Boolean): Unit = {
 val hadoopConf = new Configuration()
 updateConfWithFakeS3Fs(hadoopConf)
-if (supportMockHttpFs) {
+if (enableHttpFs) {
   hadoopConf.set("fs.http.impl", classOf[TestFileSystem].getCanonicalName)
-  hadoopConf.set("fs.http.impl.disable.cache", "true")
+} else {
+  hadoopConf.set("fs.http.impl", getClass().getName() + ".DoesNotExist")
 }
+hadoopConf.set("fs.http.impl.disable.cache", "true")
 
 val tmpDir = Utils.createTempDir()
 val mainResource = File.createTempFile("tmpPy", ".py", tmpDir)
@@ -986,20 +989,19 @@ class SparkSubmitSuite
 val tmpHttpJar = TestUtils.createJarWithFiles(Map("test.resource" -> "USER"), tmpDir)
 val tmpHttpJarPath = s"http://${new File(tmpHttpJar.toURI).getAbsolutePath}"
 
+val forceDownloadArgs = if (blacklistHttpFs) {
+  Seq("--conf", "spark.yarn.dist.forceDownloadSchemes=http")
+} else {
+  Nil
+}
+
 val args = Seq(
   "--class", UserClasspathFirstTest.getClass.getName.stripPrefix("$"),
   "--name", "testApp",
   "--master", "yarn",
   "--deploy-mode", "client",
-  "--jars", s"$tmpS3JarPath,$tmpHttpJarPath",
-  s"s3a://$mainResource"
-) ++ (
-  if (isHttpSchemeBlacklisted) {
-Seq("--conf", "spark.yarn.dist.forceDownloadSchemes=http,https")
-  } else {
-Nil
-  }
-)
+  "--jars", s"$tmpS3JarPath,$tmpHttpJarPath"
+) ++ forceDownloadArgs ++ Seq(s"s3a://$mainResource")
 
 val appArgs = new SparkSubmitArguments(args)
 val (_, _, conf, _) = SparkSubmit.prepareSubmitEnvironment(appArgs, Some(hadoopConf))
@@ -1009,7 +1011,7 @@ class SparkSubmitSuite
 // The URI of remote S3 resource should still be remote.
 assert(jars.contains(tmpS3JarPath))
 
-if (supportMockHttpFs) {
+if (enableHttpFs && !blacklistHttpFs) {
  // If Http FS is supported by yarn service, the URI of remote http resource should
   // still be remote.
   assert(jars.contains(tmpHttpJarPath))


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org

spark git commit: [SPARK-23743][SQL] Changed a comparison logic from containing 'slf4j' to starting with 'org.slf4j'

2018-03-29 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master b34890119 -> df05fb63a


[SPARK-23743][SQL] Changed a comparison logic from containing 'slf4j' to 
starting with 'org.slf4j'

## What changes were proposed in this pull request?
isSharedClass returns whether a class can/should be shared or not. It checks whether 
the class name contains certain keywords or starts with certain prefixes. With this 
logic, unintended behavior can occur when a custom package or class name happens to 
contain `slf4j`. The original intention was presumably to match the classes under 
`org.slf4j`, so it is better to change the comparison logic to 
`name.startsWith("org.slf4j")`.
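
To make the difference concrete, a tiny standalone example (the user package name
below is made up):

```scala
object SharedClassCheck {
  def oldCheck(name: String): Boolean = name.contains("slf4j")
  def newCheck(name: String): Boolean = name.startsWith("org.slf4j")

  def main(args: Array[String]): Unit = {
    // Hypothetical user class whose package merely contains the substring "slf4j".
    val userClass   = "com.example.slf4jbridge.EventProto"
    val loggerClass = "org.slf4j.LoggerFactory"

    println(oldCheck(userClass))   // true  -> wrongly treated as a shared class
    println(newCheck(userClass))   // false -> now loaded by the isolated classloader
    println(newCheck(loggerClass)) // true  -> real slf4j classes are still shared
  }
}
```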

## How was this patch tested?
This patch should pass all of the current tests and keep all of the current 
behaviors. In my case, I'm using ProtobufDeserializer to get a table schema 
from Hive tables, and some of the Protobuf package and class names contain `slf4j`. 
Without this patch, those classes cannot be resolved because of a ClassCastException 
caused by different classloaders.

Author: Jongyoul Lee 

Closes #20860 from jongyoul/SPARK-23743.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/df05fb63
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/df05fb63
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/df05fb63

Branch: refs/heads/master
Commit: df05fb63abe6018ccbe572c34cf65fc3ecbf1166
Parents: b348901
Author: Jongyoul Lee 
Authored: Fri Mar 30 14:07:35 2018 +0800
Committer: jerryshao 
Committed: Fri Mar 30 14:07:35 2018 +0800

--
 .../org/apache/spark/sql/hive/client/IsolatedClientLoader.scala | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/df05fb63/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
--
diff --git 
a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
 
b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
index 12975bc..c2690ec 100644
--- 
a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
+++ 
b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
@@ -179,8 +179,9 @@ private[hive] class IsolatedClientLoader(
 val isHadoopClass =
   name.startsWith("org.apache.hadoop.") && 
!name.startsWith("org.apache.hadoop.hive.")
 
-name.contains("slf4j") ||
-name.contains("log4j") ||
+name.startsWith("org.slf4j") ||
+name.startsWith("org.apache.log4j") || // log4j1.x
+name.startsWith("org.apache.logging.log4j") || // log4j2
 name.startsWith("org.apache.spark.") ||
 (sharesHadoopClasses && isHadoopClass) ||
 name.startsWith("scala.") ||


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



spark git commit: [SPARK-22290][CORE] Avoid creating Hive delegation tokens when not necessary.

2017-10-18 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 6f1d0dea1 -> dc2714da5


[SPARK-22290][CORE] Avoid creating Hive delegation tokens when not necessary.

Hive delegation tokens are only needed when the Spark driver has no access
to the kerberos TGT. That happens only in two situations:

- when using a proxy user
- when using cluster mode without a keytab

This change modifies the Hive provider so that it only generates delegation
tokens in those situations, and tweaks the YARN AM so that it makes the proper
user visible to the Hive code when running with keytabs, so that the TGT
can be used instead of a delegation token.

The effect of this change is that now it's possible to initialize multiple,
non-concurrent SparkContext instances in the same JVM. Before, the second
invocation would fail to fetch a new Hive delegation token, which then could
make the second (or third or...) application fail once the token expired.
With this change, the TGT will be used to authenticate to the HMS instead.

This change also avoids polluting the current logged in user's credentials
when launching applications. The credentials are copied only when running
applications as a proxy user. This makes it possible to implement SPARK-11035
later, where multiple threads might be launching applications, and each app
should have its own set of credentials.
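
Condensed into a sketch (a hypothetical helper, not the provider's actual API),
the rule is:

```scala
/** Hypothetical helper mirroring the rule described above. */
object HiveTokenRule {
  def delegationTokensRequired(
      isProxyUser: Boolean,
      isClusterMode: Boolean,
      hasKeytab: Boolean): Boolean = {
    // Only these two situations leave the driver without access to the TGT.
    isProxyUser || (isClusterMode && !hasKeytab)
  }

  def main(args: Array[String]): Unit = {
    println(delegationTokensRequired(isProxyUser = true,  isClusterMode = false, hasKeytab = false)) // true
    println(delegationTokensRequired(isProxyUser = false, isClusterMode = true,  hasKeytab = false)) // true
    println(delegationTokensRequired(isProxyUser = false, isClusterMode = true,  hasKeytab = true))  // false
  }
}
```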

Tested by verifying HDFS and Hive access in following scenarios:
- client and cluster mode
- client and cluster mode with proxy user
- client and cluster mode with principal / keytab
- long-running cluster app with principal / keytab
- pyspark app that creates (and stops) multiple SparkContext instances
  through its lifetime

Author: Marcelo Vanzin 

Closes #19509 from vanzin/SPARK-22290.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/dc2714da
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/dc2714da
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/dc2714da

Branch: refs/heads/master
Commit: dc2714da50ecba1bf1fdf555a82a4314f763a76e
Parents: 6f1d0de
Author: Marcelo Vanzin 
Authored: Thu Oct 19 14:56:48 2017 +0800
Committer: jerryshao 
Committed: Thu Oct 19 14:56:48 2017 +0800

--
 .../apache/spark/deploy/SparkHadoopUtil.scala   | 17 +++--
 .../security/HBaseDelegationTokenProvider.scala |  4 +-
 .../security/HadoopDelegationTokenManager.scala |  2 +-
 .../HadoopDelegationTokenProvider.scala |  2 +-
 .../HadoopFSDelegationTokenProvider.scala   |  4 +-
 .../security/HiveDelegationTokenProvider.scala  | 20 +-
 docs/running-on-yarn.md |  9 +++
 .../spark/deploy/yarn/ApplicationMaster.scala   | 69 
 .../org/apache/spark/deploy/yarn/Client.scala   |  5 +-
 .../org/apache/spark/deploy/yarn/config.scala   |  4 ++
 .../spark/sql/hive/client/HiveClientImpl.scala  |  6 --
 11 files changed, 110 insertions(+), 32 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/dc2714da/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala
--
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala 
b/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala
index 53775db..1fa10ab 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala
@@ -61,13 +61,17 @@ class SparkHadoopUtil extends Logging {
* do a FileSystem.closeAllForUGI in order to avoid leaking Filesystems
*/
   def runAsSparkUser(func: () => Unit) {
+createSparkUser().doAs(new PrivilegedExceptionAction[Unit] {
+  def run: Unit = func()
+})
+  }
+
+  def createSparkUser(): UserGroupInformation = {
 val user = Utils.getCurrentUserName()
-logDebug("running as user: " + user)
+logDebug("creating UGI for user: " + user)
 val ugi = UserGroupInformation.createRemoteUser(user)
 transferCredentials(UserGroupInformation.getCurrentUser(), ugi)
-ugi.doAs(new PrivilegedExceptionAction[Unit] {
-  def run: Unit = func()
-})
+ugi
   }
 
   def transferCredentials(source: UserGroupInformation, dest: UserGroupInformation) {
@@ -417,6 +421,11 @@ class SparkHadoopUtil extends Logging {
 creds.readTokenStorageStream(new DataInputStream(tokensBuf))
 creds
   }
+
+  def isProxyUser(ugi: UserGroupInformation): Boolean = {
+ugi.getAuthenticationMethod() == UserGroupInformation.AuthenticationMethod.PROXY
+  }
+
 }
 
 object SparkHadoopUtil {

http://git-wip-us.apache.org/repos/asf/spark/blob/dc2714da/core/src/main/scala/org/apache/spark/deploy/security/HBaseDelegationTokenProvider.scala
--
diff --git 
a/core/src/main/scala/org/apache/spa

spark git commit: [SPARK-22319][CORE] call loginUserFromKeytab before accessing hdfs

2017-10-22 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master ca2a780e7 -> 57accf6e3


[SPARK-22319][CORE] call loginUserFromKeytab before accessing hdfs

In `SparkSubmit`, call `loginUserFromKeytab` before attempting to make RPC 
calls to the NameNode.
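
A simplified sketch of the required ordering (placeholder principal/keytab values,
not the actual `SparkSubmit` code):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation

object LoginBeforeHdfs {
  def main(args: Array[String]): Unit = {
    // Placeholder values; in SparkSubmit these come from --principal / --keytab.
    val principal = "user@EXAMPLE.COM"
    val keytab = "/etc/security/keytabs/user.keytab"

    // 1. Log in from the keytab first...
    UserGroupInformation.loginUserFromKeytab(principal, keytab)

    // 2. ...then talk to HDFS; doing this in the opposite order fails on a secure
    //    cluster when no ticket cache is present, which is the bug described above.
    val fs = FileSystem.get(new Configuration())
    fs.exists(new Path("/user"))
  }
}
```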

I manually tested this patch by:

1. Confirming that my Spark application failed to launch with the error 
reported in https://issues.apache.org/jira/browse/SPARK-22319.
2. Applying this patch and confirming that the app no longer fails to launch, 
even when I have not manually run `kinit` on the host.

Presumably we also want integration tests for secure clusters so that we catch 
this sort of thing. I'm happy to take a shot at this if it's feasible and 
someone can point me in the right direction.

Author: Steven Rand 

Closes #19540 from sjrand/SPARK-22319.

Change-Id: Ic306bfe7181107fbcf92f61d75856afcb5b6f761


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/57accf6e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/57accf6e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/57accf6e

Branch: refs/heads/master
Commit: 57accf6e3965ff69adc4408623916c5003918235
Parents: ca2a780
Author: Steven Rand 
Authored: Mon Oct 23 09:43:45 2017 +0800
Committer: jerryshao 
Committed: Mon Oct 23 09:43:45 2017 +0800

--
 .../org/apache/spark/deploy/SparkSubmit.scala   | 32 ++--
 1 file changed, 16 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/57accf6e/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
--
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala 
b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
index 135bbe9..b7e6d0e 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
@@ -342,6 +342,22 @@ object SparkSubmit extends CommandLineUtils with Logging {
 val hadoopConf = conf.getOrElse(SparkHadoopUtil.newConfiguration(sparkConf))
 val targetDir = Utils.createTempDir()
 
+// assure a keytab is available from any place in a JVM
+if (clusterManager == YARN || clusterManager == LOCAL || clusterManager == MESOS) {
+  if (args.principal != null) {
+if (args.keytab != null) {
+  require(new File(args.keytab).exists(), s"Keytab file: ${args.keytab} does not exist")
+  // Add keytab and principal configurations in sysProps to make them available
+  // for later use; e.g. in spark sql, the isolated class loader used to talk
+  // to HiveMetastore will use these settings. They will be set as Java system
+  // properties and then loaded by SparkConf
+  sysProps.put("spark.yarn.keytab", args.keytab)
+  sysProps.put("spark.yarn.principal", args.principal)
+  UserGroupInformation.loginUserFromKeytab(args.principal, args.keytab)
+}
+  }
+}
+
 // Resolve glob path for different resources.
 args.jars = Option(args.jars).map(resolveGlobPaths(_, hadoopConf)).orNull
 args.files = Option(args.files).map(resolveGlobPaths(_, hadoopConf)).orNull
@@ -641,22 +657,6 @@ object SparkSubmit extends CommandLineUtils with Logging {
   }
 }
 
-// assure a keytab is available from any place in a JVM
-if (clusterManager == YARN || clusterManager == LOCAL || clusterManager == MESOS) {
-  if (args.principal != null) {
-if (args.keytab != null) {
-  require(new File(args.keytab).exists(), s"Keytab file: ${args.keytab} does not exist")
-  // Add keytab and principal configurations in sysProps to make them available
-  // for later use; e.g. in spark sql, the isolated class loader used to talk
-  // to HiveMetastore will use these settings. They will be set as Java system
-  // properties and then loaded by SparkConf
-  sysProps.put("spark.yarn.keytab", args.keytab)
-  sysProps.put("spark.yarn.principal", args.principal)
-  UserGroupInformation.loginUserFromKeytab(args.principal, args.keytab)
-}
-  }
-}
-
 if (clusterManager == MESOS && UserGroupInformation.isSecurityEnabled) {
   setRMPrincipal(sysProps)
 }


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



spark git commit: [SPARK-22319][CORE][BACKPORT-2.2] call loginUserFromKeytab before accessing hdfs

2017-10-22 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/branch-2.2 f8c83fdc5 -> bf8163f5b


[SPARK-22319][CORE][BACKPORT-2.2] call loginUserFromKeytab before accessing hdfs

In SparkSubmit, call loginUserFromKeytab before attempting to make RPC calls to 
the NameNode.

Same as #https://github.com/apache/spark/pull/19540, but for branch-2.2.

Manually tested for master as described in 
https://github.com/apache/spark/pull/19540.

Author: Steven Rand 

Closes #19554 from sjrand/SPARK-22319-branch-2.2.

Change-Id: Ic550a818fd6a3f38b356ac48029942d463738458


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/bf8163f5
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/bf8163f5
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/bf8163f5

Branch: refs/heads/branch-2.2
Commit: bf8163f5be55a94e02849ccbaf755702a2c6c68f
Parents: f8c83fd
Author: Steven Rand 
Authored: Mon Oct 23 14:26:03 2017 +0800
Committer: jerryshao 
Committed: Mon Oct 23 14:26:03 2017 +0800

--
 .../org/apache/spark/deploy/SparkSubmit.scala   | 38 ++--
 1 file changed, 19 insertions(+), 19 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/bf8163f5/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
--
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala 
b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
index 86d578e..4f2f2c1 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
@@ -316,6 +316,25 @@ object SparkSubmit extends CommandLineUtils {
   RPackageUtils.checkAndBuildRPackage(args.jars, printStream, args.verbose)
 }
 
+// assure a keytab is available from any place in a JVM
+if (clusterManager == YARN || clusterManager == LOCAL) {
+  if (args.principal != null) {
+require(args.keytab != null, "Keytab must be specified when principal is specified")
+if (!new File(args.keytab).exists()) {
+  throw new SparkException(s"Keytab file: ${args.keytab} does not exist")
+} else {
+  // Add keytab and principal configurations in sysProps to make them available
+  // for later use; e.g. in spark sql, the isolated class loader used to talk
+  // to HiveMetastore will use these settings. They will be set as Java system
+  // properties and then loaded by SparkConf
+  sysProps.put("spark.yarn.keytab", args.keytab)
+  sysProps.put("spark.yarn.principal", args.principal)
+
+  UserGroupInformation.loginUserFromKeytab(args.principal, args.keytab)
+}
+  }
+}
+
 // In client mode, download remote files.
 var localPrimaryResource: String = null
 var localJars: String = null
@@ -582,25 +601,6 @@ object SparkSubmit extends CommandLineUtils {
   }
 }
 
-// assure a keytab is available from any place in a JVM
-if (clusterManager == YARN || clusterManager == LOCAL) {
-  if (args.principal != null) {
-require(args.keytab != null, "Keytab must be specified when principal is specified")
-if (!new File(args.keytab).exists()) {
-  throw new SparkException(s"Keytab file: ${args.keytab} does not exist")
-} else {
-  // Add keytab and principal configurations in sysProps to make them available
-  // for later use; e.g. in spark sql, the isolated class loader used to talk
-  // to HiveMetastore will use these settings. They will be set as Java system
-  // properties and then loaded by SparkConf
-  sysProps.put("spark.yarn.keytab", args.keytab)
-  sysProps.put("spark.yarn.principal", args.principal)
-
-  UserGroupInformation.loginUserFromKeytab(args.principal, args.keytab)
-}
-  }
-}
-
 // In yarn-cluster mode, use yarn.Client as a wrapper around the user class
 if (isYarnCluster) {
   childMainClass = "org.apache.spark.deploy.yarn.Client"


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



spark git commit: [SPARK-21840][CORE] Add trait that allows conf to be directly set in application.

2017-10-26 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 592cfeab9 -> 3073344a2


[SPARK-21840][CORE] Add trait that allows conf to be directly set in 
application.

Currently SparkSubmit uses system properties to propagate configuration to
applications. This makes it hard to implement features such as SPARK-11035,
which would allow multiple applications to be started in the same JVM. The
current code would cause the config data from multiple apps to get mixed
up.

This change introduces a new trait, currently internal to Spark, that allows
the app configuration to be passed directly to the application, without
having to use system properties. The current "call main() method" behavior
is maintained as an implementation of this new trait. This will be useful
to allow multiple cluster mode apps to be submitted from the same JVM.

As part of this, SparkSubmit was modified to collect all configuration
directly into a SparkConf instance. Most of the changes are to tests so
they use SparkConf instead of an opaque map.
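
For illustration, a hypothetical application built against the new trait (not part of
this patch) receives its configuration directly instead of reading system properties:

```scala
package org.apache.spark.deploy  // SparkApplication is private[spark], so live inside that namespace

import org.apache.spark.SparkConf

/** Hypothetical application showing the contract of the new trait. */
class WordCountApp extends SparkApplication {
  override def start(args: Array[String], conf: SparkConf): Unit = {
    // Configuration arrives through `conf`, not through system properties,
    // so multiple applications could in principle share one JVM.
    val appName = conf.get("spark.app.name", "word-count")
    println(s"Starting $appName with ${args.length} argument(s)")
  }
}
```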

Tested with existing and added unit tests.

Author: Marcelo Vanzin 

Closes #19519 from vanzin/SPARK-21840.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/3073344a
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/3073344a
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/3073344a

Branch: refs/heads/master
Commit: 3073344a2551fb198d63f2114a519ab97904cb55
Parents: 592cfea
Author: Marcelo Vanzin 
Authored: Thu Oct 26 15:50:27 2017 +0800
Committer: jerryshao 
Committed: Thu Oct 26 15:50:27 2017 +0800

--
 .../apache/spark/deploy/SparkApplication.scala  |  55 +
 .../org/apache/spark/deploy/SparkSubmit.scala   | 160 +++---
 .../apache/spark/deploy/SparkSubmitSuite.scala  | 213 +++
 .../deploy/rest/StandaloneRestSubmitSuite.scala |   4 +-
 4 files changed, 257 insertions(+), 175 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/3073344a/core/src/main/scala/org/apache/spark/deploy/SparkApplication.scala
--
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkApplication.scala 
b/core/src/main/scala/org/apache/spark/deploy/SparkApplication.scala
new file mode 100644
index 000..118b460
--- /dev/null
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkApplication.scala
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.deploy
+
+import java.lang.reflect.Modifier
+
+import org.apache.spark.SparkConf
+
+/**
+ * Entry point for a Spark application. Implementations must provide a no-argument constructor.
+ */
+private[spark] trait SparkApplication {
+
+  def start(args: Array[String], conf: SparkConf): Unit
+
+}
+
+/**
+ * Implementation of SparkApplication that wraps a standard Java class with a "main" method.
+ *
+ * Configuration is propagated to the application via system properties, so running multiple
+ * of these in the same JVM may lead to undefined behavior due to configuration leaks.
+ */
+private[deploy] class JavaMainApplication(klass: Class[_]) extends SparkApplication {
+
+  override def start(args: Array[String], conf: SparkConf): Unit = {
+val mainMethod = klass.getMethod("main", new Array[String](0).getClass)
+if (!Modifier.isStatic(mainMethod.getModifiers)) {
+  throw new IllegalStateException("The main method in the given main class must be static")
+}
+
+val sysProps = conf.getAll.toMap
+sysProps.foreach { case (k, v) =>
+  sys.props(k) = v
+}
+
+mainMethod.invoke(null, args)
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/spark/blob/3073344a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
--
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala 
b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
index b7e6d0e..73b956e 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
+

spark git commit: [SPARK-22172][CORE] Worker hangs when the external shuffle service port is already in use

2017-11-01 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/master 556b5d215 -> 96798d14f


[SPARK-22172][CORE] Worker hangs when the external shuffle service port is 
already in use

## What changes were proposed in this pull request?

Handle NonFatal exceptions while starting the external shuffle service; if any 
NonFatal exception occurs, it is logged and the worker continues without the 
external shuffle service.
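
The guarding idiom is roughly the following generic sketch (not the Worker code
itself), using `scala.util.control.NonFatal` so that fatal JVM errors still propagate:

```scala
import scala.util.control.NonFatal

object StartupGuard {
  /** Stand-in for shuffleService.startIfEnabled(); may throw e.g. java.net.BindException. */
  def startIfEnabled(): Unit = throw new java.net.BindException("Address already in use")

  def main(args: Array[String]): Unit = {
    try {
      startIfEnabled()
    } catch {
      case NonFatal(e) =>
        // Log the failure and let the caller decide whether to keep running or abort.
        println(s"Failed to start external shuffle service: $e")
    }
  }
}
```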

## How was this patch tested?

I verified it manually: it logs the exception and continues to serve without the 
external shuffle service when a BindException occurs.

Author: Devaraj K 

Closes #19396 from devaraj-kavali/SPARK-22172.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/96798d14
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/96798d14
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/96798d14

Branch: refs/heads/master
Commit: 96798d14f07208796fa0a90af0ab369879bacd6c
Parents: 556b5d2
Author: Devaraj K 
Authored: Wed Nov 1 18:07:39 2017 +0800
Committer: jerryshao 
Committed: Wed Nov 1 18:07:39 2017 +0800

--
 .../scala/org/apache/spark/deploy/worker/Worker.scala   | 12 +++-
 1 file changed, 11 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/96798d14/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
--
diff --git a/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala 
b/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
index ed5fa4b..3962d42 100755
--- a/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
@@ -199,7 +199,7 @@ private[deploy] class Worker(
 logInfo(s"Running Spark version ${org.apache.spark.SPARK_VERSION}")
 logInfo("Spark home: " + sparkHome)
 createWorkDir()
-shuffleService.startIfEnabled()
+startExternalShuffleService()
 webUi = new WorkerWebUI(this, workDir, webUiPort)
 webUi.bind()
 
@@ -367,6 +367,16 @@ private[deploy] class Worker(
 }
   }
 
+  private def startExternalShuffleService() {
+try {
+  shuffleService.startIfEnabled()
+} catch {
+  case e: Exception =>
+logError("Failed to start external shuffle service", e)
+System.exit(1)
+}
+  }
+
   private def sendRegisterMessageToMaster(masterEndpoint: RpcEndpointRef): Unit = {
 masterEndpoint.send(RegisterWorker(
   workerId,


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] Git Push Summary

2018-09-15 Thread jshao
Repository: spark
Updated Tags:  refs/tags/v2.3.2-rc6 [created] 02b510728

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

2018-09-15 Thread jshao
Preparing development version 2.3.3-SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7b5da37c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7b5da37c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7b5da37c

Branch: refs/heads/branch-2.3
Commit: 7b5da37c0ad08e7b2f3d536de13be63758a2ed99
Parents: 02b5107
Author: Saisai Shao 
Authored: Sun Sep 16 11:31:22 2018 +0800
Committer: Saisai Shao 
Committed: Sun Sep 16 11:31:22 2018 +0800

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/7b5da37c/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 8df2635..6ec4966 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.2
+Version: 2.3.3
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/7b5da37c/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 57485fc..f8b15cc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b5da37c/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 53e58c2..e412a47 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b5da37c/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d05647c..d8f9a3d 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b5da37c/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index 8d46761..a1a4f87 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3

[1/2] spark git commit: Preparing Spark release v2.3.2-rc6

2018-09-15 Thread jshao
Repository: spark
Updated Branches:
  refs/heads/branch-2.3 0c1e3d109 -> 7b5da37c0


Preparing Spark release v2.3.2-rc6


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/02b51072
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/02b51072
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/02b51072

Branch: refs/heads/branch-2.3
Commit: 02b510728c31b70e6035ad541bfcdc2b59dcd79a
Parents: 0c1e3d1
Author: Saisai Shao 
Authored: Sun Sep 16 11:31:17 2018 +0800
Committer: Saisai Shao 
Committed: Sun Sep 16 11:31:17 2018 +0800

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/02b51072/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 6ec4966..8df2635 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.3
+Version: 2.3.2
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/02b51072/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index f8b15cc..57485fc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/02b51072/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index e412a47..53e58c2 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/02b51072/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d8f9a3d..d05647c 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/02b51072/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index a1a4f87..8d46761 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml

svn commit: r29421 - /dev/spark/v2.3.2-rc6-bin/

2018-09-16 Thread jshao
Author: jshao
Date: Sun Sep 16 13:30:43 2018
New Revision: 29421

Log:
Apache Spark v2.3.2-rc6

Added:
dev/spark/v2.3.2-rc6-bin/
dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz   (with props)
dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512
dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz   (with props)
dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.6.tgz   (with props)
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.6.tgz.asc
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.6.tgz.sha512
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.7.tgz   (with props)
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.7.tgz.asc
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-hadoop2.7.tgz.sha512
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-without-hadoop.tgz   (with props)
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-without-hadoop.tgz.asc
dev/spark/v2.3.2-rc6-bin/spark-2.3.2-bin-without-hadoop.tgz.sha512
dev/spark/v2.3.2-rc6-bin/spark-2.3.2.tgz   (with props)
dev/spark/v2.3.2-rc6-bin/spark-2.3.2.tgz.asc
dev/spark/v2.3.2-rc6-bin/spark-2.3.2.tgz.sha512

Added: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.asc Sun Sep 16 13:30:43 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIcBAABCgAGBQJbnkkHAAoJENsLIaASlz/QGnMP/jKJ2zrZbpjd/ladRk5c6i3h
+DalLIk6+ZnSimBUvH+aZFqM2Xam41KlKJkgrXUS4wVOoHfcu0HxkwkpqhC0E/cTY
+KUTZ2Y2rFm7IVFUtwfwlqdR77v/4MEE0tMkOAxy8ZAumyKV5AAG+1OQ0k+X4q+E9
+Q6E8WicEhzr6Pi+9bOSJuHZE0LP1Vpou7Q9JhRQQC/cT1VbZu7+AeJ3RoiQLV6gp
+uigSK73pMDIPlaHpqyTJAvy9VVyF7DseACTDOGon/FOXMNXg2UZcQ00cViJ5Ykxd
+i/jFrFa3X79hedlLfC9RMI191G5DzePtnh+grqQxk80EK3xizx+Y1ptir7RRuO9V
+KWslgAI7cLxpJ6v8tvpWzqfheUD0HGoZ8JhSXsG02X0/v4ZNIIrzGF8eEKZvc5AW
+NTAHD7ws9myeghp4pcOiZuw64obBG7QIkMHe9a62ZdyfqZjkdpA2BiEhqFi0dI89
+lLf2bjmoz97Y5YuFrjix6XP4057xGUSFGnZuOWsfvjtg6dbTEYaIZxLqcplu6esD
+gBLk4Ct0pXH7wcv4aWEtby20Wq6YGR7GKCIpEnOtXIPkKdPi4iuCIyWZy9WXjwZY
+wJ4z2locysS5bgDahsdNSLQEN9UbxkPqi7GIpGPVvNrR97HXcumOOsmQeaWS2Xx4
+YsZoVDmqlgBu/oyW5Bw1
+=OlcR
+-END PGP SIGNATURE-

Added: dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc6-bin/SparkR_2.3.2.tar.gz.sha512 Sun Sep 16 13:30:43 2018
@@ -0,0 +1,3 @@
+SparkR_2.3.2.tar.gz: BE4B6B28 DC3CB5FC 947E7B21 79AED9DD 55573A05 D0DEBB53
+ 86864B05 C02F32B4 FB997E7A 9643BA61 6BC495E1 A2FE03D9
+ AE2D2DC2 4D43A48C 39498238 7259F58D

Added: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.asc Sun Sep 16 13:30:43 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIcBAABCgAGBQJbnkXVAAoJENsLIaASlz/QaAsP/3MdsTgoz9cfqQTleKT2Kw6M
+P6rzaiFoTq9tlWlBoeWSmqR42TilQzPGSLerzSGNMuEIpdpzENc/aopqd/1vU2qf
+ghmmfGtyCn1Mj2wLHRAIEseaXCViZPOmiH6YpmcUziY7aybNtB0g9aZt/9M9N2ts
+BnCU06zk0esBYkZmnw4f/WYG32v7WQN7Lb/IewgoguhpGKRa0ypad56r24y2Qf0N
+Us1GUfQzu5XXTr+CJI9zukJudLCNnOdIlnUoSv25pePxWodNRw+49ixG+qQvxkvt
+WGsb/lWJh3tTvPeZFJcB5Yg2lU5YWKck0a6WNhIRSlbJgzizhEyQs9YrF3HBtlgC
+bAT6GEjcnwCXxdgUZKUnd0P3POK85Dd1XFxVj+yWwIjKBvdFlqlE50eAgPuKZMZ+
+aptQ3+XPakoukKFA07moywE38yQZrYpULGLn5V4W04PS1g/3DOm0pAvshJuA58Sf
+z76gMJGthcYgL2RmXGJslMyZetUVVjZkvm5GVAIJtxJlGA1vtsEVYUJQyW1M8Vh3
+lCiUBSpyZL/6XHLSObPWLX4NuagjaC0vSUMbfZJYOYMh8SGltWCWJt2/2SdzueJY
+4RdOfmkYmXub9NVn/MgAYCGoq+kx0NGNoG8fF2+x6xnm81pYKJTecQjVrfZUgSkC
+/oriBynvPpnJ0lBRRyw8
+=F1pu
+-END PGP SIGNATURE-

Added: dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc6-bin/pyspark-2.3.2.tar.gz.sha512 Sun Sep 16 13:30:43 
2018
@@ -0,0 +1,3

svn commit: r29438 - in /dev/spark/v2.3.2-rc6-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2018-09-17 Thread jshao
Author: jshao
Date: Mon Sep 17 12:13:30 2018
New Revision: 29438

Log:
Apache Spark v2.3.2-rc6 docs


[This commit notification would consist of 1447 parts, 
which exceeds the limit of 50 ones, so it was shortened to the summary.]

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] Git Push Summary

2018-09-24 Thread jshao
Repository: spark
Updated Tags:  refs/tags/v2.3.2 [created] 02b510728

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[10/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-25 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaPairRDD.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaPairRDD.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaPairRDD.html
new file mode 100644
index 000..726bcd5
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaPairRDD.html
@@ -0,0 +1,4020 @@
[Generated JavaDoc HTML for class org.apache.spark.api.java.JavaPairRDD (Spark 2.3.2 JavaDoc); markup omitted.]

[44/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/dropTempView.html
--
diff --git a/site/docs/2.3.2/api/R/dropTempView.html 
b/site/docs/2.3.2/api/R/dropTempView.html
new file mode 100644
index 000..5ee2883
--- /dev/null
+++ b/site/docs/2.3.2/api/R/dropTempView.html
@@ -0,0 +1,63 @@
[Generated SparkR HTML documentation for dropTempView, which drops the temporary view with the given view name in the catalog; markup omitted.]

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/dtypes.html
--
diff --git a/site/docs/2.3.2/api/R/dtypes.html 
b/site/docs/2.3.2/api/R/dtypes.html
new file mode 100644
index 000..b19e0fd
--- /dev/null
+++ b/site/docs/2.3.2/api/R/dtypes.html
@@ -0,0 +1,106 @@
[Generated SparkR HTML documentation for dtypes, which returns all column names and their data types as a list; markup omitted.]

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/endsWith.html
--
diff --git a/site/docs/2.3.2/api/R/endsWith.html 
b/site/docs/2.3.2/api/R/endsWith.html
new file mode 100644
index 000..24bea1f
--- /dev/null
+++ b/site/docs/2.3.2/api/R/endsWith.html
@@ -0,0 +1,56 @@
[Generated SparkR HTML documentation for endsWith, which determines if entries of x end with the given suffix; markup omitted.]

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/eq_null_safe.html
--
diff --git a/site/docs/2.3.2/

[41/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/rbind.html
--
diff --git a/site/docs/2.3.2/api/R/rbind.html b/site/docs/2.3.2/api/R/rbind.html
new file mode 100644
index 000..890ab98
--- /dev/null
+++ b/site/docs/2.3.2/api/R/rbind.html
@@ -0,0 +1,128 @@
[Generated SparkR HTML documentation for rbind, which unions two or more SparkDataFrames by row; markup omitted.]

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/read.df.html
--
diff --git a/site/docs/2.3.2/api/R/read.df.html 
b/site/docs/2.3.2/api/R/read.df.html
new file mode 100644
index 000..2bd9c43
--- /dev/null
+++ b/site/docs/2.3.2/api/R/read.df.html
@@ -0,0 +1,106 @@
+R: Load a SparkDataFrame
+
+read.df {SparkR}    R Documentation
+
+Load a SparkDataFrame
+
+Description
+
+Returns the dataset in a data source as a SparkDataFrame
+
+
+
+Usage
+
+
+## Default S3 method:
+read.df(path = NULL, source = NULL, schema = NULL,
+  na.strings = "NA", ...)
+
+## Default S3 method:
+loadDF(path = NULL, source = NULL, schema = NULL,
+  ...)
+
+
+
+Arguments
+
+
+path
+
+The path of files to load
+
+source
+
+The name of external data source
+
+schema
+
+The data schema defined in structType or a DDL-formatted string.
+
+na.strings
+
+Default string value for NA when source is "csv"
+
+...
+
+additional external data source specific named properties.
+
+
+
+
+Details
+
+The data source is specified by the source and a set of 
options(...).
+If source is not specified, the default data source configured by
+"spark.sql.sources.default" will be used. 
+Similar to R read.csv, when source is "csv", by 
default, a value of "NA" will be
+interpreted as NA.
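A minimal sketch of the csv case just described, assuming a hypothetical file path; header is passed through ... as a data source option:

sparkR.session()
df <- read.df("path/to/people.csv", source = "csv",
              header = "true", na.strings = "NA")   # "NA" strings load as NA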
+
+
+
+Value
+
+SparkDataFrame
+
+
+
+Note
+
+read.df since 1.4.0
+
+loadDF since 1.6.0
+
+
+
+See Also
+
+read.json
+
+
+
+Examples
+
+## Not run: 
+##D sparkR.session()
+##D df1 <- read.df("path/to/file.json", source = "json")
+##D schema <- structType(structField("name", "string"),
+##D  structField("info", 
"map"))
+##D df2 <- read.df(mapTypeJsonPath, "json", schema, multiLine = 
TRUE)
+##D df3 <- loadDF("data/test_table", "parquet", 
mergeSchema = "true")
+##D stringSchema <- "name STRING, info MAP"
+##D df4 <- read.df(mapTypeJsonPath, "json", stringSchema, 
multiLine = TRUE)
+#

[09/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDD.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDD.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDD.html
new file mode 100644
index 000..6901011
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDD.html
@@ -0,0 +1,1957 @@
+JavaRDD (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark.api.java
+Class JavaRDD
+
+
+
+Object
+
+
+org.apache.spark.api.java.JavaRDD
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, JavaRDDLike>
+
+
+
+public class JavaRDD
+extends Object
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+JavaRDD(RDD rdd,
+   scala.reflect.ClassTag classTag) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+static  U
+aggregate(U zeroValue,
+ Function2 seqOp,
+ Function2 combOp) 
+
+
+JavaRDD
+cache()
+Persist this RDD with the default storage level 
(MEMORY_ONLY).
+
+
+
+static  JavaPairRDD
+cartesian(JavaRDDLike other) 
+
+
+static void
+checkpoint() 
+
+
+scala.reflect.ClassTag
+classTag() 
+
+
+JavaRDD
+coalesce(int numPartitions)
+Return a new RDD that is reduced into 
numPartitions partitions.
+
+
+
+JavaRDD
+coalesce(int numPartitions,
+boolean shuffle)
+Return a new RDD that is reduced into 
numPartitions partitions.
+
+
+
+static java.util.List
+collect() 
+
+
+static JavaFutureAction>
+collectAsync() 
+
+
+static java.util.List[]
+collectPartitions(int[] partitionIds) 
+
+
+static SparkContext
+context() 
+
+
+static long
+count() 
+
+
+static PartialResult
+countApprox(long timeout) 
+
+
+static PartialResult
+countApprox(long timeout,
+   double confidence) 
+
+
+static long
+countApproxDistinct(double relativeSD) 
+
+
+static JavaFutureAction
+countAsync() 
+
+
+static java.util.Map
+countByValue() 
+
+
+static PartialResult>
+countByValueApprox(long timeout) 
+
+
+static PartialResult>
+countByValueApprox(long timeout,
+  double confidence) 
+
+
+JavaRDD
+distinct()
+Return a new RDD containing the distinct elements in this 
RDD.
+
+
+
+JavaRDD
+distinct(int numPartitions)
+Return a new RDD containing the distinct elements in this 
RDD.
+
+
+
+JavaRDD
+filter(Function f)
+Return a new RDD containing only the elements that satisfy 
a predicate.
+
+
+
+static T
+first() 
+
+
+static  JavaRDD
+flatMap(FlatMapFunction f) 
+
+
+static JavaDoubleRDD
+flatMapToDouble(DoubleFlatMapFunction f) 
+
+
+static  JavaPairRDD
+flatMapToPair(PairFlatMapFunction

[47/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/attach.html
--
diff --git a/site/docs/2.3.2/api/R/attach.html 
b/site/docs/2.3.2/api/R/attach.html
new file mode 100644
index 000..3d0058b
--- /dev/null
+++ b/site/docs/2.3.2/api/R/attach.html
@@ -0,0 +1,122 @@
+R: Attach SparkDataFrame to R search path
+
+attach,SparkDataFrame-method {SparkR}    R Documentation
+
+Attach SparkDataFrame to R search path
+
+Description
+
+The specified SparkDataFrame is attached to the R search path. This means 
that
+the SparkDataFrame is searched by R when evaluating a variable, so columns in
+the SparkDataFrame can be accessed by simply giving their names.
+
+
+
+Usage
+
+
+## S4 method for signature 'SparkDataFrame'
+attach(what, pos = 2L,
+  name = deparse(substitute(what), backtick = FALSE),
+  warn.conflicts = TRUE)
+
+
+
+Arguments
+
+
+what
+
+(SparkDataFrame) The SparkDataFrame to attach
+
+pos
+
+(integer) Specify position in search() where to attach.
+
+name
+
+(character) Name to use for the attached SparkDataFrame. Names
+starting with package: are reserved for library.
+
+warn.conflicts
+
+(logical) If TRUE, warnings are printed about conflicts
+from attaching the database, unless that SparkDataFrame contains an object
+
+
+
+
+Note
+
+attach since 1.6.0
+
+
+
+See Also
+
+detach
+
+Other SparkDataFrame functions: SparkDataFrame-class,
+agg, alias,
+arrange, as.data.frame,
+broadcast, cache,
+checkpoint, coalesce,
+collect, colnames,
+coltypes,
+createOrReplaceTempView,
+crossJoin, cube,
+dapplyCollect, dapply,
+describe, dim,
+distinct, dropDuplicates,
+dropna, drop,
+dtypes, except,
+explain, filter,
+first, gapplyCollect,
+gapply, getNumPartitions,
+group_by, head,
+hint, histogram,
+insertInto, intersect,
+isLocal, isStreaming,
+join, limit,
+localCheckpoint, merge,
+mutate, ncol,
+nrow, persist,
+printSchema, randomSplit,
+rbind, registerTempTable,
+rename, repartition,
+rollup, sample,
+saveAsTable, schema,
+selectExpr, select,
+showDF, show,
+storageLevel, str,
+subset, summary,
+take, toJSON,
+unionByName, union,
+unpersist, withColumn,
+withWatermark, with,
+write.df, write.jdbc,
+write.json, write.orc,
+write.parquet, write.stream,
+write.text
+
+
+
+Examples
+
+## Not run: 
+##D attach(irisDf)
+##D summary(Sepal_Width)
+## End(Not run)
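A minimal sketch pairing attach() with detach() (see See Also), following the irisDf example above:

attach(irisDf)
summary(Sepal_Width)      # columns resolve without the irisDf$ prefix
detach("irisDf")          # remove the SparkDataFrame from the search path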
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/avg.html
--
diff --git a/site/docs/2.3.2/api/R/avg.html b/site/docs/2.3.2/api/R/avg.html
new file mode 100644
index 000..1306740
--- /dev/null
+++ b/site/docs/2.3.2/api/R/avg.html
@@ -0,0 +1,67 @@
+R: avg
+
+avg {SparkR}    R Documentation
+
+avg
+
+Description
+
+Aggregate function: returns the average of the values in a group.
+
+
+
+Usage
+
+
+avg(x, ...)
+
+## S4 method for signature 'Column'
+avg(x)
+
+
+
+Arguments
+
+
+x
+
+Column to compute on or a GroupedData object.
+
+...
+
+additional argument(s) when x is a GroupedData object.
+
+
+
+
+Note
+
+avg since 1.4.0
+
+
+
+See Also
+
+Other aggregate functions: column_aggregate_functions,
+corr, count,
+cov, first,
+last
+
+
+
+Examples
+
+## Not run: avg(df$c)
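Because x can also be a GroupedData object, avg is typically used through agg; a minimal sketch, assuming a hypothetical df with dept and salary columns:

head(agg(groupBy(df, "dept"), avg(df$salary)))   # per-group average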
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/awaitTermination.html
--
diff --git a/site/docs/2.3.2/api/R/awaitTermination.html 
b/site/docs/2.3.2/api/R/awaitTermination.html
new file mode 100644
index 000..b8a65a2
--- /dev/null
+++ b/site/docs/2.3.2/api/R/awaitTermination.html
@@ -0,0 +1,84 @@
+R: awaitTermination
+
+awaitTermination {SparkR}    R Documentation
+
+awaitTermination
+
+Description
+
+Waits 

[42/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/match.html
--
diff --git a/site/docs/2.3.2/api/R/match.html b/site/docs/2.3.2/api/R/match.html
new file mode 100644
index 000..d405b90
--- /dev/null
+++ b/site/docs/2.3.2/api/R/match.html
@@ -0,0 +1,65 @@
+R: Match a column with given values.
+
+%in% {SparkR}    R Documentation
+
+Match a column with given values.
+
+Description
+
+Match a column with given values.
+
+
+
+Usage
+
+
+## S4 method for signature 'Column'
+x %in% table
+
+
+
+Arguments
+
+
+x
+
+a Column.
+
+table
+
+a collection of values (coercible to list) to compare with.
+
+
+
+
+Value
+
+The matched values resulting from the comparison with the given values.
+
+
+
+Note
+
+%in% since 1.5.0
+
+
+
+Examples
+
+## Not run: 
+##D filter(df, "age in (10, 30)")
+##D where(df, df$age %in% c(10, 30))
+## End(Not run)
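The comparison can also be kept as a column rather than used directly in a filter; a minimal sketch following the df$age example above:

df2 <- withColumn(df, "age_in_set", df$age %in% c(10, 30))
head(select(df2, "age", "age_in_set"))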
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/merge.html
--
diff --git a/site/docs/2.3.2/api/R/merge.html b/site/docs/2.3.2/api/R/merge.html
new file mode 100644
index 000..3eb2a86
--- /dev/null
+++ b/site/docs/2.3.2/api/R/merge.html
@@ -0,0 +1,177 @@
+R: Merges two data frames
+
+merge {SparkR}    R Documentation
+
+Merges two data frames
+
+Description
+
+Merges two data frames
+
+
+
+Usage
+
+
+merge(x, y, ...)
+
+## S4 method for signature 'SparkDataFrame,SparkDataFrame'
+merge(x, y,
+  by = intersect(names(x), names(y)), by.x = by, by.y = by,
+  all = FALSE, all.x = all, all.y = all, sort = TRUE,
+  suffixes = c("_x", "_y"), ...)
+
+
+
+Arguments
+
+
+x
+
+the first data frame to be joined.
+
+y
+
+the second data frame to be joined.
+
+...
+
+additional argument(s) passed to the method.
+
+by
+
+a character vector specifying the join columns. If by is not
+specified, the common column names in x and y will 
be used.
+If by or both by.x and by.y are explicitly set to NULL or of length 0, the 
Cartesian
+Product of x and y will be returned.
+
+by.x
+
+a character vector specifying the joining columns for x.
+
+by.y
+
+a character vector specifying the joining columns for y.
+
+all
+
+a boolean value setting all.x and all.y
+if any of them are unset.
+
+all.x
+
+a boolean value indicating whether all the rows in x should
+be included in the join.
+
+all.y
+
+a boolean value indicating whether all the rows in y should
+be included in the join.
+
+sort
+
+a logical argument indicating whether the resulting columns should be 
sorted.
+
+suffixes
+
+a string vector of length 2 used to make colnames of
+x and y unique.
+The first element is appended to each colname of x.
+The second element is appended to each colname of y.
+
+
+
+
+Details
+
+If all.x and all.y are set to FALSE, a natural join will be returned. If
+all.x is set to TRUE and all.y is set to FALSE, a left outer join will
+be returned. If all.x is set to FALSE and all.y is set to TRUE, a right
+outer join will be returned. If all.x and all.y are set to TRUE, a full
+outer join will be returned.
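A minimal sketch of those all.x / all.y combinations, assuming hypothetical df1 and df2 that share an id column:

inner <- merge(df1, df2, by = "id")                              # natural join
left  <- merge(df1, df2, by = "id", all.x = TRUE)                # left outer join
full  <- merge(df1, df2, by = "id", all.x = TRUE, all.y = TRUE)  # full outer join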
+
+
+
+Note
+
+merge since 1.5.0
+
+
+
+See Also
+
+join crossJoin
+
+Other SparkDataFrame functions: SparkDataFrame-class,
+agg, alias,
+arrange, as.data.frame,
+attach,SparkDataFrame-method,
+broadcast, cache,
+checkpoint, coalesce,
+collect, colnames,
+coltypes,
+createOrReplaceTempView,
+crossJoin, cube,
+dapplyCollect, dapply,
+describe, dim,
+distinct, dropDuplicates,
+dropna, drop,
+dtypes, except,
+explain, filter,
+first, gapplyCollect,
+gapply, getNumPartitions,
+group_by, head,
+hint, histogram,
+insertInto, intersect,
+isLocal, isStreaming,
+join, limit,
+localCheckpoint, mutate,
+ncol, nrow,
+persist, printSchema,
+randomSplit, rbind,
+registerTempTable, rename,
+repartition, rollup,
+sample, saveAsTable,
+schema, selectExpr,
+select, showDF,
+show, storageLevel,
+str, subset,
+summary, take,
+toJSON, unionByName,
+union, unpersist,
+withColumn, withWatermark,
+with, write.df,
+write.jdbc, write.json,
+write.orc, write.parquet,
+write.stream, write.text
+
+
+
+Examples
+
+## Not run: 
+##D sparkR.session()
+##D df1 <- read.json(path)
+##D 

[20/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html 
b/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html
new file mode 100644
index 000..075265c
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html
@@ -0,0 +1,517 @@
+SimpleFutureAction (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark
+Class 
SimpleFutureAction
+
+
+
+Object
+
+
+org.apache.spark.SimpleFutureAction
+
+
+
+
+
+
+
+All Implemented Interfaces:
+FutureAction, 
scala.concurrent.Awaitable, scala.concurrent.Future
+
+
+
+public class SimpleFutureAction
+extends Object
+implements FutureAction
+A FutureAction holding the 
result of an action that triggers a single job. Examples include
+ count, collect, reduce.
+
+
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+
+
+
+Nested classes/interfaces inherited from 
interface scala.concurrent.Future
+scala.concurrent.Future.InternalCallbackExecutor$
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+void
+cancel()
+Cancels the execution of this action.
+
+
+
+boolean
+isCancelled()
+Returns whether the action has been cancelled.
+
+
+
+boolean
+isCompleted()
+Returns whether the action has already been completed with 
a value or an exception.
+
+
+
+scala.collection.Seq
+jobIds()
+Returns the job IDs run by the underlying async 
operation.
+
+
+
+ void
+onComplete(scala.Function1,U> func,
+  scala.concurrent.ExecutionContext executor)
+When this action is completed, either through an exception, 
or a value, applies the provided
+ function.
+
+
+
+SimpleFutureAction
+ready(scala.concurrent.duration.Duration atMost,
+ scala.concurrent.CanAwait permit)
+Blocks until this action completes.
+
+
+
+T
+result(scala.concurrent.duration.Duration atMost,
+  scala.concurrent.CanAwait permit)
+Awaits and returns the result (of type T) of this 
action.
+
+
+
+ scala.concurrent.Future
+transform(scala.Function1,scala.util.Try> f,
+ scala.concurrent.ExecutionContext e) 
+
+
+ scala.concurrent.Future
+transformWith(scala.Function1,scala.concurrent.Future> f,
+ scala.concurrent.ExecutionContext e) 
+
+
+scala.Option>
+value()
+The value of this Future.
+
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface org.apache.spark.FutureAction
+get
+
+
+
+
+
+Methods inherited from interface scala.concurrent.Future
+andThen, collect, failed, fallbackTo, filter, flatMap, foreach, map, 
mapTo, onFailure, onSuccess, recover, recoverWith, transform, withFilter, 
zip
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+cancel
+public void cancel()
+Description copied from 
interface: FutureAction
+Cancels the execution of this action.
+
+Specified by:
+cancel in
 interface FutureAction
+
+
+
+
+
+
+
+
+ready
+public SimpleFutureAction ready(scala.concurrent.duration.Duration atMost,
+   scala.concurrent.CanAwait permit)
+Description copied from 
interface: FutureAction
+Blocks until this action completes.
+ 
+
+Specified by:
+ready in
 interface FutureAction
+Specified by:
+ready in 
interface scala.concurrent.Awaitable
+Parameters:
+atMost - maximum wait time, which may be negative (no waiting 
is done), Duration.Inf
+   for unbounded waiting, or a finite positive duration
+permit - (undocumented)
+Returns:
+this FutureAction
+
+
+
+
+
+
+
+

[01/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

Repository: spark-website
Updated Branches:
  refs/heads/asf-site 806a1bd52 -> 04a27dbf1


http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/r/PairwiseRRDD.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/api/r/PairwiseRRDD.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/r/PairwiseRRDD.html
new file mode 100644
index 000..fb6f330
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/r/PairwiseRRDD.html
@@ -0,0 +1,322 @@
+PairwiseRRDD (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark.api.r
+Class PairwiseRRDD
+
+
+
+Object
+
+
+org.apache.spark.rdd.RDD
+
+
+org.apache.spark.api.r.BaseRRDD>
+
+
+org.apache.spark.api.r.PairwiseRRDD
+
+
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, Logging
+
+
+
+public class PairwiseRRDD
+extends BaseRRDD>
+Form an RDD[(Int, Array[Byte])] from key-value pairs 
returned from R.
+ This is used by SparkR's shuffle operations.
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+PairwiseRRDD(RDD parent,
+int numPartitions,
+byte[] hashFunc,
+String deserializer,
+byte[] packageNames,
+Object[] broadcastVars,
+scala.reflect.ClassTag evidence$3) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+JavaPairRDD
+asJavaPairRDD() 
+
+
+
+
+
+
+Methods inherited from class org.apache.spark.api.r.BaseRRDD
+compute,
 getPartitions
+
+
+
+
+
+Methods inherited from class org.apache.spark.rdd.RDD
+aggregate,
 cache, cartesian,
 checkpoint,
 coalesce,
 collect, 
collect,
 context, 
count, countApprox, countApproxDistinct,
 countApproxDistinct,
 countByValue,
 countByValueApprox,
 dependencies,
 distinct, distinct,
 doubleRDDToDoubleRDDFunctions,
 filter,
 first, flatMap,
 fold,
 foreach,
 foreachPartition,
 getCheckpointFile,
 getNumPartitions,
 getStorageLevel,
 glom, groupBy,
 groupBy, groupBy,
 id, intersection,
 intersection,
 intersection,
 isCheckpointed,
 isEmpty, 
iterator,
 keyBy, localCheckpoint,
 map,
 mapPartitions,
 mapPartitionsWithIndex,
 max,
 min,
 name, numericRDDToDoubleRDDFunctions,
 partitioner, partitions,
 persist, 
persist,
 pipe,
 pipe,
 pipe,
 preferredLocations,
 randomSplit,
 rddToAsyncRDDActions,
 rddToOrderedRDDFunctions,
 rddToPairRDDFunctions,
 rddToSequenceFileRDDFunctions,
 reduce,
 repartition,
 sample,
 saveAsObjectFile,
 saveAsTextFile,
 saveAsTextFile,
 setName,
 sortBy,
 sparkContext,
 subtract,
 subtract,
 subtract, take, takeOrdered,
 takeSample,
 toDebugString,
 toJavaRDD, 
toLocalIterator,
 top,
 toString, treeAggregate,
 treeReduce, union,
 unpersist,
 zip,
 zipPartitions,
 zipPartitions,
 zipPartitions,
 zipPartitions,
 zipPartitions,
 zipPartitions,
 zipWithIndex,
 zipWithUniqueId
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface org.apache.spark.internal.Logging
+initializeLogging,
 initializeLogIfNecessary,
 initializeLogIfNecessary,
 isTraceEnabled,
 log_, 
log, 
logDebug,
 logDebug,
 logError,
 logError,
 logInfo,
 logInfo,
 logName,
 logTrace,
 logTrace,
 logWarning,
 logWarning
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+PairwiseRRDD
+public PairwiseRRDD(RDD parent,
+int numPartitions,
+byte[] hashFunc,
+String deserializer,
+byte[] packageNames,
+Object[] broadcastVars,
+scala.reflect.ClassTag evidence$3)
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+asJava

[04/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachFunction.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachFunction.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachFunction.html
new file mode 100644
index 000..89baac3
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachFunction.html
@@ -0,0 +1,239 @@
+ForeachFunction (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark.api.java.function
+Interface 
ForeachFunction
+
+
+
+
+
+
+All Superinterfaces:
+java.io.Serializable
+
+
+Functional Interface:
+This is a functional interface and can therefore be used as the assignment 
target for a lambda expression or method reference.
+
+
+
+@FunctionalInterface
+public interface ForeachFunction
+extends java.io.Serializable
+Base interface for a function used in Dataset's foreach 
function.
+
+ Spark will invoke the call function on each element in the input 
Dataset.
+
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Abstract Methods 
+
+Modifier and Type
+Method and Description
+
+
+void
+call(T t) 
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+
+
+call
+void call(T t)
+   throws Exception
+
+Throws:
+Exception
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachPartitionFunction.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachPartitionFunction.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachPartitionFunction.html
new file mode 100644
index 000..09d4281
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachPartitionFunction.html
@@ -0,0 +1,235 @@
+ForeachPartitionFunction (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark.api.java.function
+Interface 
ForeachPartitionFunction
+
+
+
+
+
+
+All Superinterfaces:
+java.io.Serializable
+
+
+Functional Interface:
+This is a functional interface and can therefore be used as the assignment 
target for a lambda expression or method re

[06/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkStatusTracker.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkStatusTracker.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkStatusTracker.html
new file mode 100644
index 000..1a9cef2
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkStatusTracker.html
@@ -0,0 +1,354 @@
+JavaSparkStatusTracker (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark.api.java
+Class 
JavaSparkStatusTracker
+
+
+
+Object
+
+
+org.apache.spark.api.java.JavaSparkStatusTracker
+
+
+
+
+
+
+
+
+public class JavaSparkStatusTracker
+extends Object
+Low-level status reporting APIs for monitoring job and 
stage progress.
+ 
+ These APIs intentionally provide very weak consistency semantics; consumers 
of these APIs should
+ be prepared to handle empty / missing information.  For example, a job's 
stage ids may be known
+ but the status API may not have any information about the details of those 
stages, so
+ getStageInfo could potentially return null for a 
valid stage id.
+ 
+ To limit memory usage, these APIs only provide information on recent jobs / 
stages.  These APIs
+ will provide information for the last spark.ui.retainedStages 
stages and
+ spark.ui.retainedJobs jobs.
+ 
+
+Note:
+This class's constructor should be considered private and may be subject 
to change.
+
+
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+int[]
+getActiveJobIds()
+Returns an array containing the ids of all active 
jobs.
+
+
+
+int[]
+getActiveStageIds()
+Returns an array containing the ids of all active 
stages.
+
+
+
+int[]
+getJobIdsForGroup(String jobGroup)
+Return a list of all known jobs in a particular job 
group.
+
+
+
+SparkJobInfo
+getJobInfo(int jobId)
+Returns job information, or null if the job 
info could not be found or was garbage collected.
+
+
+
+SparkStageInfo
+getStageInfo(int stageId)
+Returns stage information, or null if the 
stage info could not be found or was
+ garbage collected.
+
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+getJobIdsForGroup
+public int[] getJobIdsForGroup(String jobGroup)
+Return a list of all known jobs in a particular job group.  
If jobGroup is null, then
+ returns all known jobs that are not associated with a job group.
+ 
+ The returned list may contain running, failed, and completed jobs, and may 
vary across
+ invocations of this method.  This method does not guarantee the order of the 
elements in
+ its result.
+
+Parameters:
+jobGroup - (undocumented)
+Returns:
+(undocumented)
+
+
+
+
+
+
+
+
+getActiveStageIds
+public int[] getActiveStageIds()
+Returns an array containing the ids of all active stages.
+ 
+ This method does not guarantee the order of the elements in its result.
+
+Returns:
+(undocumented)
+
+
+
+
+
+
+
+
+getActiveJobIds
+public int[] getActiveJobIds()
+Returns an array containing the ids of all active jobs.
+ 
+ This method does not guarantee the order of the elements in its result.
+
+Returns:
+(undocumented)
+
+
+
+
+
+
+
+
+getJobInfo
+public SparkJobInfo getJobInfo(int jobId)
+Returns job information, or null if the job 
info could not be found or was garbage collected.
+
+Parameters:
+jobId - (undocumented)
+Returns:
+(undocumented)
+
+
+
+
+
+
+
+
+getStageInfo
+public SparkStageInfo getStageInfo(int stageId)
+Returns stage information, or null if the 
stage info could not be found or was
+ garbage collected.
+
+Parameters:
+stageId - (undocumented)
+Returns:
+(undocumented)
+
+
+
+
+
+
+
+
+
+
+
+
+
+

[36/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/write.jdbc.html
--
diff --git a/site/docs/2.3.2/api/R/write.jdbc.html 
b/site/docs/2.3.2/api/R/write.jdbc.html
new file mode 100644
index 000..8906d4d
--- /dev/null
+++ b/site/docs/2.3.2/api/R/write.jdbc.html
@@ -0,0 +1,148 @@
+R: Save the content of SparkDataFrame to an external database table via JDBC
+
+write.jdbc {SparkR}    R Documentation
+
+Save the content of SparkDataFrame to an external database table via 
JDBC.
+
+Description
+
+Save the content of the SparkDataFrame to an external database table via 
JDBC. Additional JDBC
+database connection properties can be set (...)
+
+
+
+Usage
+
+
+write.jdbc(x, url, tableName, mode = "error", ...)
+
+## S4 method for signature 'SparkDataFrame,character,character'
+write.jdbc(x, url,
+  tableName, mode = "error", ...)
+
+
+
+Arguments
+
+
+x
+
+a SparkDataFrame.
+
+url
+
+JDBC database url of the form jdbc:subprotocol:subname.
+
+tableName
+
+the name of the table in the external database.
+
+mode
+
+one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore';
+the save mode ('error' by default).
+
+...
+
+additional JDBC database connection properties.
+
+
+
+
+Details
+
+Also, mode is used to specify the behavior of the save operation when
+data already exists in the data source. There are four modes:
+
+
+
+ 'append': Contents of this SparkDataFrame are expected to be appended 
to existing data.
+
+
+ 'overwrite': Existing data is expected to be overwritten by the 
contents of this
+SparkDataFrame.
+
+
+ 'error' or 'errorifexists': An exception is expected to be thrown.
+
+
+ 'ignore': The save operation is expected to not save the contents of 
the SparkDataFrame
+and to not change the existing data.
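A minimal sketch of the 'append' mode described above, assuming a placeholder JDBC URL, table name, and credentials:

jdbcUrl <- "jdbc:postgresql://localhost:5432/databasename"
write.jdbc(df, jdbcUrl, "events", mode = "append",
           user = "username", password = "password")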
+
+
+
+
+
+Note
+
+write.jdbc since 2.0.0
+
+
+
+See Also
+
+Other SparkDataFrame functions: SparkDataFrame-class,
+agg, alias,
+arrange, as.data.frame,
+attach,SparkDataFrame-method,
+broadcast, cache,
+checkpoint, coalesce,
+collect, colnames,
+coltypes,
+createOrReplaceTempView,
+crossJoin, cube,
+dapplyCollect, dapply,
+describe, dim,
+distinct, dropDuplicates,
+dropna, drop,
+dtypes, except,
+explain, filter,
+first, gapplyCollect,
+gapply, getNumPartitions,
+group_by, head,
+hint, histogram,
+insertInto, intersect,
+isLocal, isStreaming,
+join, limit,
+localCheckpoint, merge,
+mutate, ncol,
+nrow, persist,
+printSchema, randomSplit,
+rbind, registerTempTable,
+rename, repartition,
+rollup, sample,
+saveAsTable, schema,
+selectExpr, select,
+showDF, show,
+storageLevel, str,
+subset, summary,
+take, toJSON,
+unionByName, union,
+unpersist, withColumn,
+withWatermark, with,
+write.df, write.json,
+write.orc, write.parquet,
+write.stream, write.text
+
+
+
+Examples
+
+## Not run: 
+##D sparkR.session()
+##D jdbcUrl <- "jdbc:mysql://localhost:3306/databasename"
+##D write.jdbc(df, jdbcUrl, "table", user = "username", 
password = "password")
+## End(Not run)
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/write.json.html
--
diff --git a/site/docs/2.3.2/api/R/write.json.html 
b/site/docs/2.3.2/api/R/write.json.html
new file mode 100644
index 000..23eb21b
--- /dev/null
+++ b/site/docs/2.3.2/api/R/write.json.html
@@ -0,0 +1,122 @@
+R: Save the contents of SparkDataFrame as a JSON file
+
+write.json {SparkR}    R Documentation
+
+Save the contents of SparkDataFrame as a JSON file
+
+Description
+
+Save the contents of a SparkDataFrame as a JSON file (JSON Lines text format
+or newline-delimited JSON, http://jsonlines.org/). Files written out
+with this method can be read back in as a SparkDataFrame using read.json().
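A minimal sketch of that round trip, assuming a placeholder output path:

write.json(df, "path/to/people_json", mode = "overwrite")
df2 <- read.json("path/to/people_json")   # read the JSON Lines data back in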
+
+
+
+Usage
+
+
+write.json(x, path, ...)
+
+## S4 method for signature 'SparkDataFrame,character'
+write.json(x, path, mode = "error",
+  ...)
+
+
+
+Arguments
+
+
+x
+
+A SparkDataFrame
+
+path
+
+The directory where the file is saved
+
+...
+
+additional argument(s) passed to the method.
+
+mode
+
+one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore';
+the save mode ('error' by default).
+
+
+
+
+Not

[33/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/constant-values.html
--
diff --git a/site/docs/2.3.2/api/java/constant-values.html 
b/site/docs/2.3.2/api/java/constant-values.html
new file mode 100644
index 000..0882fdc
--- /dev/null
+++ b/site/docs/2.3.2/api/java/constant-values.html
@@ -0,0 +1,263 @@
+Constant Field Values (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+
+
+Constant Field Values
+Contents
+
+org.apache.*
+
+
+
+
+
+org.apache.*
+
+
+
+org.apache.spark.launcher.SparkLauncher 
+
+Modifier and Type
+Constant Field
+Value
+
+
+
+
+
+public static final String
+CHILD_CONNECTION_TIMEOUT
+"spark.launcher.childConectionTimeout"
+
+
+
+
+public static final String
+CHILD_PROCESS_LOGGER_NAME
+"spark.launcher.childProcLoggerName"
+
+
+
+
+public static final String
+DEPLOY_MODE
+"spark.submit.deployMode"
+
+
+
+
+public static final String
+DRIVER_EXTRA_CLASSPATH
+"spark.driver.extraClassPath"
+
+
+
+
+public static final String
+DRIVER_EXTRA_JAVA_OPTIONS
+"spark.driver.extraJavaOptions"
+
+
+
+
+public static final String
+DRIVER_EXTRA_LIBRARY_PATH
+"spark.driver.extraLibraryPath"
+
+
+
+
+public static final String
+DRIVER_MEMORY
+"spark.driver.memory"
+
+
+
+
+public static final String
+EXECUTOR_CORES
+"spark.executor.cores"
+
+
+
+
+public static final String
+EXECUTOR_EXTRA_CLASSPATH
+"spark.executor.extraClassPath"
+
+
+
+
+public static final String
+EXECUTOR_EXTRA_JAVA_OPTIONS
+"spark.executor.extraJavaOptions"
+
+
+
+
+public static final String
+EXECUTOR_EXTRA_LIBRARY_PATH
+"spark.executor.extraLibraryPath"
+
+
+
+
+public static final String
+EXECUTOR_MEMORY
+"spark.executor.memory"
+
+
+
+
+public static final String
+NO_RESOURCE
+"spark-internal"
+
+
+
+
+public static final String
+SPARK_MASTER
+"spark.master"
+
+
+
+
+
+
+
+
+org.apache.spark.util.kvstore.KVIndex 
+
+Modifier and Type
+Constant Field
+Value
+
+
+
+
+
+public static final String
+NATURAL_INDEX_NAME
+"__main__"
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/deprecated-list.html
--
diff --git a/site/docs/2.3.2/api/java/deprecated-list.html 
b/site/docs/2.3.2/api/java/deprecated-list.html
new file mode 100644
index 000..9557476
--- /dev/null
+++ b/site/docs/2.3.2/api/java/deprecated-list.html
@@ -0,0 +1,827 @@
+Deprecated List (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+
+
+Deprecated API
+Contents
+
+Deprecated Interfaces
+Deprecated Classes
+Deprecated Methods
+Deprecated Constructors
+
+
+
+
+
+
+
+
+Deprecated Interfaces 
+
+Interface and Description
+
+
+
+org.apache.spark.AccumulableParam
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+org.apache.spark.AccumulatorParam
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+
+
+
+
+
+
+
+
+
+Deprecated Classes 
+
+Class and Description
+
+
+
+org.apache.spark.Accumulable
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+org.apache.spark.Accumulator
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+org.apache.spark.AccumulatorParam.DoubleAccumulatorParam$
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+org.apache.spark.AccumulatorParam.FloatAccumulatorP

[08/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html
new file mode 100644
index 000..415bc32
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html
@@ -0,0 +1,2086 @@
+JavaRDDLike (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark.api.java
+Interface 
JavaRDDLike>
+
+
+
+
+
+
+All Superinterfaces:
+java.io.Serializable
+
+
+All Known Implementing Classes:
+JavaDoubleRDD, JavaHadoopRDD, JavaNewHadoopRDD, JavaPairRDD, JavaRDD
+
+
+
+public interface JavaRDDLike>
+extends scala.Serializable
+Defines operations common to several Java RDD 
implementations.
+ 
+
+Note:
+This trait is not intended to be implemented by user code.
+
+
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Abstract Methods 
+
+Modifier and Type
+Method and Description
+
+
+ U
+aggregate(U zeroValue,
+ Function2 seqOp,
+ Function2 combOp)
+Aggregate the elements of each partition, and then the 
results for all the partitions, using
+ given combine functions and a neutral "zero value".
+
+
+
+ JavaPairRDD
+cartesian(JavaRDDLike other)
+Return the Cartesian product of this RDD and another one, 
that is, the RDD of all pairs of
+ elements (a, b) where a is in this and b is in 
other.
+
+
+
+void
+checkpoint()
+Mark this RDD for checkpointing.
+
+
+
+scala.reflect.ClassTag
+classTag() 
+
+
+java.util.List
+collect()
+Return an array that contains all of the elements in this 
RDD.
+
+
+
+JavaFutureAction>
+collectAsync()
+The asynchronous version of collect, which 
returns a future for
+ retrieving an array containing all of the elements in this RDD.
+
+
+
+java.util.List[]
+collectPartitions(int[] partitionIds)
+Return an array that contains all of the elements in a 
specific partition of this RDD.
+
+
+
+SparkContext
+context()
+The SparkContext that this RDD was created 
on.
+
+
+
+long
+count()
+Return the number of elements in the RDD.
+
+
+
+PartialResult
+countApprox(long timeout)
+Approximate version of count() that returns a potentially 
incomplete result
+ within a timeout, even if not all tasks have finished.
+
+
+
+PartialResult
+countApprox(long timeout,
+   double confidence)
+Approximate version of count() that returns a potentially 
incomplete result
+ within a timeout, even if not all tasks have finished.
+
+
+
+long
+countApproxDistinct(double relativeSD)
+Return approximate number of distinct elements in the 
RDD.
+
+
+
+JavaFutureAction
+countAsync()
+The asynchronous version of count, which 
returns a
+ future for counting the number of elements in this RDD.
+
+
+
+java.util.Map
+countByValue()
+Return the count of each unique value in this RDD as a map 
of (value, count) pairs.
+
+
+
+PartialResult>
+countByValueApprox(long timeout)
+Approximate version of countByValue().
+
+
+
+PartialResult

[15/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html 
b/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html
new file mode 100644
index 000..c1fc564
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html
@@ -0,0 +1,448 @@
+TaskCommitDenied (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark
+Class TaskCommitDenied
+
+
+
+Object
+
+
+org.apache.spark.TaskCommitDenied
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, TaskEndReason, TaskFailedReason, scala.Equals, scala.Product
+
+
+
+public class TaskCommitDenied
+extends Object
+implements TaskFailedReason, scala.Product, 
scala.Serializable
+:: DeveloperApi ::
+ Task requested the driver to commit, but was denied.
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+TaskCommitDenied(int jobID,
+int partitionID,
+int attemptNumber) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Abstract Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+int
+attemptNumber() 
+
+
+abstract static boolean
+canEqual(Object that) 
+
+
+boolean
+countTowardsTaskFailures()
+If a task failed because its attempt to commit was denied, 
do not count this failure
+ towards failing the stage.
+
+
+
+abstract static boolean
+equals(Object that) 
+
+
+int
+jobID() 
+
+
+int
+partitionID() 
+
+
+abstract static int
+productArity() 
+
+
+abstract static Object
+productElement(int n) 
+
+
+static 
scala.collection.Iterator
+productIterator() 
+
+
+static String
+productPrefix() 
+
+
+String
+toErrorString()
+Error message displayed in the web UI.
+
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface scala.Product
+productArity, productElement, productIterator, productPrefix
+
+
+
+
+
+Methods inherited from interface scala.Equals
+canEqual, equals
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+TaskCommitDenied
+public TaskCommitDenied(int jobID,
+int partitionID,
+int attemptNumber)
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+canEqual
+public abstract static boolean canEqual(Object that)
+
+
+
+
+
+
+
+equals
+public abstract static boolean equals(Object that)
+
+
+
+
+
+
+
+productElement
+public abstract static Object productElement(int n)
+
+
+
+
+
+
+
+productArity
+public abstract static int productArity()
+
+
+
+
+
+
+
+productIterator
+public 
static scala.collection.Iterator productIterator()
+
+
+
+
+
+
+
+productPrefix
+public static String productPrefix()
+
+
+
+
+
+
+
+jobID
+public int jobID()
+
+
+
+
+
+
+
+partitionID
+public int partitionID()
+
+
+
+
+
+
+
+attemptNumber
+public int attemptNumber()
+
+
+
+
+
+
+
+toErrorString
+public String toErrorString()
+Description copied from 
interface: TaskFailedReason
+Error message displayed in the web UI.
+
+Specified by:
+toErrorString in
 interface TaskFailedReason
+
+
+
+
+
+
+
+
+countTowardsTaskFailures
+public boolean countTowardsTaskFailures()
+If a task failed because its attempt to commit was denied, 
do not count this failure
+ towards failing the stage. This is intended to prevent spurious stage 
failures in cases
+ where many speculative tasks are launched and denied to commit.
+
+Specified by:
+countTowardsTaskFailures in
 interface TaskFailedReason
+Returns:
+(undocumented)
+
+
+
+
+
+
+
+
+
+
+
+
+
+

[37/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/summary.html
--
diff --git a/site/docs/2.3.2/api/R/summary.html 
b/site/docs/2.3.2/api/R/summary.html
new file mode 100644
index 000..0cd3241
--- /dev/null
+++ b/site/docs/2.3.2/api/R/summary.html
@@ -0,0 +1,150 @@
+R: summary
+
+summary {SparkR}    R Documentation
+
+summary
+
+Description
+
+Computes specified statistics for numeric and string columns. Available 
statistics are:
+
+
+
+ count
+
+
+ mean
+
+
+ stddev
+
+
+ min
+
+
+ max
+
+
+ arbitrary approximate percentiles specified as a percentage (eg, 
"75%")
+
+
+
+If no statistics are given, this function computes count, mean, stddev, min,
+approximate quartiles (percentiles at 25%, 50%, and 75%), and max.
+This function is meant for exploratory data analysis, as we make no guarantee 
about the
+backward compatibility of the schema of the resulting Dataset. If you want to
+programmatically compute summary statistics, use the agg function 
instead.
+
+
+
+Usage
+
+
+summary(object, ...)
+
+## S4 method for signature 'SparkDataFrame'
+summary(object, ...)
+
+
+
+Arguments
+
+
+object
+
+a SparkDataFrame to be summarized.
+
+...
+
+(optional) statistics to be computed for all columns.
+
+
+
+
+Value
+
+A SparkDataFrame.
+
+
+
+Note
+
+summary(SparkDataFrame) since 1.5.0
+
+The statistics provided by summary were changed in 2.3.0; use describe for
+the previous defaults.
+
+
+
+See Also
+
+describe
+
+Other SparkDataFrame functions: SparkDataFrame-class,
+agg, alias,
+arrange, as.data.frame,
+attach,SparkDataFrame-method,
+broadcast, cache,
+checkpoint, coalesce,
+collect, colnames,
+coltypes,
+createOrReplaceTempView,
+crossJoin, cube,
+dapplyCollect, dapply,
+describe, dim,
+distinct, dropDuplicates,
+dropna, drop,
+dtypes, except,
+explain, filter,
+first, gapplyCollect,
+gapply, getNumPartitions,
+group_by, head,
+hint, histogram,
+insertInto, intersect,
+isLocal, isStreaming,
+join, limit,
+localCheckpoint, merge,
+mutate, ncol,
+nrow, persist,
+printSchema, randomSplit,
+rbind, registerTempTable,
+rename, repartition,
+rollup, sample,
+saveAsTable, schema,
+selectExpr, select,
+showDF, show,
+storageLevel, str,
+subset, take,
+toJSON, unionByName,
+union, unpersist,
+withColumn, withWatermark,
+with, write.df,
+write.jdbc, write.json,
+write.orc, write.parquet,
+write.stream, write.text
+
+
+
+Examples
+
+## Not run: 
+##D sparkR.session()
+##D path <- "path/to/file.json"
+##D df <- read.json(path)
+##D summary(df)
+##D summary(df, "min", "25%", "75%", 
"max")
+##D summary(select(df, "age", "height"))
+## End(Not run)
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/tableNames.html
--
diff --git a/site/docs/2.3.2/api/R/tableNames.html 
b/site/docs/2.3.2/api/R/tableNames.html
new file mode 100644
index 000..bc07dad
--- /dev/null
+++ b/site/docs/2.3.2/api/R/tableNames.html
@@ -0,0 +1,61 @@
+R: Table Names
+
+tableNames {SparkR}    R Documentation
+
+Table Names
+
+Description
+
+Returns the names of tables in the given database as an array.
+
+
+
+Usage
+
+
+## Default S3 method:
+tableNames(databaseName = NULL)
+
+
+
+Arguments
+
+
+databaseName
+
+(optional) name of the database
+
+
+
+
+Value
+
+a list of table names
+
+
+
+Note
+
+tableNames since 1.4.0
+
+
+
+Examples
+
+## Not run: 
+##D sparkR.session()
+##D tableNames("hive")
+## End(Not run)
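A minimal sketch feeding a returned name into tableToDF (documented below), following the "hive" example above:

tbls <- tableNames("hive")
if (length(tbls) > 0) df <- tableToDF(tbls[[1]])   # load the first table as a SparkDataFrame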
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/tableToDF.html
--
diff --git a/site/docs/2.3.2/api/R/tableToDF.html 
b/site/docs/2.3.2/api/R/tableToDF.html
new file mode 100644
index 000..803a0d0
--- /dev/null
+++ b/site/docs/2.3.2/api/R/tableToDF.html
@@ -0,0 +1,67 @@
+R: Create a SparkDataFrame from a SparkSQL table or view

[12/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaFutureAction.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaFutureAction.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaFutureAction.html
new file mode 100644
index 000..a43e61c
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaFutureAction.html
@@ -0,0 +1,237 @@
+JavaFutureAction (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+org.apache.spark.api.java
+Interface 
JavaFutureAction
+
+
+
+
+
+
+All Superinterfaces:
+java.util.concurrent.Future
+
+
+
+public interface JavaFutureAction
+extends java.util.concurrent.Future
+
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Abstract Methods 
+
+Modifier and Type
+Method and Description
+
+
+java.util.List
+jobIds()
+Returns the job IDs run by the underlying async 
operation.
+
+
+
+
+
+
+
+Methods inherited from interface java.util.concurrent.Future
+cancel, get, get, isCancelled, isDone
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+jobIds
+java.util.List jobIds()
+Returns the job IDs run by the underlying async operation.
+
+ This returns the current snapshot of the job list. Certain operations may run 
multiple
+ jobs, so multiple calls to this method may return different lists.
+
+
+
+
+
+
+
+
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
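
As a companion to the jobIds() description on the JavaFutureAction page above, here is a
minimal, hedged sketch of how the interface is usually obtained from one of the asynchronous
RDD actions; the local master and the small in-memory dataset are illustrative choices, not
anything taken from the page itself.

    import java.util.Arrays;
    import java.util.List;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaFutureAction;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class JobIdsExample {
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("JobIdsExample").setMaster("local[2]");
        try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
          JavaRDD<Integer> rdd = jsc.parallelize(Arrays.asList(1, 2, 3, 4, 5));

          // collectAsync() returns immediately with a JavaFutureAction.
          JavaFutureAction<List<Integer>> future = rdd.collectAsync();

          // jobIds() is only a snapshot; it can grow while the action is still running.
          System.out.println("jobs so far: " + future.jobIds());

          List<Integer> result = future.get();   // blocking get() from java.util.concurrent.Future
          System.out.println("result: " + result + ", jobs: " + future.jobIds());
        }
      }
    }
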

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaHadoopRDD.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaHadoopRDD.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaHadoopRDD.html
new file mode 100644
index 000..4d2f3bd
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaHadoopRDD.html
@@ -0,0 +1,339 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+JavaHadoopRDD (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark.api.java
+Class 
JavaHadoopRDD
+
+
+
+Object
+
+
+org.apache.spark.api.java.JavaPairRDD
+
+
+org.apache.spark.api.java.JavaHadoopRDD
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, JavaRDDLike,JavaPairRDD>
+
+
+
+public class JavaHadoopRDD
+exte

[48/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/00frame_toc.html
--
diff --git a/site/docs/2.3.2/api/R/00frame_toc.html 
b/site/docs/2.3.2/api/R/00frame_toc.html
new file mode 100644
index 000..f627d53
--- /dev/null
+++ b/site/docs/2.3.2/api/R/00frame_toc.html
@@ -0,0 +1,323 @@
+
+
+
+
+
+R Documentation of SparkR
+
+
+window.onload = function() {
+  var imgs = document.getElementsByTagName('img'), i, img;
+  for (i = 0; i < imgs.length; i++) {
+img = imgs[i];
+// center an image if it is the only element of its parent
+if (img.parentElement.childElementCount === 1)
+  img.parentElement.style.textAlign = 'center';
+  }
+};
+
+
+
+
+
+
+
+* {
+   font-family: "Trebuchet MS", "Lucida Grande", "Lucida Sans Unicode", 
"Lucida Sans", Arial, sans-serif;
+   font-size: 14px;
+}
+body {
+  padding: 0 5px; 
+  margin: 0 auto; 
+  width: 80%;
+  max-width: 60em; /* 960px */
+}
+
+h1, h2, h3, h4, h5, h6 {
+   color: #666;
+}
+h1, h2 {
+   text-align: center;
+}
+h1 {
+   font-size: x-large;
+}
+h2, h3 {
+   font-size: large;
+}
+h4, h6 {
+   font-style: italic;
+}
+h3 {
+   border-left: solid 5px #ddd;
+   padding-left: 5px;
+   font-variant: small-caps;
+}
+
+p img {
+   display: block;
+   margin: auto;
+}
+
+span, code, pre {
+   font-family: Monaco, "Lucida Console", "Courier New", Courier, 
monospace;
+}
+span.acronym {}
+span.env {
+   font-style: italic;
+}
+span.file {}
+span.option {}
+span.pkg {
+   font-weight: bold;
+}
+span.samp{}
+
+dt, p code {
+   background-color: #F7F7F7;
+}
+
+
+
+
+
+
+
+
+SparkR
+
+
+AFTSurvivalRegressionModel-class
+ALSModel-class
+BisectingKMeansModel-class
+DecisionTreeClassificationModel-class
+DecisionTreeRegressionModel-class
+FPGrowthModel-class
+GBTClassificationModel-class
+GBTRegressionModel-class
+GaussianMixtureModel-class
+GeneralizedLinearRegressionModel-class
+GroupedData
+IsotonicRegressionModel-class
+KMeansModel-class
+KSTest-class
+LDAModel-class
+LinearSVCModel-class
+LogisticRegressionModel-class
+MultilayerPerceptronClassificationModel-class
+NaiveBayesModel-class
+RandomForestClassificationModel-class
+RandomForestRegressionModel-class
+SparkDataFrame
+StreamingQuery
+WindowSpec
+alias
+approxQuantile
+arrange
+as.data.frame
+attach
+avg
+awaitTermination
+between
+broadcast
+cache
+cacheTable
+cancelJobGroup
+cast
+checkpoint
+clearCache
+clearJobGroup
+coalesce
+collect
+coltypes
+column
+columnaggregatefunctions
+columncollectionfunctions
+columndatetimediff_functions
+columndatetimefunctions
+columnmathfunctions
+columnmiscfunctions
+columnnonaggregatefunctions
+columnstringfunctions
+columnwindowfunctions
+columnfunctions
+columns
+corr
+count
+cov
+createDataFrame
+createExternalTable-deprecated
+createOrReplaceTempView
+createTable
+crossJoin
+crosstab
+cube
+currentDatabase
+dapply
+dapplyCollect
+describe
+dim
+distinct
+drop
+dropDuplicates
+dropTempTable-deprecated
+dropTempView
+dtypes
+endsWith
+eqnullsafe
+except
+explain
+filter
+first
+fitted
+freqItems
+gapply
+gapplyCollect
+getLocalProperty
+getNumPartitions
+glm
+groupBy
+hashCode
+head
+hint
+histogram
+insertInto
+install.spark
+intersect
+isActive
+isLocal
+isStreaming
+join
+last
+lastProgress
+limit
+listColumns
+listDatabases
+listFunctions
+listTables
+localCheckpoint
+match
+merge
+mutate
+nafunctions
+ncol
+not
+nrow
+orderBy
+otherwise
+over
+partitionBy
+persist
+pivot
+predict
+print.jobj
+print.structField
+print.structType
+printSchema
+queryName
+randomSplit
+rangeBetween
+rbind
+read.df
+read.jdbc
+read.json
+read.ml
+read.orc
+read.parquet
+read.stream
+read.text
+recoverPartitions
+refreshByPath
+refreshTable
+registerTempTable-deprecated
+rename
+repartition
+rollup
+rowsBetween
+sample
+sampleBy
+saveAsTable
+schema
+select
+selectExpr
+setCheckpointDir
+setCurrentDatabase
+setJobDescription
+setJobGroup
+setLocalProperty
+setLogLevel
+show
+showDF
+spark.addFile
+spark.als
+spark.bisectingKmeans
+spark.decisionTree
+spark.fpGrowth
+spark.gaussianMixture
+spark.gbt
+spark.getSparkFiles
+spark.getSparkFilesRootDirectory
+spark.glm
+spark.isoreg
+spark.kmeans
+spark.kstest
+spark.lapply
+spark.lda
+spark.logit
+spark.mlp
+spark.naiveBayes
+spark.randomForest
+spark.survreg
+spark.svmLinear
+sparkR.callJMethod
+sparkR.callJStatic
+sparkR.conf
+sparkR.init-deprecated
+sparkR.newJObject
+sparkR.session
+sparkR.session.stop
+sparkR.uiWebUrl
+sparkR.version
+sparkRHive.init-deprecated
+sparkRSQL.init-deprecated
+sql
+startsWith
+status
+stopQuery
+storageLevel
+str
+structField
+structType
+subset
+substr
+summarize
+summary
+tableNames
+tableToDF
+tables
+take
+toJSON
+uncacheTable
+union
+unionByName
+unpersist
+windowOrderBy
+windowPartitionBy
+with
+withColumn
+withWatermark
+write.df
+write.jdbc
+write.json
+write.ml
+write.orc
+write.parquet
+write.stream
+write.text
+
+
+Generat

[28/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/AccumulatorParam.IntAccumulatorParam$.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/AccumulatorParam.IntAccumulatorParam$.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/AccumulatorParam.IntAccumulatorParam$.html
new file mode 100644
index 000..f6890a4
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/AccumulatorParam.IntAccumulatorParam$.html
@@ -0,0 +1,379 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+AccumulatorParam.IntAccumulatorParam$ (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":42,"i1":42};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"],32:["t6","Deprecated Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class 
AccumulatorParam.IntAccumulatorParam$
+
+
+
+Object
+
+
+org.apache.spark.AccumulatorParam.IntAccumulatorParam$
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, AccumulableParam, AccumulatorParam
+
+
+Enclosing interface:
+AccumulatorParam
+
+
+Deprecated. 
+use AccumulatorV2. Since 
2.0.0.
+
+
+public static class AccumulatorParam.IntAccumulatorParam$
+extends Object
+implements AccumulatorParam
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+
+
+
+Nested classes/interfaces inherited from 
interface org.apache.spark.AccumulatorParam
+AccumulatorParam.DoubleAccumulatorParam$, 
AccumulatorParam.FloatAccumulatorParam$, 
AccumulatorParam.IntAccumulatorParam$, AccumulatorParam.LongAccumulatorParam$, 
AccumulatorParam.StringAccumulatorParam$
+
+
+
+
+
+
+
+
+Field Summary
+
+Fields 
+
+Modifier and Type
+Field and Description
+
+
+static AccumulatorParam.IntAccumulatorParam$
+MODULE$
+Deprecated. 
+Static reference to the singleton instance of this Scala 
object.
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+IntAccumulatorParam$()
+Deprecated. 
+ 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods Deprecated Methods 
+
+Modifier and Type
+Method and Description
+
+
+int
+addInPlace(int t1,
+  int t2)
+Deprecated. 
+ 
+
+
+int
+zero(int initialValue)
+Deprecated. 
+ 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface org.apache.spark.AccumulatorParam
+addAccumulator
+
+
+
+
+
+Methods inherited from interface org.apache.spark.AccumulableParam
+addInPlace,
 zero
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Field Detail
+
+
+
+
+
+MODULE$
+public static final AccumulatorParam.IntAccumulatorParam$ 
MODULE$
+Deprecated. 
+Static reference to the singleton instance of this Scala 
object.
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+IntAccumulatorParam$
+public IntAccumulatorParam$()
+Deprecated. 
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+addInPlace
+public int addInPlace(int t1,
+  int t2)
+Deprecated. 
+
+
+
+
+
+
+
+zero
+public int zero(int initialValue)
+Deprecated. 
+
+
+
+
+
+
+
+
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
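
Because the page above marks AccumulatorParam.IntAccumulatorParam$ as deprecated in favour of
AccumulatorV2, a brief sketch of the replacement may help readers migrating. It uses the
built-in LongAccumulator reached through the underlying SparkContext; the accumulator name and
the sample data are assumptions made purely for illustration.

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.util.LongAccumulator;

    public class AccumulatorV2Example {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("AccumulatorV2Example").setMaster("local[2]");
        try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
          // LongAccumulator is a built-in AccumulatorV2 implementation.
          LongAccumulator events = jsc.sc().longAccumulator("events");

          jsc.parallelize(Arrays.asList(1, 2, 3, 4, 5))
             .foreach(x -> events.add(1L));

          // Accumulator updates are only guaranteed inside actions; read the value on the driver.
          System.out.println("events = " + events.value());
        }
      }
    }
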

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/AccumulatorParam.LongAccumulatorParam$.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/AccumulatorParam.LongAccumulatorParam$.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/AccumulatorPa

[50/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/_layouts/global.html
--
diff --git a/_layouts/global.html b/_layouts/global.html
index 874138f..03a34db 100644
--- a/_layouts/global.html
+++ b/_layouts/global.html
@@ -121,7 +121,7 @@
   Documentation 
 
 
-  Latest Release (Spark 
2.3.1)
+  Latest Release (Spark 
2.3.2)
   Older Versions and 
Other Resources
   Frequently Asked 
Questions
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/documentation.md
--
diff --git a/documentation.md b/documentation.md
index 8a9b62d..197d4e4 100644
--- a/documentation.md
+++ b/documentation.md
@@ -12,6 +12,7 @@ navigation:
 Setup instructions, programming guides, and other documentation are 
available for each stable version of Spark below:
 
 
+  Spark 2.3.2
   Spark 2.3.1
   Spark 2.3.0
   Spark 2.2.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/js/downloads.js
--
diff --git a/js/downloads.js b/js/downloads.js
index d5ab599..6de20f0 100644
--- a/js/downloads.js
+++ b/js/downloads.js
@@ -29,6 +29,7 @@ var packagesV7 = [hadoop2p7, hadoop2p6, hadoop2p4, hadoop2p3, 
hadoopFree, source
 // 2.2.0+
 var packagesV8 = [hadoop2p7, hadoop2p6, hadoopFree, sources];
 
+addRelease("2.3.2", new Date("09/24/2018"), packagesV8, true, true);
 addRelease("2.3.1", new Date("06/08/2018"), packagesV8, true, true);
 addRelease("2.3.0", new Date("02/28/2018"), packagesV8, true, false);
 addRelease("2.2.2", new Date("07/02/2018"), packagesV8, true, true);

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/releases/_posts/2018-09-25-spark-release-2-3-2.md
--
diff --git a/releases/_posts/2018-09-25-spark-release-2-3-2.md 
b/releases/_posts/2018-09-25-spark-release-2-3-2.md
new file mode 100644
index 000..fa1f1af
--- /dev/null
+++ b/releases/_posts/2018-09-25-spark-release-2-3-2.md
@@ -0,0 +1,23 @@
+---
+layout: post
+title: Spark Release 2.3.2
+categories: []
+tags: []
+status: publish
+type: post
+published: true
+meta:
+  _edit_last: '4'
+  _wpas_done_all: '1'
+---
+
+Spark 2.3.2 is a maintenance release containing stability fixes. This release 
is based on the branch-2.3 maintenance branch of Spark. We strongly recommend 
that all 2.3.x users upgrade to this stable release.
+
+You can consult JIRA for the [detailed 
changes](https://s.apache.org/spark-2.3.2).
+
+### Known issues
+
+ - **SQL**
+   - SPARK-25206: wrong records are returned when the Hive metastore schema and 
the Parquet schema are in different letter cases
+
+We would like to acknowledge all community members for contributing patches to 
this release.

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/committers.html
--
diff --git a/site/committers.html b/site/committers.html
index 3ec90dd..003a303 100644
--- a/site/committers.html
+++ b/site/committers.html
@@ -106,7 +106,7 @@
   Documentation 
 
 
-  Latest Release (Spark 2.3.1)
+  Latest Release (Spark 2.3.2)
   Older Versions and Other 
Resources
   Frequently Asked Questions
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/community.html
--
diff --git a/site/community.html b/site/community.html
index da7e47e..349a8d4 100644
--- a/site/community.html
+++ b/site/community.html
@@ -106,7 +106,7 @@
   Documentation 
 
 
-  Latest Release (Spark 2.3.1)
+  Latest Release (Spark 2.3.2)
   Older Versions and Other 
Resources
   Frequently Asked Questions
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/contributing.html
--
diff --git a/site/contributing.html b/site/contributing.html
index 1dd4823..2dac454 100644
--- a/site/contributing.html
+++ b/site/contributing.html
@@ -106,7 +106,7 @@
   Documentation 
 
 
-  Latest Release (Spark 2.3.1)
+  Latest Release (Spark 2.3.2)
   Older Versions and Other 
Resources
   Frequently Asked Questions
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/developer-tools.html
--
diff --git a/site/developer-tools.html b/site/developer-tools.html
index e2b9217..54d848c 100644
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@ -106,7 +106,7 @@
   Documentation 
 
 
-  Latest Release (Spark 2.3.1)
+ 

[13/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaDoubleRDD.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaDoubleRDD.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaDoubleRDD.html
new file mode 100644
index 000..a65773f
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaDoubleRDD.html
@@ -0,0 +1,2216 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+JavaDoubleRDD (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":9,"i1":10,"i2":9,"i3":9,"i4":10,"i5":10,"i6":10,"i7":9,"i8":9,"i9":9,"i10":9,"i11":9,"i12":9,"i13":9,"i14":9,"i15":9,"i16":9,"i17":9,"i18":9,"i19":10,"i20":10,"i21":10,"i22":10,"i23":9,"i24":9,"i25":9,"i26":9,"i27":9,"i28":9,"i29":9,"i30":9,"i31":9,"i32":9,"i33":9,"i34":9,"i35":9,"i36":9,"i37":9,"i38":10,"i39":10,"i40":10,"i41":9,"i42":10,"i43":9,"i44":9,"i45":9,"i46":9,"i47":9,"i48":9,"i49":9,"i50":9,"i51":9,"i52":9,"i53":9,"i54":9,"i55":9,"i56":9,"i57":9,"i58":10,"i59":10,"i60":10,"i61":10,"i62":10,"i63":9,"i64":9,"i65":9,"i66":10,"i67":9,"i68":9,"i69":9,"i70":9,"i71":9,"i72":10,"i73":10,"i74":10,"i75":9,"i76":10,"i77":10,"i78":10,"i79":10,"i80":10,"i81":9,"i82":9,"i83":9,"i84":10,"i85":10,"i86":10,"i87":10,"i88":10,"i89":10,"i90":10,"i91":10,"i92":10,"i93":10,"i94":9,"i95":9,"i96":9,"i97":9,"i98":9,"i99":9,"i100":9,"i101":9,"i102":9,"i103":9,"i104":9,"i105":9,"i106":9,"i107":9,"i108":9,"i109":10,"i110":10,"i111":10,"i112":10,"i113":10,"i114":9,"i115":9,"i116":
 9,"i117":9};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark.api.java
+Class JavaDoubleRDD
+
+
+
+Object
+
+
+org.apache.spark.api.java.JavaDoubleRDD
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, JavaRDDLike
+
+
+
+public class JavaDoubleRDD
+extends Object
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+JavaDoubleRDD(RDD srdd) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+static  U
+aggregate(U zeroValue,
+ Function2 seqOp,
+ Function2 combOp) 
+
+
+JavaDoubleRDD
+cache()
+Persist this RDD with the default storage level 
(MEMORY_ONLY).
+
+
+
+static  JavaPairRDD
+cartesian(JavaRDDLike other) 
+
+
+static void
+checkpoint() 
+
+
+scala.reflect.ClassTag
+classTag() 
+
+
+JavaDoubleRDD
+coalesce(int numPartitions)
+Return a new RDD that is reduced into 
numPartitions partitions.
+
+
+
+JavaDoubleRDD
+coalesce(int numPartitions,
+boolean shuffle)
+Return a new RDD that is reduced into 
numPartitions partitions.
+
+
+
+static java.util.List
+collect() 
+
+
+static JavaFutureAction>
+collectAsync() 
+
+
+static java.util.List[]
+collectPartitions(int[] partitionIds) 
+
+
+static SparkContext
+context() 
+
+
+static long
+count() 
+
+
+static PartialResult
+countApprox(long timeout) 
+
+
+static PartialResult
+countApprox(long timeout,
+   double confidence) 
+
+
+static long
+countApproxDistinct(double relativeSD) 
+
+
+static JavaFutureAction
+countAsync() 
+
+
+static java.util.Map
+countByValue() 
+
+
+static PartialResult>
+countByValueApprox(long timeout) 
+
+
+static PartialResult>
+countByValueApprox(long timeout,
+  double confidence) 
+
+
+JavaDoubleRDD
+distinct()
+Return a new RDD containing the distinct elements in this 
RDD.
+
+
+
+JavaDoubleRDD
+distinct(int numPartitions)
+Return a new RDD containing the distinct elements in this 
RDD.
+
+
+
+JavaDoubleRDD
+filter(Function f)
+Return a new RDD containing only the elements that satisfy 
a predicate.
+
+
+
+Double
+first()
+Return the first element i

[14/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/TaskKilledException.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/TaskKilledException.html 
b/site/docs/2.3.2/api/java/org/apache/spark/TaskKilledException.html
new file mode 100644
index 000..d92a973
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/TaskKilledException.html
@@ -0,0 +1,313 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+TaskKilledException (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class 
TaskKilledException
+
+
+
+Object
+
+
+Throwable
+
+
+Exception
+
+
+RuntimeException
+
+
+org.apache.spark.TaskKilledException
+
+
+
+
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable
+
+
+
+public class TaskKilledException
+extends RuntimeException
+:: DeveloperApi ::
+ Exception thrown when a task is explicitly killed (i.e., task failure is 
expected).
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+TaskKilledException() 
+
+
+TaskKilledException(String reason) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+String
+reason() 
+
+
+
+
+
+
+Methods inherited from class Throwable
+addSuppressed, fillInStackTrace, getCause, getLocalizedMessage, 
getMessage, getStackTrace, getSuppressed, initCause, printStackTrace, 
printStackTrace, printStackTrace, setStackTrace, toString
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+TaskKilledException
+public TaskKilledException(String reason)
+
+
+
+
+
+
+
+TaskKilledException
+public TaskKilledException()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+reason
+public String reason()
+
+
+
+
+
+
+
+
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/TaskResultLost.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/TaskResultLost.html 
b/site/docs/2.3.2/api/java/org/apache/spark/TaskResultLost.html
new file mode 100644
index 000..67e2fc4
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/TaskResultLost.html
@@ -0,0 +1,363 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+TaskResultLost (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":5,"i1":9,"i2":5,"i3":5,"i4":5,"i5":9,"i6":9,"i7":9};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],4:["t3","Abstract Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+var methods = 
{"i0":5,"i1":10,"i2":10,"i3":10,"i4":5,"i5":10,"i6":10,"i7":5,"i8":5,"i9":9,"i10":9};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],4:["t3","Abstract 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class Aggregator
+
+
+
+Object
+
+
+org.apache.spark.Aggregator
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, scala.Equals, scala.Product
+
+
+
+public class Aggregator
+extends Object
+implements scala.Product, scala.Serializable
+:: DeveloperApi ::
+ A set of functions used to aggregate data.
+ 
+ param:  createCombiner function to create the initial value of the 
aggregation.
+ param:  mergeValue function to merge a new value into the aggregation result.
+ param:  mergeCombiners function to merge outputs from multiple mergeValue 
functions.
+
+See Also:
+Serialized 
Form
+
+
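
Aggregator is a DeveloperApi built from Scala functions, so it is rarely constructed directly
from Java; the same createCombiner / mergeValue / mergeCombiners trio is, however, what
JavaPairRDD.combineByKey accepts. A small sketch of that pattern follows; the per-key sums and
the sample pairs are invented for illustration.

    import java.util.Arrays;
    import java.util.Map;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class CombineByKeyExample {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("CombineByKeyExample").setMaster("local[2]");
        try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
          JavaPairRDD<String, Integer> pairs = jsc.parallelizePairs(Arrays.asList(
              new Tuple2<>("a", 1), new Tuple2<>("a", 3), new Tuple2<>("b", 2)));

          JavaPairRDD<String, Integer> sums = pairs.combineByKey(
              (Integer v) -> v,                      // createCombiner: seed a sum from the first value for a key
              (Integer sum, Integer v) -> sum + v,   // mergeValue: fold another value into the partial sum
              (Integer s1, Integer s2) -> s1 + s2);  // mergeCombiners: merge partial sums from different partitions

          Map<String, Integer> result = sums.collectAsMap();
          System.out.println(result);   // {a=4, b=2}
        }
      }
    }
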
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+Aggregator(scala.Function1 createCombiner,
+  scala.Function2 mergeValue,
+  scala.Function2 mergeCombiners) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Abstract Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+abstract static boolean
+canEqual(Object that) 
+
+
+scala.collection.Iterator>
+combineCombinersByKey(scala.collection.Iterator> iter,
+ TaskContext context) 
+
+
+scala.collection.Iterator>
+combineValuesByKey(scala.collection.Iterator> iter,
+  TaskContext context) 
+
+
+scala.Function1
+createCombiner() 
+
+
+abstract static boolean
+equals(Object that) 
+
+
+scala.Function2
+mergeCombiners() 
+
+
+scala.Function2
+mergeValue() 
+
+
+abstract static int
+productArity() 
+
+
+abstract static Object
+productElement(int n) 
+
+
+static 
scala.collection.Iterator
+productIterator() 
+
+
+static String
+productPrefix() 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface scala.Product
+productArity, productElement, productIterator, productPrefix
+
+
+
+
+
+Methods inherited from interface scala.Equals
+canEqual, equals
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+Aggregator
+public Aggregator(scala.Function1 createCombiner,
+  scala.Function2 mergeValue,
+  scala.Function2 mergeCombiners)
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+canEqual
+public abstract static boolean canEqual(Object that)
+
+
+
+
+
+
+
+equals
+public abstract static boolean equals(Object that)
+
+
+
+
+
+
+
+productElement
+public abstract static Object productElement(int n)
+
+
+
+
+
+
+
+productArity
+public abstract static int productArity()
+
+
+
+
+
+
+
+productIterator
+public 
static scala.collection.Iterator productIterator()
+
+
+
+
+
+
+
+productPrefix
+public static String productPrefix()
+
+
+
+
+
+
+
+createCombiner
+public scala.Function1 createCombiner()
+
+
+
+
+
+
+
+mergeValue
+public scala.Function2 mergeValue()
+
+
+
+
+
+
+
+mergeCombiners
+public scala.Function2 mergeCombiners()
+
+
+
+
+
+
+
+combineValuesByKey
+public scala.collection.Iterator> combineValuesByKey(scala.collection.Iterator> iter,
+ 

[30/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/lib/jquery.js
--
diff --git a/site/docs/2.3.2/api/java/lib/jquery.js 
b/site/docs/2.3.2/api/java/lib/jquery.js
new file mode 100644
index 000..bc3fbc8
--- /dev/null
+++ b/site/docs/2.3.2/api/java/lib/jquery.js
@@ -0,0 +1,2 @@
+/*! jQuery v1.8.2 jquery.com | jquery.org/license */
+(function(a,b){function G(a){var b=F[a]={};return 
p.each(a.split(s),function(a,c){b[c]=!0}),b}function 
J(a,c,d){if(d===b&&a.nodeType===1){var 
e="data-"+c.replace(I,"-$1").toLowerCase();d=a.getAttribute(e);if(typeof 
d=="string"){try{d=d==="true"?!0:d==="false"?!1:d==="null"?null:+d+""===d?+d:H.test(d)?p.parseJSON(d):d}catch(f){}p.data(a,c,d)}else
 d=b}return d}function K(a){var b;for(b in 
a){if(b==="data"&&p.isEmptyObject(a[b]))continue;if(b!=="toJSON")return!1}return!0}function
 ba(){return!1}function bb(){return!0}function 
bh(a){return!a||!a.parentNode||a.parentNode.nodeType===11}function bi(a,b){do 
a=a[b];while(a&&a.nodeType!==1);return a}function 
bj(a,b,c){b=b||0;if(p.isFunction(b))return p.grep(a,function(a,d){var 
e=!!b.call(a,d,a);return e===c});if(b.nodeType)return 
p.grep(a,function(a,d){return a===b===c});if(typeof b=="string"){var 
d=p.grep(a,function(a){return a.nodeType===1});if(be.test(b))return 
p.filter(b,d,!c);b=p.filter(b,d)}return p.grep(a,function(a,d){return p.inArray(
 a,b)>=0===c})}function bk(a){var 
b=bl.split("|"),c=a.createDocumentFragment();if(c.createElement)while(b.length)c.createElement(b.pop());return
 c}function bC(a,b){return 
a.getElementsByTagName(b)[0]||a.appendChild(a.ownerDocument.createElement(b))}function
 bD(a,b){if(b.nodeType!==1||!p.hasData(a))return;var 
c,d,e,f=p._data(a),g=p._data(b,f),h=f.events;if(h){delete 
g.handle,g.events={};for(c in 
h)for(d=0,e=h[c].length;d").appendTo(e.body),c=b.css("display");b.remove();if(c==="none"||c===""){bI=e.body.appendChild(bI||p.extend(e.createElement("iframe"),{frameBorder:0,width:0,height:0}));if(!bJ||!bI.
 
createElement)bJ=(bI.contentWindow||bI.contentDocument).document,bJ.write(""),bJ.close();b=bJ.body.appendChild(bJ.createElement(a)),c=bH(b,"display"),e.body.removeChild(bI)}return
 bS[a]=c,c}function ci(a,b,c,d){var 
e;if(p.isArray(b))p.each(b,function(b,e){c||ce.test(a)?d(a,e):ci(a+"["+(typeof 
e=="object"?b:"")+"]",e,c,d)});else if(!c&&p.type(b)==="object")for(e in 
b)ci(a+"["+e+"]",b[e],c,d);else d(a,b)}function cz(a){return 
function(b,c){typeof b!="string"&&(c=b,b="*");var 
d,e,f,g=b.toLowerCase().split(s),h=0,i=g.length;if(p.isFunction(c))for(;h)[^>]*$|#([\w\-]*)$)/,v=/^<(\w+)\s*\/?>(?:<\/\1>|)$/,w=/^[\],:{}\s]*$/,x=/(?:^|:|,)(?:\s*\[)+/g,y=/\\(?:["\\\/bfnrt]|u[\da-fA-F]{4})/g,z=/"[^"\\\r\n]*"|true|false|null|-?(?:\d\d*\.|)\d+(?:[eE][\-+]?\d+|)/g,A=/^-ms-/,B=/-([\da-z])/gi,C=function(a,b){return(b+"").toUpperCase()},D=function(){e.addEventListener?(e.removeEventListener("DOMContentLoaded",D,!1),p.ready()):e.readyState==="complete"&&(e.detachEvent("onreadystatechange",D),p.ready())},E={};p.fn=p.prototype={constructor:p,init
 :function(a,c,d){var f,g,h,i;if(!a)return this;if(a.nodeType)return 
this.context=this[0]=a,this.length=1,this;if(typeof 
a=="string"){a.charAt(0)==="<"&&a.charAt(a.length-1)===">"&&a.length>=3?f=[null,a,null]:f=u.exec(a);if(f&&(f[1]||!c)){if(f[1])return
 c=c instanceof 
p?c[0]:c,i=c&&c.nodeType?c.ownerDocument||c:e,a=p.parseHTML(f[1],i,!0),v.test(f[1])&&p.isPlainObject(c)&&this.attr.call(a,c,!0),p.merge(this,a);g=e.getElementById(f[2]);if(g&&g.parentNode){if(g.id!==f[2])return
 d.find(a);this.length=1,this[0]=g}return 
this.context=e,this.selector=a,this}return!c||c.jquery?(c||d).find(a):this.constructor(c).find(a)}return
 
p.isFunction(a)?d.ready(a):(a.selector!==b&&(this.selector=a.selector,this.context=a.context),p.makeArray(a,this))},selector:"",jquery:"1.8.2",length:0,size:function(){return
 this.length},toArray:function(){return k.call(this)},get:function(a){return 
a==null?this.toArray():a<0?this[this.length+a]:this[a]},pushStack:function(a,b,c){var
 d=p.merge(this.constructor(),a);ret
 urn 
d.prevObject=this,d.context=this.context,b==="find"?d.selector=this.selector+(this.selector?"
 
":"")+c:b&&(d.selector=this.selector+"."+b+"("+c+")"),d},each:function(a,b){return
 p.each(this,a,b)},ready:function(a){return 
p.ready.promise().done(a),this},eq:function(a){return 
a=+a,a===-1?this.slice(a):this.slice(a,a+1)},first:function(){return 
this.eq(0)},last:function(){return this.eq(-1)},slice:function(){return 
this.pushStack(k.apply(this,arguments),"slice",k.call(arguments).join(","))},map:function(a){return
 this.pushStack(p.map(this,function(b,c){return 
a.call(b,c,b)}))},end:function(){return 
this.prevObject||this.constructor(null)},push:j,sort:[].sort,splice:[].splice},p.fn.init.prototype=p.fn,p.extend=p.fn.extend=function(){var
 a,c,d,e,f,g,h=arguments[0]||{},i=1,j=arguments.length,k=!1;typeof 
h=="boolean"&&(k=h,h=arguments[1]||{},i=2),type

[26/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/CleanShuffle.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/CleanShuffle.html 
b/site/docs/2.3.2/api/java/org/apache/spark/CleanShuffle.html
new file mode 100644
index 000..0bf6dfe
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/CleanShuffle.html
@@ -0,0 +1,370 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+CleanShuffle (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":5,"i1":5,"i2":5,"i3":5,"i4":9,"i5":9,"i6":10};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],4:["t3","Abstract 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class CleanShuffle
+
+
+
+Object
+
+
+org.apache.spark.CleanShuffle
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, CleanupTask, scala.Equals, 
scala.Product
+
+
+
+public class CleanShuffle
+extends Object
+implements CleanupTask, scala.Product, 
scala.Serializable
+
+See Also:
+Serialized 
Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+CleanShuffle(int shuffleId) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Abstract Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+abstract static boolean
+canEqual(Object that) 
+
+
+abstract static boolean
+equals(Object that) 
+
+
+abstract static int
+productArity() 
+
+
+abstract static Object
+productElement(int n) 
+
+
+static 
scala.collection.Iterator
+productIterator() 
+
+
+static String
+productPrefix() 
+
+
+int
+shuffleId() 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface scala.Product
+productArity, productElement, productIterator, productPrefix
+
+
+
+
+
+Methods inherited from interface scala.Equals
+canEqual, equals
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+CleanShuffle
+public CleanShuffle(int shuffleId)
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+canEqual
+public abstract static boolean canEqual(Object that)
+
+
+
+
+
+
+
+equals
+public abstract static boolean equals(Object that)
+
+
+
+
+
+
+
+productElement
+public abstract static Object productElement(int n)
+
+
+
+
+
+
+
+productArity
+public abstract static int productArity()
+
+
+
+
+
+
+
+productIterator
+public 
static scala.collection.Iterator productIterator()
+
+
+
+
+
+
+
+productPrefix
+public static String productPrefix()
+
+
+
+
+
+
+
+shuffleId
+public int shuffleId()
+
+
+
+
+
+
+
+
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/CleanupTask.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/CleanupTask.html 
b/site/docs/2.3.2/api/java/org/apache/spark/CleanupTask.html
new file mode 100644
index 000..9634781
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/CleanupTask.html
@@ -0,0 +1,170 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+CleanupTask (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package

[43/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/head.html
--
diff --git a/site/docs/2.3.2/api/R/head.html b/site/docs/2.3.2/api/R/head.html
new file mode 100644
index 000..30bee19
--- /dev/null
+++ b/site/docs/2.3.2/api/R/head.html
@@ -0,0 +1,115 @@
+http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd";>http://www.w3.org/1999/xhtml";>R: Head
+
+
+
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js";>
+hljs.initHighlightingOnLoad();
+
+
+head 
{SparkR}R Documentation
+
+Head
+
+Description
+
+Return the first num rows of a SparkDataFrame as an R 
data.frame. If num is not
+specified, then head() returns the first 6 rows, as with an R data.frame.
+
+
+
+Usage
+
+
+## S4 method for signature 'SparkDataFrame'
+head(x, num = 6L)
+
+
+
+Arguments
+
+
+x
+
+a SparkDataFrame.
+
+num
+
+the number of rows to return. Default is 6.
+
+
+
+
+Value
+
+A data.frame.
+
+
+
+Note
+
+head since 1.4.0
+
+
+
+See Also
+
+Other SparkDataFrame functions: SparkDataFrame-class,
+agg, alias,
+arrange, as.data.frame,
+attach,SparkDataFrame-method,
+broadcast, cache,
+checkpoint, coalesce,
+collect, colnames,
+coltypes,
+createOrReplaceTempView,
+crossJoin, cube,
+dapplyCollect, dapply,
+describe, dim,
+distinct, dropDuplicates,
+dropna, drop,
+dtypes, except,
+explain, filter,
+first, gapplyCollect,
+gapply, getNumPartitions,
+group_by, hint,
+histogram, insertInto,
+intersect, isLocal,
+isStreaming, join,
+limit, localCheckpoint,
+merge, mutate,
+ncol, nrow,
+persist, printSchema,
+randomSplit, rbind,
+registerTempTable, rename,
+repartition, rollup,
+sample, saveAsTable,
+schema, selectExpr,
+select, showDF,
+show, storageLevel,
+str, subset,
+summary, take,
+toJSON, unionByName,
+union, unpersist,
+withColumn, withWatermark,
+with, write.df,
+write.jdbc, write.json,
+write.orc, write.parquet,
+write.stream, write.text
+
+
+
+Examples
+
+## Not run: 
+##D sparkR.session()
+##D path <- "path/to/file.json"
+##D df <- read.json(path)
+##D head(df)
+## End(Not run)
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/hint.html
--
diff --git a/site/docs/2.3.2/api/R/hint.html b/site/docs/2.3.2/api/R/hint.html
new file mode 100644
index 000..11e12d6
--- /dev/null
+++ b/site/docs/2.3.2/api/R/hint.html
@@ -0,0 +1,120 @@
+http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd";>http://www.w3.org/1999/xhtml";>R: hint
+
+
+
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js";>
+hljs.initHighlightingOnLoad();
+
+
+hint 
{SparkR}R Documentation
+
+hint
+
+Description
+
+Specifies an execution plan hint and returns a new SparkDataFrame.
+
+
+
+Usage
+
+
+hint(x, name, ...)
+
+## S4 method for signature 'SparkDataFrame,character'
+hint(x, name, ...)
+
+
+
+Arguments
+
+
+x
+
+a SparkDataFrame.
+
+name
+
+a name of the hint.
+
+...
+
+optional parameters for the hint.
+
+
+
+
+Value
+
+A SparkDataFrame.
+
+
+
+Note
+
+hint since 2.2.0
+
+
+
+See Also
+
+Other SparkDataFrame functions: SparkDataFrame-class,
+agg, alias,
+arrange, as.data.frame,
+attach,SparkDataFrame-method,
+broadcast, cache,
+checkpoint, coalesce,
+collect, colnames,
+coltypes,
+createOrReplaceTempView,
+crossJoin, cube,
+dapplyCollect, dapply,
+describe, dim,
+distinct, dropDuplicates,
+dropna, drop,
+dtypes, except,
+explain, filter,
+first, gapplyCollect,
+gapply, getNumPartitions,
+group_by, head,
+histogram, insertInto,
+intersect, isLocal,
+isStreaming, join,
+limit, localCheckpoint,
+merge, mutate,
+ncol, nrow,
+persist, printSchema,
+randomSplit, rbind,
+registerTempTable, rename,
+repartition, rollup,
+sample, saveAsTable,
+schema, selectExpr,
+select, showDF,
+show, storageLevel,
+str, subset,
+summary, take,
+toJSON, unionByName,
+union, unpersist,
+withColumn, withWatermark,
+with, write.df,
+write.jdbc, write.json,
+write.orc, write.parquet,
+write.stream, write.text
+
+
+
+Examples
+
+## Not run: 
+##D df <- createDataFrame(mtcars)
+##D avg_mpg <- mean(groupBy(createDataFrame(mtcars), "cyl"), 
"mpg")
+##D 
+##D head(join(df, hint(avg_mpg, "broadcast"), df$cyl == avg_mpg$cyl))
+## End(Not run)
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/histogram.html
--
diff --git a/site/docs/2.3.2/api/R/histogram.html 
b/site/docs/2.3.2/api/R/histogram.html
new file mode 100644
index 000..375f

[02/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/package-summary.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/package-summary.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/package-summary.html
new file mode 100644
index 000..eb84b4f
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/package-summary.html
@@ -0,0 +1,300 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+org.apache.spark.api.java.function (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Package
+Next Package
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+
+
+
+Package org.apache.spark.api.java.function
+
+Set of interfaces to represent functions in Spark's Java 
API.
+
+See: Description
+
+
+
+
+
+Interface Summary 
+
+Interface
+Description
+
+
+
+CoGroupFunction
+
+A function that returns zero or more output records from 
each grouping key and its values from 2
+ Datasets.
+
+
+
+DoubleFlatMapFunction
+
+A function that returns zero or more records of type Double 
from each input record.
+
+
+
+DoubleFunction
+
+A function that returns Doubles, and can be used to 
construct DoubleRDDs.
+
+
+
+FilterFunction
+
+Base interface for a function used in Dataset's filter 
function.
+
+
+
+FlatMapFunction
+
+A function that returns zero or more output records from 
each input record.
+
+
+
+FlatMapFunction2
+
+A function that takes two inputs and returns zero or more 
output records.
+
+
+
+FlatMapGroupsFunction
+
+A function that returns zero or more output records from 
each grouping key and its values.
+
+
+
+FlatMapGroupsWithStateFunction
+
+::Experimental::
+ Base interface for a map function used in
+ org.apache.spark.sql.KeyValueGroupedDataset.flatMapGroupsWithState(
+ FlatMapGroupsWithStateFunction, org.apache.spark.sql.streaming.OutputMode,
+ org.apache.spark.sql.Encoder, org.apache.spark.sql.Encoder)
+
+
+
+ForeachFunction
+
+Base interface for a function used in Dataset's foreach 
function.
+
+
+
+ForeachPartitionFunction
+
+Base interface for a function used in Dataset's 
foreachPartition function.
+
+
+
+Function
+
+Base interface for functions whose return types do not 
create special RDDs.
+
+
+
+Function0
+
+A zero-argument function that returns an R.
+
+
+
+Function2
+
+A two-argument function that takes arguments of type T1 and 
T2 and returns an R.
+
+
+
+Function3
+
+A three-argument function that takes arguments of type T1, 
T2 and T3 and returns an R.
+
+
+
+Function4
+
+A four-argument function that takes arguments of type T1, 
T2, T3 and T4 and returns an R.
+
+
+
+MapFunction
+
+Base interface for a map function used in Dataset's map 
function.
+
+
+
+MapGroupsFunction
+
+Base interface for a map function used in GroupedDataset's 
mapGroup function.
+
+
+
+MapGroupsWithStateFunction
+
+::Experimental::
+ Base interface for a map function used in
+ KeyValueGroupedDataset.mapGroupsWithState(
+ MapGroupsWithStateFunction, org.apache.spark.sql.Encoder, 
org.apache.spark.sql.Encoder)
+
+
+
+MapPartitionsFunction
+
+Base interface for a function used in Dataset's 
mapPartitions.
+
+
+
+PairFlatMapFunction
+
+A function that returns zero or more key-value pair records 
from each input record.
+
+
+
+PairFunction
+
+A function that returns key-value pairs (Tuple2), and can be used to
+ construct PairRDDs.
+
+
+
+ReduceFunction
+
+Base interface for a function used in Dataset's reduce.
+
+
+
+VoidFunction
+
+A function with no return value.
+
+
+
+VoidFunction2
+
+A two-argument function that takes arguments of type T1 and 
T2 with no return value.
+
+
+
+
+
+
+
+
+
+Package 
org.apache.spark.api.java.function Description
+Set of interfaces to represent functions in Spark's Java 
API. Users create implementations of
+ these interfaces to pass functions to various Java API methods for Spark. 
Please visit Spark's
+ Java programming guide for more details.
+
+
+
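
As a quick companion to the package description above, the sketch below shows two of these
interfaces (FlatMapFunction and PairFunction) supplied as Java 8 lambdas in a word-count style
pipeline; the input strings and the local master are illustrative assumptions.

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.FlatMapFunction;
    import org.apache.spark.api.java.function.PairFunction;
    import scala.Tuple2;

    public class FunctionPackageExample {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("FunctionPackageExample").setMaster("local[2]");
        try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
          JavaRDD<String> lines = jsc.parallelize(Arrays.asList("to be", "or not to be"));

          // FlatMapFunction: zero or more output records per input record (returns an Iterator in 2.x).
          FlatMapFunction<String, String> tokenize =
              line -> Arrays.asList(line.split(" ")).iterator();

          // PairFunction: turns each record into a (key, value) Tuple2.
          PairFunction<String, String, Integer> toPair = word -> new Tuple2<>(word, 1);

          JavaPairRDD<String, Integer> counts =
              lines.flatMap(tokenize).mapToPair(toPair).reduceByKey((a, b) -> a + b);

          counts.collect().forEach(t -> System.out.println(t._1() + " -> " + t._2()));
        }
      }
    }
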
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Package
+Next Package
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+var methods = {"i0":6};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],4:["t3","Abstract Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark.api.java.function
+Interface 
MapGroupsWithStateFunction
+
+
+
+
+
+
+All Superinterfaces:
+java.io.Serializable
+
+
+
+@InterfaceStability.Evolving
+public interface MapGroupsWithStateFunction
+extends java.io.Serializable
+::Experimental::
+ Base interface for a map function used in
+ KeyValueGroupedDataset.mapGroupsWithState(
+ MapGroupsWithStateFunction, org.apache.spark.sql.Encoder, 
org.apache.spark.sql.Encoder)
+
+Since:
+2.1.1
+
+
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Abstract Methods 
+
+Modifier and Type
+Method and Description
+
+
+R
+call(K key,
+java.util.Iterator values,
+GroupState state) 
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+
+
+call
+R call(K key,
+   java.util.Iterator values,
+   GroupState state)
+throws Exception
+
+Throws:
+Exception
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
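
Spelled out in Java, an implementation of MapGroupsWithStateFunction might look like the
following sketch, which keeps a per-key running count. The batch Dataset, the local master and
the count-based state are all illustrative assumptions; on a batch Dataset the state simply
starts empty for every key.

    import java.util.Arrays;

    import org.apache.spark.api.java.function.MapFunction;
    import org.apache.spark.api.java.function.MapGroupsWithStateFunction;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Encoders;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.GroupState;
    import scala.Tuple2;

    public class RunningCountExample {
      public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("RunningCountExample").master("local[2]").getOrCreate();

        Dataset<String> words = spark.createDataset(
            Arrays.asList("spark", "spark", "docs"), Encoders.STRING());

        // State S is a running count per key; the function updates the GroupState
        // and returns one output record per key.
        MapGroupsWithStateFunction<String, String, Long, Tuple2<String, Long>> countFn =
            (key, values, state) -> {
              long count = state.exists() ? state.get() : 0L;
              while (values.hasNext()) {
                values.next();
                count++;
              }
              state.update(count);
              return new Tuple2<>(key, count);
            };

        Dataset<Tuple2<String, Long>> counts = words
            .groupByKey((MapFunction<String, String>) w -> w, Encoders.STRING())
            .mapGroupsWithState(countFn,
                Encoders.LONG(),
                Encoders.tuple(Encoders.STRING(), Encoders.LONG()));

        counts.show();   // one row per key, e.g. (spark, 2) and (docs, 1)
        spark.stop();
      }
    }
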

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/MapPartitionsFunction.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/MapPartitionsFunction.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/MapPartitionsFunction.html
new file mode 100644
index 000..ec17a18
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/MapPartitionsFunction.html
@@ -0,0 +1,235 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+MapPartitionsFunction (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":6};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],4:["t3","Abstract Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark.api.java.function
+Interface 
MapPartitionsFunction
+
+
+
+
+
+
+All Superinterfaces:
+java.io.Serializable
+
+
+Functional Inte

[16/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SparkStageInfoImpl.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/SparkStageInfoImpl.html 
b/site/docs/2.3.2/api/java/org/apache/spark/SparkStageInfoImpl.html
new file mode 100644
index 000..2924707
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/SparkStageInfoImpl.html
@@ -0,0 +1,415 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+SparkStageInfoImpl (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10,"i6":10,"i7":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class 
SparkStageInfoImpl
+
+
+
+Object
+
+
+org.apache.spark.SparkStageInfoImpl
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, SparkStageInfo
+
+
+
+public class SparkStageInfoImpl
+extends Object
+implements SparkStageInfo
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+SparkStageInfoImpl(int stageId,
+  int currentAttemptId,
+  long submissionTime,
+  String name,
+  int numTasks,
+  int numActiveTasks,
+  int numCompletedTasks,
+  int numFailedTasks) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+int
+currentAttemptId() 
+
+
+String
+name() 
+
+
+int
+numActiveTasks() 
+
+
+int
+numCompletedTasks() 
+
+
+int
+numFailedTasks() 
+
+
+int
+numTasks() 
+
+
+int
+stageId() 
+
+
+long
+submissionTime() 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+SparkStageInfoImpl
+public SparkStageInfoImpl(int stageId,
+  int currentAttemptId,
+  long submissionTime,
+  String name,
+  int numTasks,
+  int numActiveTasks,
+  int numCompletedTasks,
+  int numFailedTasks)
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+stageId
+public int stageId()
+
+Specified by:
+stageId in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+currentAttemptId
+public int currentAttemptId()
+
+Specified by:
+currentAttemptId in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+submissionTime
+public long submissionTime()
+
+Specified by:
+submissionTime in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+name
+public String name()
+
+Specified by:
+name in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+numTasks
+public int numTasks()
+
+Specified by:
+numTasks in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+numActiveTasks
+public int numActiveTasks()
+
+Specified by:
+numActiveTasks in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+numCompletedTasks
+public int numCompletedTasks()
+
+Specified by:
+numCompletedTasks in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+numFailedTasks
+public int numFailedTasks()
+
+Specified by:
+numFailedTasks in
 interface SparkStageInfo
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SparkStatusTracker.html
--
diff --git a/site/docs/2.3.2/api/java/org

[11/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaNewHadoopRDD.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaNewHadoopRDD.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaNewHadoopRDD.html
new file mode 100644
index 000..f0715cd
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaNewHadoopRDD.html
@@ -0,0 +1,339 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+JavaNewHadoopRDD (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark.api.java
+Class 
JavaNewHadoopRDD
+
+
+
+Object
+
+
+org.apache.spark.api.java.JavaPairRDD
+
+
+org.apache.spark.api.java.JavaNewHadoopRDD
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, JavaRDDLike,JavaPairRDD>
+
+
+
+public class JavaNewHadoopRDD
+extends JavaPairRDD
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+JavaNewHadoopRDD(NewHadoopRDD rdd,
+scala.reflect.ClassTag kClassTag,
+scala.reflect.ClassTag vClassTag) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+scala.reflect.ClassTag
+kClassTag() 
+
+
+ JavaRDD
+mapPartitionsWithInputSplit(Function2>,java.util.Iterator> f,
+   boolean preservesPartitioning)
+Maps over a partition, providing the InputSplit that was 
used as the base of the partition.
+
+
+
+scala.reflect.ClassTag
+vClassTag() 
+
+
+
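
The mapPartitionsWithInputSplit entry above is easiest to see with a concrete sketch. The input
path below is hypothetical, and the cast from the JavaPairRDD returned by newAPIHadoopFile to
JavaNewHadoopRDD is an assumption (it mirrors the pattern used in Spark's own Java API tests),
so treat this as a sketch rather than a guaranteed recipe.

    import java.util.Collections;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.FileSplit;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaNewHadoopRDD;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class InputSplitExample {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("InputSplitExample").setMaster("local[2]");
        try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
          // Hypothetical input path, used only for illustration.
          JavaPairRDD<LongWritable, Text> pairs = jsc.newAPIHadoopFile(
              "hdfs:///tmp/input", TextInputFormat.class,
              LongWritable.class, Text.class, new Configuration());

          // Assumption: the returned pair RDD is backed by a JavaNewHadoopRDD,
          // so the cast exposes mapPartitionsWithInputSplit.
          JavaNewHadoopRDD<LongWritable, Text> hadoopRdd =
              (JavaNewHadoopRDD<LongWritable, Text>) pairs;

          // Tag each partition with the file behind its InputSplit.
          JavaRDD<String> files = hadoopRdd.mapPartitionsWithInputSplit(
              (split, records) -> Collections.singletonList(
                  ((FileSplit) split).getPath().toString()).iterator(),
              false);

          files.collect().forEach(System.out::println);
        }
      }
    }
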
+
+
+
+Methods inherited from class org.apache.spark.api.java.JavaPairRDD
+aggregate, aggregateByKey, cache, cartesian, checkpoint, classTag, coalesce, cogroup,
+collect, collectAsMap, collectAsync, collectPartitions, combineByKey, context, count,
+countApprox, countApproxDistinct, countApproxDistinctByKey, countAsync, countByKey,
+countByKeyApprox, countByValue, countByValueApprox, distinct, filter, first, flatMap,
+flatMapToDouble, flatMapToPair, flatMapValues, fold, foldByKey, foreach, foreachAsync,
+foreachPartition, foreachPartitionAsync, fromJavaRDD, fromRDD, fullOuterJoin,
+getCheckpointFile, getNumPartitions, getStorageLevel, glom, groupBy, groupByKey,
+groupWith, id, intersection, isCheckpointed, isEmpty, iterator, join, keyBy, keys,
+leftOuterJoin, lookup, map, mapPartitions, mapPartitionsToDouble, mapPartitionsToPair,
+mapPartitionsWithIndex, mapPartitionsWithIndex$default$2, mapToDouble, mapToPair,
+mapValues, max, min, name, partitionBy, ...
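mapPartitionsWithInputSplit is the one method this class adds on top of JavaPairRDD. A minimal, hedged sketch of how it might be called from Java follows; it assumes the JavaPairRDD returned by JavaSparkContext.newAPIHadoopFile is backed by a NewHadoopRDD and can be downcast (as Spark's own Java API tests do), and all variable names are illustrative:

    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.lib.input.FileSplit;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.spark.api.java.JavaNewHadoopRDD;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class InputSplitExample {
      // Prefixes every line of the input files with the name of the file it came from.
      public static JavaRDD<String> tagWithFileName(JavaSparkContext jsc, String path) {
        JavaPairRDD<LongWritable, Text> lines = jsc.newAPIHadoopFile(
            path, TextInputFormat.class, LongWritable.class, Text.class,
            jsc.hadoopConfiguration());
        // Assumption: the returned pair RDD is backed by a NewHadoopRDD, so the
        // downcast exposes mapPartitionsWithInputSplit.
        JavaNewHadoopRDD<LongWritable, Text> hadoopRdd =
            (JavaNewHadoopRDD<LongWritable, Text>) lines;

        return hadoopRdd.mapPartitionsWithInputSplit(
            (InputSplit split, Iterator<Tuple2<LongWritable, Text>> records) -> {
              String file = ((FileSplit) split).getPath().getName();
              List<String> out = new ArrayList<>();
              while (records.hasNext()) {
                out.add(file + ": " + records.next()._2().toString());
              }
              return out.iterator();
            },
            true);  // preservesPartitioning, as in the signature above
      }
    }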

[07/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkContext.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkContext.html 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkContext.html
new file mode 100644
index 000..7d037c3
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkContext.html
@@ -0,0 +1,2389 @@
+JavaSparkContext (Spark 2.3.2 JavaDoc)
+
+org.apache.spark.api.java
+Class JavaSparkContext
+
+
+
+Object
+
+
+org.apache.spark.api.java.JavaSparkContext
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Closeable, AutoCloseable
+
+
+
+public class JavaSparkContext
+extends Object
+implements java.io.Closeable
+A Java-friendly version of SparkContext that returns
+ JavaRDDs and works with Java 
collections instead of Scala ones.
+ 
+ Only one SparkContext may be active per JVM.  You must stop() 
the active SparkContext before
+ creating a new one.  This limitation may eventually be removed; see 
SPARK-2243 for more details.
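As a small illustration of the lifecycle described above, a sketch of creating, using and stopping a JavaSparkContext; the master and app name are placeholders, not a recommendation:

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class JavaSparkContextExample {
      public static void main(String[] args) {
        // Placeholder settings; in practice these usually come from spark-submit.
        SparkConf conf = new SparkConf().setAppName("example").setMaster("local[2]");
        JavaSparkContext jsc = new JavaSparkContext(conf);
        try {
          JavaRDD<Integer> nums = jsc.parallelize(Arrays.asList(1, 2, 3, 4));
          long evens = nums.filter(x -> x % 2 == 0).count();
          System.out.println("even numbers: " + evens);
        } finally {
          // Only one SparkContext may be active per JVM, so stop it when done.
          jsc.stop();
        }
      }
    }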
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+JavaSparkContext()
+Create a JavaSparkContext that loads settings from system 
properties (for instance, when
+ launching with ./bin/spark-submit).
+
+
+
+JavaSparkContext(SparkConf conf) 
+
+
+JavaSparkContext(SparkContext sc) 
+
+
+JavaSparkContext(String master,
+String appName) 
+
+
+JavaSparkContext(String master,
+String appName,
+SparkConf conf) 
+
+
+JavaSparkContext(String master,
+String appName,
+String sparkHome,
+String jarFile) 
+
+
+JavaSparkContext(String master,
+String appName,
+String sparkHome,
+String[] jars) 
+
+
+JavaSparkContext(String master,
+String appName,
+String sparkHome,
+String[] jars,
+
java.util.Map environment) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Concrete Methods Deprecated Methods 
+
+Modifier and Type
+Method and Description
+
+
+<T,R> Accumulable<T,R>
+accumulable(T initialValue, AccumulableParam<T,R> param)
+Deprecated. use AccumulatorV2. Since 2.0.0.
+
+<T,R> Accumulable<T,R>
+accumulable(T initialValue, String name, AccumulableParam<T,R> param)
+Deprecated. use AccumulatorV2. Since 2.0.0.
+
+Accumulator<Double>
+accumulator(double initialValue)
+Deprecated. use sc().doubleAccumulator(). Since 2.0.0.
+
+Accumulator<Double>
+accumulator(double initialValue, String name)
+Deprecated. use sc().doubleAccumulator(String). Since 2.0.0.
+
+Accumulator<Integer>
+accumulator(int initialValue)
+Deprecated. use sc().longAccumulator(). Since 2.0.0.
+
+Accumulator<Integer>
+accumulator(int initialValue, String name)
+Deprecated. use sc().longAccumulator(String). Since 2.0.0.
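All of the accumulable/accumulator methods above are deprecated, so here is a short sketch of the replacement path their notes point to, going through the underlying SparkContext via sc(); jsc is an assumed JavaSparkContext and the blank-line counting is illustrative only:

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.util.LongAccumulator;

    public class AccumulatorV2Example {
      // Counts blank lines as a side effect of an action over the data.
      public static long countBlankLines(JavaSparkContext jsc, JavaRDD<String> lines) {
        // The deprecation notes above point at sc().longAccumulator(...) instead of accumulator(int).
        LongAccumulator blanks = jsc.sc().longAccumulator("blankLines");
        lines.foreach(line -> {
          if (line.trim().isEmpty()) {
            blanks.add(1L);
          }
        });
        return blanks.value();
      }
    }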

[05/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/CoGroupFunction.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/CoGroupFunction.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/CoGroupFunction.html
new file mode 100644
index 000..27c2d1e
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/CoGroupFunction.html
@@ -0,0 +1,242 @@
+CoGroupFunction (Spark 2.3.2 JavaDoc)
+
+org.apache.spark.api.java.function
+Interface CoGroupFunction<K,V1,V2,R>
+
+
+
+
+
+
+All Superinterfaces:
+java.io.Serializable
+
+
+Functional Interface:
+This is a functional interface and can therefore be used as the assignment 
target for a lambda expression or method reference.
+
+
+
+@FunctionalInterface
+public interface CoGroupFunction<K,V1,V2,R>
+extends java.io.Serializable
+A function that returns zero or more output records from 
each grouping key and its values from 2
+ Datasets.
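A hedged sketch of plugging such a function into KeyValueGroupedDataset.cogroup from Java; the two Datasets, their comma-separated "id,..." layout and all names are assumptions made for the example:

    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;

    import org.apache.spark.api.java.function.CoGroupFunction;
    import org.apache.spark.api.java.function.MapFunction;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Encoders;
    import org.apache.spark.sql.KeyValueGroupedDataset;

    public class CoGroupExample {
      // Emits one summary line per key shared by the two Datasets.
      public static Dataset<String> summarize(Dataset<String> orders, Dataset<String> payments) {
        MapFunction<String, String> keyFn = s -> s.split(",")[0];  // assumed "id,..." layout
        KeyValueGroupedDataset<String, String> groupedOrders =
            orders.groupByKey(keyFn, Encoders.STRING());
        KeyValueGroupedDataset<String, String> groupedPayments =
            payments.groupByKey(keyFn, Encoders.STRING());

        CoGroupFunction<String, String, String, String> summarizer =
            (key, left, right) -> {
              List<String> out = new ArrayList<>();
              out.add(key + ": " + count(left) + " orders, " + count(right) + " payments");
              return out.iterator();
            };

        return groupedOrders.cogroup(groupedPayments, summarizer, Encoders.STRING());
      }

      private static int count(Iterator<?> it) {
        int n = 0;
        while (it.hasNext()) { it.next(); n++; }
        return n;
      }
    }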
+
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Abstract Methods 
+
+Modifier and Type
+Method and Description
+
+
+java.util.Iterator<R>
+call(K key,
+java.util.Iterator<V1> left,
+java.util.Iterator<V2> right) 
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+
+
+call
+java.util.Iterator<R> call(K key,
+   java.util.Iterator<V1> left,
+   java.util.Iterator<V2> right)
+throws Exception
+
+Throws:
+Exception
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/DoubleFlatMapFunction.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/DoubleFlatMapFunction.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/DoubleFlatMapFunction.html
new file mode 100644
index 000..fa64168
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/DoubleFlatMapFunction.html
@@ -0,0 +1,237 @@
+DoubleFlatMapFunction (Spark 2.3.2 JavaDoc)
+
+org.apache.spark.api.java.function
+Interface DoubleFlatMapFunction<T>
+
+
+

[17/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SparkFirehoseListener.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/SparkFirehoseListener.html 
b/site/docs/2.3.2/api/java/org/apache/spark/SparkFirehoseListener.html
new file mode 100644
index 000..59eadc2
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/SparkFirehoseListener.html
@@ -0,0 +1,575 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+SparkFirehoseListener (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10,"i6":10,"i7":10,"i8":10,"i9":10,"i10":10,"i11":10,"i12":10,"i13":10,"i14":10,"i15":10,"i16":10,"i17":10,"i18":10,"i19":10,"i20":10,"i21":10,"i22":10,"i23":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class 
SparkFirehoseListener
+
+
+
+Object
+
+
+org.apache.spark.SparkFirehoseListener
+
+
+
+
+
+
+
+
+public class SparkFirehoseListener
+extends Object
+Class that allows users to receive all SparkListener events.
+ Users should override the onEvent method.
+
+ This is a concrete Java class in order to ensure that we don't forget to 
update it when adding
+ new methods to SparkListener: forgetting to add a method will result in a 
compilation error (if
+ this was a concrete Scala class, default implementations of new event 
handlers would be inherited
+ from the SparkListener trait).
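A minimal sketch of the intended usage, overriding onEvent and attaching the listener to an existing SparkContext; the counting logic is illustrative only:

    import java.util.concurrent.atomic.AtomicLong;

    import org.apache.spark.SparkContext;
    import org.apache.spark.SparkFirehoseListener;
    import org.apache.spark.scheduler.SparkListenerEvent;

    public class EventCountingListener extends SparkFirehoseListener {
      private final AtomicLong eventCount = new AtomicLong();

      // Every callback of the parent class funnels into this single method.
      @Override
      public void onEvent(SparkListenerEvent event) {
        long n = eventCount.incrementAndGet();
        if (n % 1000 == 0) {
          System.out.println("seen " + n + " events, latest: " + event.getClass().getSimpleName());
        }
      }

      public static void register(SparkContext sc) {
        sc.addSparkListener(new EventCountingListener());
      }
    }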
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+SparkFirehoseListener() 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+void
+onApplicationEnd(SparkListenerApplicationEnd applicationEnd) 
+
+
+void
+onApplicationStart(SparkListenerApplicationStart applicationStart) 
+
+
+void
+onBlockManagerAdded(SparkListenerBlockManagerAdded blockManagerAdded) 
+
+
+void
+onBlockManagerRemoved(SparkListenerBlockManagerRemoved blockManagerRemoved) 
+
+
+void
+onBlockUpdated(SparkListenerBlockUpdated blockUpdated) 
+
+
+void
+onEnvironmentUpdate(SparkListenerEnvironmentUpdate environmentUpdate) 
+
+
+void
+onEvent(SparkListenerEvent event) 
+
+
+void
+onExecutorAdded(SparkListenerExecutorAdded executorAdded) 
+
+
+void
+onExecutorBlacklisted(SparkListenerExecutorBlacklisted executorBlacklisted) 
+
+
+void
+onExecutorMetricsUpdate(SparkListenerExecutorMetricsUpdate executorMetricsUpdate) 
+
+
+void
+onExecutorRemoved(SparkListenerExecutorRemoved executorRemoved) 
+
+
+void
+onExecutorUnblacklisted(SparkListenerExecutorUnblacklisted executorUnblacklisted) 
+
+
+void
+onJobEnd(SparkListenerJobEnd jobEnd) 
+
+
+void
+onJobStart(SparkListenerJobStart jobStart) 
+
+
+void
+onNodeBlacklisted(SparkListenerNodeBlacklisted nodeBlacklisted) 
+
+
+void
+onNodeUnblacklisted(SparkListenerNodeUnblacklisted nodeUnblacklisted) 
+
+
+void
+onOtherEvent(SparkListenerEvent event) 
+
+
+void
+onSpeculativeTaskSubmitted(SparkListenerSpeculativeTaskSubmitted speculativeTask) 
+
+
+void
+onStageCompleted(SparkListenerStageCompleted stageCompleted) 
+
+
+void
+onStageSubmitted(SparkListenerStageSubmitted stageSubmitted) 
+
+
+void
+onTaskEnd(SparkListenerTaskEnd taskEnd) 
+
+
+void
+onTaskGettingResult(SparkListenerTaskGettingResult taskGettingResult) 
+
+
+void
+onTaskStart(SparkListenerTaskStart taskStart) 
+
+
+void
+onUnpersistRDD(SparkListenerUnpersistRDD unpersistRDD) 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+SparkFirehoseListener
+public SparkFirehoseListener()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+onEvent
+public void onEvent(SparkListenerEvent event)
+
+
+
+
+
+
+
+onStageCompleted
+public final void onStageCompleted(SparkListenerStageCompleted stageCompleted)
+
+
+
+
+
+
+
+onStageSubmitted
+public final void onStageSubmitted(SparkListenerStageSubmitted stageSubm

[18/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SparkEnv.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/SparkEnv.html 
b/site/docs/2.3.2/api/java/org/apache/spark/SparkEnv.html
new file mode 100644
index 000..152b422
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/SparkEnv.html
@@ -0,0 +1,504 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+SparkEnv (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":9,"i6":10,"i7":10,"i8":10,"i9":10,"i10":10,"i11":10,"i12":10,"i13":9,"i14":10};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class SparkEnv
+
+
+
+Object
+
+
+org.apache.spark.SparkEnv
+
+
+
+
+
+
+
+All Implemented Interfaces:
+Logging
+
+
+
+public class SparkEnv
+extends Object
+implements Logging
+:: DeveloperApi ::
+ Holds all the runtime environment objects for a running Spark instance 
(either master or worker),
+ including the serializer, RpcEnv, block manager, map output tracker, etc. 
Currently
+ Spark code finds the SparkEnv through a global variable, so all the threads 
can access the same
+ SparkEnv. It can be accessed by SparkEnv.get (e.g. after creating a 
SparkContext).
+ 
+ NOTE: This is not intended for external use. This is exposed for Shark and 
may be made private
+   in a future release.
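Keeping in mind the note that this is not meant for external use, a small sketch of the SparkEnv.get access pattern described above; it assumes a SparkContext is already running in this JVM:

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkEnv;
    import org.apache.spark.serializer.Serializer;

    public class SparkEnvPeek {
      public static void describeCurrentEnv() {
        // Returns the environment of whichever driver or executor this code runs on.
        SparkEnv env = SparkEnv.get();
        SparkConf conf = env.conf();
        Serializer serializer = env.serializer();
        System.out.println("executorId=" + env.executorId()
            + ", serializer=" + serializer.getClass().getName()
            + ", app=" + conf.get("spark.app.name", "<unset>"));
      }
    }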
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+SparkEnv(String executorId,
+org.apache.spark.rpc.RpcEnv rpcEnv,
+Serializer serializer,
+Serializer closureSerializer,
+org.apache.spark.serializer.SerializerManager serializerManager,
+org.apache.spark.MapOutputTracker mapOutputTracker,
+org.apache.spark.shuffle.ShuffleManager shuffleManager,
+org.apache.spark.broadcast.BroadcastManager broadcastManager,
+org.apache.spark.storage.BlockManager blockManager,
+org.apache.spark.SecurityManager securityManager,
+org.apache.spark.metrics.MetricsSystem metricsSystem,
+org.apache.spark.memory.MemoryManager memoryManager,
+
org.apache.spark.scheduler.OutputCommitCoordinator outputCommitCoordinator,
+SparkConf conf) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+org.apache.spark.storage.BlockManager
+blockManager() 
+
+
+org.apache.spark.broadcast.BroadcastManager
+broadcastManager() 
+
+
+Serializer
+closureSerializer() 
+
+
+SparkConf
+conf() 
+
+
+String
+executorId() 
+
+
+static SparkEnv
+get()
+Returns the SparkEnv.
+
+
+
+org.apache.spark.MapOutputTracker
+mapOutputTracker() 
+
+
+org.apache.spark.memory.MemoryManager
+memoryManager() 
+
+
+org.apache.spark.metrics.MetricsSystem
+metricsSystem() 
+
+
+org.apache.spark.scheduler.OutputCommitCoordinator
+outputCommitCoordinator() 
+
+
+org.apache.spark.SecurityManager
+securityManager() 
+
+
+Serializer
+serializer() 
+
+
+org.apache.spark.serializer.SerializerManager
+serializerManager() 
+
+
+static void
+set(SparkEnv e) 
+
+
+org.apache.spark.shuffle.ShuffleManager
+shuffleManager() 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface org.apache.spark.internal.Logging
+initializeLogging, initializeLogIfNecessary, initializeLogIfNecessary, isTraceEnabled,
+log_, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName,
+logTrace, logTrace, logWarning, logWarning
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+SparkEnv
+public SparkEnv(String executorId,
+org.apache.spark.rpc.RpcEnv rpcEnv,
+Serializer serializer,
+Serializer closureSerializer,
+
org.apache.spark.serializer.SerializerManager serializerManager,
+org.a

[38/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/spark.survreg.html
--
diff --git a/site/docs/2.3.2/api/R/spark.survreg.html 
b/site/docs/2.3.2/api/R/spark.survreg.html
new file mode 100644
index 000..dbbe947
--- /dev/null
+++ b/site/docs/2.3.2/api/R/spark.survreg.html
@@ -0,0 +1,156 @@
+R: Accelerated Failure Time (AFT) Survival Regression Model
+
+spark.survreg {SparkR}    R Documentation
+
+Accelerated Failure Time (AFT) Survival Regression Model
+
+Description
+
+spark.survreg fits an accelerated failure time (AFT) survival 
regression model on
+a SparkDataFrame. Users can call summary to get a summary of the 
fitted AFT model,
+predict to make predictions on new data, and 
write.ml/read.ml to
+save/load fitted models.
+
+
+
+Usage
+
+
+spark.survreg(data, formula, ...)
+
+## S4 method for signature 'SparkDataFrame,formula'
+spark.survreg(data, formula,
+  aggregationDepth = 2, stringIndexerOrderType = c("frequencyDesc",
+  "frequencyAsc", "alphabetDesc", "alphabetAsc"))
+
+## S4 method for signature 'AFTSurvivalRegressionModel'
+summary(object)
+
+## S4 method for signature 'AFTSurvivalRegressionModel'
+predict(object, newData)
+
+## S4 method for signature 'AFTSurvivalRegressionModel,character'
+write.ml(object, path,
+  overwrite = FALSE)
+
+
+
+Arguments
+
+
+data
+
+a SparkDataFrame for training.
+
+formula
+
+a symbolic description of the model to be fitted. Currently only a few 
formula
+operators are supported, including '~', ':', '+', and '-'.
+Note that operator '.' is not supported currently.
+
+...
+
+additional arguments passed to the method.
+
+aggregationDepth
+
+The depth for treeAggregate (greater than or equal to 2). If the
+dimensions of features or the number of partitions are large, this
+param could be adjusted to a larger size. This is an expert parameter.
+Default value should be good for most cases.
+
+stringIndexerOrderType
+
+how to order categories of a string feature column. This is used to
+decide the base level of a string feature as the last category
+after ordering is dropped when encoding strings. Supported options
+are "frequencyDesc", "frequencyAsc", 
"alphabetDesc", and
+"alphabetAsc". The default value is "frequencyDesc". When 
the
+ordering is set to "alphabetDesc", this drops the same category
+as R when encoding strings.
+
+object
+
+a fitted AFT survival regression model.
+
+newData
+
+a SparkDataFrame for testing.
+
+path
+
+the directory where the model is saved.
+
+overwrite
+
+overwrites or not if the output path already exists. Default is FALSE
+which means throw exception if the output path exists.
+
+
+
+
+Value
+
+spark.survreg returns a fitted AFT survival regression model.
+
+summary returns summary information of the fitted model, which 
is a list.
+The list includes the model's coefficients (features, 
coefficients,
+intercept and log(scale)).
+
+predict returns a SparkDataFrame containing predicted values
+on the original scale of the data (mean predicted value at scale = 1.0).
+
+
+
+Note
+
+spark.survreg since 2.0.0
+
+summary(AFTSurvivalRegressionModel) since 2.0.0
+
+predict(AFTSurvivalRegressionModel) since 2.0.0
+
+write.ml(AFTSurvivalRegressionModel, character) since 2.0.0
+
+
+
+See Also
+
+survival: https://cran.r-project.org/package=survival";>https://cran.r-project.org/package=survival
+
+write.ml
+
+
+
+Examples
+
+## Not run: 
+##D df <- createDataFrame(ovarian)
+##D model <- spark.survreg(df, Surv(futime, fustat) ~ ecog_ps + rx)
+##D 
+##D # get a summary of the model
+##D summary(model)
+##D 
+##D # make predictions
+##D predicted <- predict(model, df)
+##D showDF(predicted)
+##D 
+##D # save and load the model
+##D path <- "path/to/model"
+##D write.ml(model, path)
+##D savedModel <- read.ml(path)
+##D summary(savedModel)
+## End(Not run)
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/spark.svmLinear.html
--
diff --git a/site/docs/2.3.2/api/R/spark.svmLinear.html 
b/site/docs/2.3.2/api/R/spark.svmLinear.html
new file mode 100644
index 000..04b0674
--- /dev/null
+++ b/site/docs/2.3.2/api/R/spark.svmLinear.html
@@ -0,0 +1,177 @@
+R: Linear SVM Model

[23/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.output$.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.output$.html 
b/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.output$.html
new file mode 100644
index 000..3837c3b
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.output$.html
@@ -0,0 +1,325 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+InternalAccumulator.output$ (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class 
InternalAccumulator.output$
+
+
+
+Object
+
+
+org.apache.spark.InternalAccumulator.output$
+
+
+
+
+
+
+
+Enclosing class:
+InternalAccumulator
+
+
+
+public static class InternalAccumulator.output$
+extends Object
+
+
+
+
+
+
+
+
+
+
+
+Field Summary
+
+Fields 
+
+Modifier and Type
+Field and Description
+
+
+static InternalAccumulator.output$
+MODULE$
+Static reference to the singleton instance of this Scala 
object.
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+output$() 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+String
+BYTES_WRITTEN() 
+
+
+String
+RECORDS_WRITTEN() 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Field Detail
+
+
+
+
+
+MODULE$
+public static final InternalAccumulator.output$ MODULE$
+Static reference to the singleton instance of this Scala 
object.
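For Java readers, a short sketch of how such a Scala singleton is reached through the generated MODULE$ field; this is internal bookkeeping API, so the snippet only illustrates the access pattern and the example value in the comment is an assumption:

    import org.apache.spark.InternalAccumulator;

    public class ScalaObjectAccessExample {
      public static void printOutputMetricKeys() {
        // A Scala object compiles to a class holding a static MODULE$ singleton field.
        InternalAccumulator.output$ output = InternalAccumulator.output$.MODULE$;
        System.out.println(output.BYTES_WRITTEN());    // a metric key string, e.g. "internal.metrics.output.bytesWritten"
        System.out.println(output.RECORDS_WRITTEN());
      }
    }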
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+output$
+public output$()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+BYTES_WRITTEN
+public String BYTES_WRITTEN()
+
+
+
+
+
+
+
+RECORDS_WRITTEN
+public String RECORDS_WRITTEN()
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.shuffleRead$.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.shuffleRead$.html
 
b/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.shuffleRead$.html
new file mode 100644
index 000..0861499
--- /dev/null
+++ 
b/site/docs/2.3.2/api/java/org/apache/spark/InternalAccumulator.shuffleRead$.html
@@ -0,0 +1,390 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+InternalAccumulator.shuffleRead$ (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10,"i6":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+var methods = {"i0":5,"i1":9,"i2":5,"i3":5,"i4":5,"i5":9,"i6":9,"i7":9};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],4:["t3","Abstract Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class Resubmitted
+
+
+
+Object
+
+
+org.apache.spark.Resubmitted
+
+
+
+
+
+
+
+
+public class Resubmitted
+extends Object
+:: DeveloperApi ::
+ A org.apache.spark.scheduler.ShuffleMapTask that completed 
successfully earlier, but we
+ lost the executor before the stage completed. This means Spark needs to 
reschedule the task
+ to be re-executed on a different executor.
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+Resubmitted() 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Abstract Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+abstract static boolean
+canEqual(Object that) 
+
+
+static boolean
+countTowardsTaskFailures() 
+
+
+abstract static boolean
+equals(Object that) 
+
+
+abstract static int
+productArity() 
+
+
+abstract static Object
+productElement(int n) 
+
+
+static 
scala.collection.Iterator
+productIterator() 
+
+
+static String
+productPrefix() 
+
+
+static String
+toErrorString() 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+Resubmitted
+public Resubmitted()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+toErrorString
+public static String toErrorString()
+
+
+
+
+
+
+
+countTowardsTaskFailures
+public static boolean countTowardsTaskFailures()
+
+
+
+
+
+
+
+canEqual
+public abstract static boolean canEqual(Object that)
+
+
+
+
+
+
+
+equals
+public abstract static boolean equals(Object that)
+
+
+
+
+
+
+
+productElement
+public abstract static Object productElement(int n)
+
+
+
+
+
+
+
+productArity
+public abstract static int productArity()
+
+
+
+
+
+
+
+productIterator
+public 
static scala.collection.Iterator productIterator()
+
+
+
+
+
+
+
+productPrefix
+public static String productPrefix()
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SerializableWritable.html
--
diff --git 
a/site/docs/2.3.2/api/java/org/apache/spark/SerializableWritable.html 
b/site/docs/2.3.2/api/java/org/apache/spark/SerializableWritable.html
new file mode 100644
index 000..7639406
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/SerializableWritable.html
@@ -0,0 +1,310 @@
+SerializableWritable (Spark 2.3.2 JavaDoc)

[29/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/Accumulable.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/Accumulable.html 
b/site/docs/2.3.2/api/java/org/apache/spark/Accumulable.html
new file mode 100644
index 000..813e8c2
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/Accumulable.html
@@ -0,0 +1,489 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+Accumulable (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":42,"i1":42,"i2":42,"i3":42,"i4":42,"i5":42,"i6":42,"i7":42,"i8":42};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"],32:["t6","Deprecated Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class Accumulable
+
+
+
+Object
+
+
+org.apache.spark.Accumulable
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable
+
+
+Direct Known Subclasses:
+Accumulator
+
+
+Deprecated. 
+use AccumulatorV2. Since 
2.0.0.
+
+
+public class Accumulable<R,T>
+extends Object
+implements java.io.Serializable
+A data type that can be accumulated, i.e. has a commutative 
and associative "add" operation,
+ but where the result type, R, may be different from the element 
type being added, T.
+ 
+ You must define how to add data, and how to merge two of these together.  For 
some data types,
+ such as a counter, these might be the same operation. In that case, you can 
use the simpler
+ Accumulator. They won't always be the same, 
though -- e.g., imagine you are
+ accumulating a set. You will add items to the set, and you will union two 
sets together.
+ 
+ Operations are not thread-safe.
+ 
+ param:  id ID of this accumulator; for internal use only.
+ param:  initialValue initial value of accumulator
+ param:  param helper object defining how to add elements of type 
R and T
+ param:  name human-readable name for use in Spark's web UI
+ param:  countFailedValues whether to accumulate values from failed tasks. 
This is set to true
+  for system and time metrics like serialization time 
or bytes spilled,
+  and false for things with absolute values like 
number of input rows.
+  This should be used for internal metrics only.
+
+See Also:
+Serialized 
Form
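Since this class is deprecated, a sketch of how the same define-add-and-merge idea looks with the AccumulatorV2 replacement named in the deprecation note; the set-valued accumulator mirrors the set example in the description above, and the class name is made up:

    import java.util.HashSet;
    import java.util.Set;

    import org.apache.spark.util.AccumulatorV2;

    // Accumulates a set of strings: add() inserts one element, merge() unions two sets.
    public class SetAccumulator extends AccumulatorV2<String, Set<String>> {
      private final Set<String> values = new HashSet<>();

      @Override
      public boolean isZero() {
        return values.isEmpty();
      }

      @Override
      public AccumulatorV2<String, Set<String>> copy() {
        SetAccumulator copy = new SetAccumulator();
        copy.values.addAll(values);
        return copy;
      }

      @Override
      public void reset() {
        values.clear();
      }

      @Override
      public void add(String v) {
        values.add(v);
      }

      @Override
      public void merge(AccumulatorV2<String, Set<String>> other) {
        values.addAll(other.value());
      }

      @Override
      public Set<String> value() {
        return values;
      }
    }

Such an accumulator would typically be registered with SparkContext.register before being used inside transformations.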
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+Accumulable(R initialValue,
+   AccumulableParam param)
+Deprecated. 
+ 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Concrete Methods Deprecated Methods 
+
+Modifier and Type
+Method and Description
+
+
+void
+add(T term)
+Deprecated. 
+Add more data to this accumulator / accumulable
+
+
+
+long
+id()
+Deprecated. 
+ 
+
+
+R
+localValue()
+Deprecated. 
+Get the current value of this accumulator from within a 
task.
+
+
+
+void
+merge(R term)
+Deprecated. 
+Merge two accumulable objects together
+
+
+
+scala.Option
+name()
+Deprecated. 
+ 
+
+
+void
+setValue(R newValue)
+Deprecated. 
+Set the accumulator's value.
+
+
+
+String
+toString()
+Deprecated. 
+ 
+
+
+R
+value()
+Deprecated. 
+Access the accumulator's current value; only allowed on 
driver.
+
+
+
+R
+zero()
+Deprecated. 
+ 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+
+
+Accumulable
+public Accumulable(R initialValue,
+   AccumulableParam param)
+Deprecated. 
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+id
+public long id()
+Deprecated. 
+
+
+
+
+
+
+
+name
+public scala.Option name()
+Deprecated. 
+
+
+
+
+
+
+
+zero
+public R zero()
+Deprecated. 
+
+
+
+
+
+
+
+
+
+add
+public void add(T term)
+Deprecated. 
+Add more data to this accumulator / accumulable
+
+Parameters:
+term - the data to add
+
+
+
+
+
+
+
+
+
+
+merge
+public void merge(R term)
+Deprecated. 
+Merge two accumulable objects together
+ 
+ Normally, a user will not want to use this version, but will instead ca

[39/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/spark.gbt.html
--
diff --git a/site/docs/2.3.2/api/R/spark.gbt.html 
b/site/docs/2.3.2/api/R/spark.gbt.html
new file mode 100644
index 000..db3b126
--- /dev/null
+++ b/site/docs/2.3.2/api/R/spark.gbt.html
@@ -0,0 +1,257 @@
+R: Gradient Boosted Tree Model for Regression and Classification
+
+spark.gbt {SparkR}    R Documentation
+
+Gradient Boosted Tree Model for Regression and Classification
+
+Description
+
+spark.gbt fits a Gradient Boosted Tree Regression model or 
Classification model on a
+SparkDataFrame. Users can call summary to get a summary of the 
fitted
+Gradient Boosted Tree model, predict to make predictions on new 
data, and
+write.ml/read.ml to save/load fitted models.
+For more details, see
+http://spark.apache.org/docs/latest/ml-classification-regression.html#gradient-boosted-tree-regression";>
+GBT Regression and
+http://spark.apache.org/docs/latest/ml-classification-regression.html#gradient-boosted-tree-classifier";>
+GBT Classification
+
+
+
+Usage
+
+
+spark.gbt(data, formula, ...)
+
+## S4 method for signature 'SparkDataFrame,formula'
+spark.gbt(data, formula,
+  type = c("regression", "classification"), maxDepth = 5,
+  maxBins = 32, maxIter = 20, stepSize = 0.1, lossType = NULL,
+  seed = NULL, subsamplingRate = 1, minInstancesPerNode = 1,
+  minInfoGain = 0, checkpointInterval = 10, maxMemoryInMB = 256,
+  cacheNodeIds = FALSE, handleInvalid = c("error", "keep", "skip"))
+
+## S4 method for signature 'GBTRegressionModel'
+summary(object)
+
+## S3 method for class 'summary.GBTRegressionModel'
+print(x, ...)
+
+## S4 method for signature 'GBTClassificationModel'
+summary(object)
+
+## S3 method for class 'summary.GBTClassificationModel'
+print(x, ...)
+
+## S4 method for signature 'GBTRegressionModel'
+predict(object, newData)
+
+## S4 method for signature 'GBTClassificationModel'
+predict(object, newData)
+
+## S4 method for signature 'GBTRegressionModel,character'
+write.ml(object, path,
+  overwrite = FALSE)
+
+## S4 method for signature 'GBTClassificationModel,character'
+write.ml(object, path,
+  overwrite = FALSE)
+
+
+
+Arguments
+
+
+data
+
+a SparkDataFrame for training.
+
+formula
+
+a symbolic description of the model to be fitted. Currently only a few 
formula
+operators are supported, including '~', ':', '+', and '-'.
+
+...
+
+additional arguments passed to the method.
+
+type
+
+type of model, one of "regression" or "classification", 
to fit
+
+maxDepth
+
+Maximum depth of the tree (>= 0).
+
+maxBins
+
+Maximum number of bins used for discretizing continuous features and for 
choosing
+how to split on features at each node. More bins give higher granularity. Must 
be
+>= 2 and >= number of categories in any categorical feature.
+
+maxIter
+
+Param for maximum number of iterations (>= 0).
+
+stepSize
+
+Param for Step size to be used for each iteration of optimization.
+
+lossType
+
+Loss function which GBT tries to minimize.
+For classification, must be "logistic". For regression, must be one 
of
+"squared" (L2) and "absolute" (L1), default is 
"squared".
+
+seed
+
+integer seed for random number generation.
+
+subsamplingRate
+
+Fraction of the training data used for learning each decision tree, in
+range (0, 1].
+
+minInstancesPerNode
+
+Minimum number of instances each child must have after split. If a
+split causes the left or right child to have fewer than
+minInstancesPerNode, the split will be discarded as invalid. Should be
+>= 1.
+
+minInfoGain
+
+Minimum information gain for a split to be considered at a tree node.
+
+checkpointInterval
+
+Param for set checkpoint interval (>= 1) or disable checkpoint (-1).
+Note: this setting will be ignored if the checkpoint directory is not
+set.
+
+maxMemoryInMB
+
+Maximum memory in MB allocated to histogram aggregation.
+
+cacheNodeIds
+
+If FALSE, the algorithm will pass trees to executors to match instances with
+nodes. If TRUE, the algorithm will cache node IDs for each instance. Caching
+can speed up training of deeper trees. Users can set how often should the
+cache be checkpointed or disable it by setting checkpointInterval.
+
+handleInvalid
+
+How to handle invalid data (unseen labels or NULL values) in features and
+label column of string type in classification model.
+Supported options: "skip" (filter out rows with invalid data),
+"error" (throw an error), "keep" (put invalid data in
+a special additional bucket, at index numLabels). Default
+is "error".
+
+object
+
+A fitted Gradient Boosted T

[24/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/FetchFailed.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/FetchFailed.html 
b/site/docs/2.3.2/api/java/org/apache/spark/FetchFailed.html
new file mode 100644
index 000..263284c
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/FetchFailed.html
@@ -0,0 +1,483 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+FetchFailed (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":10,"i1":5,"i2":10,"i3":5,"i4":10,"i5":10,"i6":5,"i7":5,"i8":9,"i9":9,"i10":10,"i11":10,"i12":10};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],4:["t3","Abstract 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class FetchFailed
+
+
+
+Object
+
+
+org.apache.spark.FetchFailed
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, TaskEndReason, TaskFailedReason, scala.Equals, scala.Product
+
+
+
+public class FetchFailed
+extends Object
+implements TaskFailedReason, scala.Product, 
scala.Serializable
+:: DeveloperApi ::
+ Task failed to fetch shuffle data from a remote node. Probably means we have 
lost the remote
+ executors the task is trying to fetch from, and thus need to rerun the 
previous stage.
+
+See Also:
+Serialized 
Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+FetchFailed(BlockManagerId bmAddress,
+   int shuffleId,
+   int mapId,
+   int reduceId,
+   String message) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Abstract Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+BlockManagerId
+bmAddress() 
+
+
+abstract static boolean
+canEqual(Object that) 
+
+
+boolean
+countTowardsTaskFailures()
+Fetch failures lead to a different failure handling path: 
(1) we don't abort the stage after
+ 4 task failures, instead we immediately go back to the stage which generated 
the map output,
+ and regenerate the missing data.
+
+
+
+abstract static boolean
+equals(Object that) 
+
+
+int
+mapId() 
+
+
+String
+message() 
+
+
+abstract static int
+productArity() 
+
+
+abstract static Object
+productElement(int n) 
+
+
+static 
scala.collection.Iterator
+productIterator() 
+
+
+static String
+productPrefix() 
+
+
+int
+reduceId() 
+
+
+int
+shuffleId() 
+
+
+String
+toErrorString()
+Error message displayed in the web UI.
+
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface scala.Product
+productArity, productElement, productIterator, productPrefix
+
+
+
+
+
+Methods inherited from interface scala.Equals
+canEqual, equals
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+FetchFailed
+public FetchFailed(BlockManagerId bmAddress,
+   int shuffleId,
+   int mapId,
+   int reduceId,
+   String message)
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+canEqual
+public abstract static boolean canEqual(Object that)
+
+
+
+
+
+
+
+equals
+public abstract static boolean equals(Object that)
+
+
+
+
+
+
+
+productElement
+public abstract static Object productElement(int n)
+
+
+
+
+
+
+
+productArity
+public abstract static int productArity()
+
+
+
+
+
+
+
+productIterator
+public 
static scala.collection.Iterator productIterator()
+
+
+
+
+
+
+
+productPrefix
+public static String productPrefix()
+
+
+
+
+
+
+
+bmAddress
+public BlockManagerId bmAddress()
+
+
+
+
+
+
+
+shuffleId
+public int shuffleId()
+
+
+
+
+
+
+
+mapId
+public int mapId()
+
+
+
+
+
+
+
+reduceId
+public int reduceId()
+
+
+
+
+
+
+
+message
+public String message()
+
+
+
+
+
+
+
+toErrorString
+public String toErrorString()
+Description copied from 
interface: TaskFailedReason
+Error message displayed in the web UI.
+
+Specified by:
+toErrorString in
 interface TaskFailedReason
+
+
+
+
+
+
+
+


[22/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/NarrowDependency.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/NarrowDependency.html 
b/site/docs/2.3.2/api/java/org/apache/spark/NarrowDependency.html
new file mode 100644
index 000..968e594
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/NarrowDependency.html
@@ -0,0 +1,315 @@
+NarrowDependency (Spark 2.3.2 JavaDoc)
+
+org.apache.spark
+Class 
NarrowDependency
+
+
+
+Object
+
+
+org.apache.spark.Dependency
+
+
+org.apache.spark.NarrowDependency
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable
+
+
+Direct Known Subclasses:
+OneToOneDependency, RangeDependency
+
+
+
+public abstract class NarrowDependency<T>
+extends Dependency<T>
+:: DeveloperApi ::
+ Base class for dependencies where each partition of the child RDD depends on 
a small number
+ of partitions of the parent RDD. Narrow dependencies allow for pipelined 
execution.
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+NarrowDependency(RDD _rdd) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Instance Methods Abstract Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+abstract 
scala.collection.Seq
+getParents(int partitionId)
+Get the parent partitions for a child partition.
+
+
+
+RDD
+rdd() 
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+NarrowDependency
+public NarrowDependency(RDD _rdd)
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+getParents
+public 
abstract scala.collection.Seq getParents(int partitionId)
+Get the parent partitions for a child partition.
+
+Parameters:
+partitionId - a partition of the child RDD
+Returns:
+the partitions of the parent RDD that the child partition depends upon
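To make the contract concrete, a hedged sketch of a one-to-one style narrow dependency written against the Java-visible signature above (each child partition depends only on the parent partition with the same index); the class name is invented, and the built-in OneToOneDependency already covers this case in practice:

    import java.util.Collections;
    import java.util.List;

    import org.apache.spark.NarrowDependency;
    import org.apache.spark.rdd.RDD;
    import scala.collection.JavaConverters;
    import scala.collection.Seq;

    // Child partition i depends only on parent partition i.
    public class SameIndexDependency<T> extends NarrowDependency<T> {
      public SameIndexDependency(RDD<T> rdd) {
        super(rdd);
      }

      @Override
      public Seq<Object> getParents(int partitionId) {
        // Seq[Int] erases to Seq<Object> on the Java side, hence the boxing here.
        List<Object> parents = Collections.<Object>singletonList(partitionId);
        return JavaConverters.asScalaBufferConverter(parents).asScala();
      }
    }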
+
+
+
+
+
+
+
+
+rdd
+public RDD rdd()
+
+Specified by:
+rdd in 
class Dependency
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/OneToOneDependency.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/OneToOneDependency.html 
b/site/docs/2.3.2/api/java/org/apache/spark/OneToOneDependency.html
new file mode 100644
index 000..f1ac707
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/OneToOneDependency.html
@@ -0,0 +1,308 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+OneToOneDependency (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = {"i0":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecate

[32/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/index-all.html
--
diff --git a/site/docs/2.3.2/api/java/index-all.html 
b/site/docs/2.3.2/api/java/index-all.html
new file mode 100644
index 000..0782b81
--- /dev/null
+++ b/site/docs/2.3.2/api/java/index-all.html
@@ -0,0 +1,51649 @@
+Index (Spark 2.3.2 JavaDoc)
+
+$ A B C D E F G H I J K L M N O P Q R S T U V W X Y Z _ 
+
+
+$
+
+$colon$bslash(B,
 Function2) - Static method in class 
org.apache.spark.sql.types.StructType
+ 
+$colon$plus(B,
 CanBuildFrom) - Static method in class 
org.apache.spark.sql.types.StructType
+ 
+$div$colon(B,
 Function2) - Static method in class 
org.apache.spark.sql.types.StructType
+ 
+$greater(A)
 - Static method in class org.apache.spark.sql.types.Decimal
+ 
+$greater(A)
 - Static method in class org.apache.spark.storage.RDDInfo
+ 
+$greater$eq(A)
 - Static method in class org.apache.spark.sql.types.Decimal
+ 
+$greater$eq(A)
 - Static method in class org.apache.spark.storage.RDDInfo
+ 
+$less(A) 
- Static method in class org.apache.spark.sql.types.Decimal
+ 
+$less(A) - 
Static method in class org.apache.spark.storage.RDDInfo
+ 
+$less$eq(A)
 - Static method in class org.apache.spark.sql.types.Decimal
+ 
+$less$eq(A)
 - Static method in class org.apache.spark.storage.RDDInfo
+ 
+$minus$greater(T)
 - Static method in class org.apache.spark.ml.param.DoubleParam
+ 
+$minus$greater(T)
 - Static method in class org.apache.spark.ml.param.FloatParam
+ 
+$plus$colon(B,
 CanBuildFrom) - Static method in class 
org.apache.spark.sql.types.StructType
+ 
+$plus$eq(T)
 - Static method in class org.apache.spark.Accumulator
+
+Deprecated.
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.api.r.RRDD
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.graphx.EdgeRDD
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.graphx.impl.EdgeRDDImpl
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.graphx.impl.VertexRDDImpl
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.graphx.VertexRDD
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.rdd.HadoopRDD
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.rdd.JdbcRDD
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.rdd.NewHadoopRDD
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.rdd.PartitionPruningRDD
+ 
+$plus$plus(RDD)
 - Static method in class org.apache.spark.rdd.UnionRDD
+ 
+$plus$plus(GenTraversableOnce,
 CanBuildFrom) - Static method in class 
org.apache.spark.sql.types.StructType
+ 
+$plus$plus$colon(TraversableOnce,
 CanBuildFrom) - Static method in class 
org.apache.spark.sql.types.StructType
+ 
+$plus$plus$colon(Traversable,
 CanBuildFrom) - Static method in class 
org.apache.spark.sql.types.StructType
+ 
+$plus$plus$eq(R)
 - Static method in class org.apache.spark.Accumulator
+
+Deprecated.
+ 
+
+
+
+
+A
+
+abort(WriterCommitMessage[])
 - Method in interface org.apache.spark.sql.sources.v2.writer.DataSourceWriter
+
+Aborts this writing job because some data writers are 
failed and keep failing when retry, or
+ the Spark job fails with some unknown reasons, or DataSourceWriter.commit(WriterCommitMessage[])
 fails.
+
+abort()
 - Method in interface org.apache.spark.sql.sources.v2.writer.DataWriter
+
+Aborts this writer if it is failed.
+
+abort(long,
 WriterCommitMessage[]) - Method in interface 
org.apache.spark.sql.sources.v2.writer.streaming.StreamWriter
+
+Aborts this writing job because some data writers are 
failed and keep failing when retry, or
+ the Spark job fails with some unknown reasons, or StreamWriter.commit(WriterCommitMessage[])
 fails.
+
+abort(WriterCommitMessage[])
 - Method in interface org.apache.spark.sql.sources.v2.writer.streaming.StreamWriter
+ 
+abortJob(JobContext)
 - Method in class org.apache.spark.internal.io.FileCommitProtocol
+
+Aborts a job after the writes fail.
+
+abortJob(JobContext)
 - Method in class org.apache.spark.internal.io.HadoopMapReduceCommitProtocol
+ 
+abortTask(TaskAttemptContext)
 - Method in class org.apache.spark.

[34/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/allclasses-noframe.html
--
diff --git a/site/docs/2.3.2/api/java/allclasses-noframe.html 
b/site/docs/2.3.2/api/java/allclasses-noframe.html
new file mode 100644
index 000..ab206fd
--- /dev/null
+++ b/site/docs/2.3.2/api/java/allclasses-noframe.html
@@ -0,0 +1,1300 @@
+All Classes (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+All Classes
+
+
+AbsoluteError
+AbstractLauncher
+Accumulable
+AccumulableInfo
+AccumulableInfo
+AccumulableParam
+Accumulator
+AccumulatorContext
+AccumulatorParam
+AccumulatorParam.DoubleAccumulatorParam$
+AccumulatorParam.FloatAccumulatorParam$
+AccumulatorParam.IntAccumulatorParam$
+AccumulatorParam.LongAccumulatorParam$
+AccumulatorParam.StringAccumulatorParam$
+AccumulatorV2
+AFTAggregator
+AFTCostFun
+AFTSurvivalRegression
+AFTSurvivalRegressionModel
+AggregatedDialect
+AggregatingEdgeContext
+Aggregator
+Aggregator
+Algo
+AllJobsCancelled
+AllReceiverIds
+ALS
+ALS
+ALS.InBlock$
+ALS.Rating
+ALS.Rating$
+ALS.RatingBlock$
+ALSModel
+AnalysisException
+And
+AnyDataType
+ApiHelper
+ApplicationAttemptInfo
+ApplicationEnvironmentInfo
+ApplicationInfo
+ApplicationStatus
+ApplyInPlace
+AppStatusUtils
+AreaUnderCurve
+ArrayType
+ArrowColumnVector
+AskPermissionToCommitOutput
+AssociationRules
+AssociationRules
+AssociationRules.Rule
+AsyncEventQueue
+AsyncRDDActions
+Attribute
+AttributeGroup
+AttributeKeys
+AttributeType
+BaseRelation
+BaseRRDD
+BasicBlockReplicationPolicy
+BatchInfo
+BatchInfo
+BatchStatus
+BernoulliCellSampler
+BernoulliSampler
+Binarizer
+BinaryAttribute
+BinaryClassificationEvaluator
+BinaryClassificationMetrics
+BinaryLogisticRegressionSummary
+BinaryLogisticRegressionSummaryImpl
+BinaryLogisticRegressionTrainingSummary
+BinaryLogisticRegressionTrainingSummaryImpl
+BinarySample
+BinaryType
+BinomialBounds
+BisectingKMeans
+BisectingKMeans
+BisectingKMeansModel
+BisectingKMeansModel
+BisectingKMeansModel.SaveLoadV1_0$
+BisectingKMeansSummary
+BlacklistedExecutor
+BLAS
+BLAS
+BlockId
+BlockManagerId
+BlockManagerMessages
+BlockManagerMessages.BlockLocationsAndStatus
+BlockManagerMessages.BlockLocationsAndStatus$
+BlockManagerMessages.BlockManagerHeartbeat
+BlockManagerMessages.BlockManagerHeartbeat$
+BlockManagerMessages.GetBlockStatus
+BlockManagerMessages.GetBlockStatus$
+BlockManagerMessages.GetExecutorEndpointRef
+BlockManagerMessages.GetExecutorEndpointRef$
+BlockManagerMessages.GetLocations
+BlockManagerMessages.GetLocations$
+BlockManagerMessages.GetLocationsAndStatus
+BlockManagerMessages.GetLocationsAndStatus$
+BlockManagerMessages.GetLocationsMultipleBlockIds
+BlockManagerMessages.GetLocationsMultipleBlockIds$
+BlockManagerMessages.GetMatchingBlockIds
+BlockManagerMessages.GetMatchingBlockIds$
+BlockManagerMessages.GetMemoryStatus$
+BlockManagerMessages.GetPeers
+BlockManagerMessages.GetPeers$
+BlockManagerMessages.GetStorageStatus$
+BlockManagerMessages.HasCachedBlocks
+BlockManagerMessages.HasCachedBlocks$
+BlockManagerMessages.RegisterBlockManager
+BlockManagerMessages.RegisterBlockManager$
+BlockManagerMessages.RemoveBlock
+BlockManagerMessages.RemoveBlock$
+BlockManagerMessages.RemoveBroadcast
+BlockManagerMessages.RemoveBroadcast$
+BlockManagerMessages.RemoveExecutor
+BlockManagerMessages.RemoveExecutor$
+BlockManagerMessages.RemoveRdd
+BlockManagerMessages.RemoveRdd$
+BlockManagerMessages.RemoveShuffle
+BlockManagerMessages.RemoveShuffle$
+BlockManagerMessages.ReplicateBlock
+BlockManagerMessages.ReplicateBlock$
+BlockManagerMessages.StopBlockManagerMaster$
+BlockManagerMessages.ToBlockManagerMaster
+BlockManagerMessages.ToBlockManagerSlave
+BlockManagerMessages.TriggerThreadDump$
+BlockManagerMessages.UpdateBlockInfo
+BlockManagerMessages.UpdateBlockInfo$
+BlockMatrix
+BlockNotFoundException
+BlockReplicationPolicy
+BlockReplicationUtils
+BlockStatus
+BlockUpdatedInfo
+BloomFilter
+BloomFilter.Version
+BooleanParam
+BooleanType
+BoostingStrategy
+BoundedDouble
+BreezeUtil
+Broadcast
+BroadcastBlockId
+BucketedRandomProjectionLSH
+BucketedRandomProjectionLSHModel
+Bucketizer
+BufferReleasingInputStream
+BytecodeUtils
+ByteType
+CalendarIntervalType
+Catalog
+CatalystScan
+CategoricalSplit
+CausedBy
+CharType
+CheckpointReader
+CheckpointState
+ChiSqSelector
+ChiSqSelector
+ChiSqSelectorModel
+ChiSqSelectorModel
+ChiSqSelectorModel.SaveLoadV1_0$
+ChiSqTest
+ChiSqTest.Method
+ChiSqTest.Method$
+ChiSqTest.NullHypothesis$
+ChiSqTestResult
+ChiSquareTest
+CholeskyDecomposition
+ClassificationModel
+ClassificationModel
+Classifier
+CleanAccum
+CleanBroadcast
+CleanCheckpoint
+CleanRDD
+CleanShuffle
+CleanupTask
+CleanupTaskWeakReference
+ClosureCleaner
+ClusteredDistribution
+ClusteringEvaluator
+ClusteringSummary
+CoarseGrainedClusterMessages
+CoarseGrainedClusterMessages.AddWebUIFilter
+CoarseGrainedClusterMessages.AddWebUIFilter$
+CoarseGrainedClusterMess

[25/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/ExceptionFailure.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/ExceptionFailure.html 
b/site/docs/2.3.2/api/java/org/apache/spark/ExceptionFailure.html
new file mode 100644
index 000..6f8cbba
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/ExceptionFailure.html
@@ -0,0 +1,502 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+ExceptionFailure (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":10,"i1":5,"i2":10,"i3":9,"i4":10,"i5":5,"i6":10,"i7":10,"i8":5,"i9":5,"i10":9,"i11":9,"i12":10,"i13":10};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],4:["t3","Abstract 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class ExceptionFailure
+
+
+
+Object
+
+
+org.apache.spark.ExceptionFailure
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, TaskEndReason, TaskFailedReason, scala.Equals, scala.Product
+
+
+
+public class ExceptionFailure
+extends Object
+implements TaskFailedReason, scala.Product, 
scala.Serializable
+:: DeveloperApi ::
+ Task failed due to a runtime exception. This is the most common failure case and also captures user program exceptions.
+ 
+ stackTrace contains the stack trace of the exception itself. It still exists for backward compatibility. It's better to use this(e: Throwable, metrics: Option[TaskMetrics]) to create ExceptionFailure, as it will handle the backward compatibility properly.
+ 
+ fullStackTrace is a better representation of the stack trace because it contains the whole stack trace, including the exception and its causes.
+ 
+ exception is the actual exception that caused the task to fail. It may be None in the case that the exception is not in fact serializable. If a task fails more than once (due to retries), exception is the one that caused the last failure.
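
For illustration only (not part of this docs commit): a minimal, hedged sketch of how user code can observe an ExceptionFailure, by registering a SparkListener and matching on the task-end reason. The listener name is made up for the example.

```scala
import org.apache.spark.{ExceptionFailure, Success}
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Minimal sketch: log the class name, description and full stack trace of any
// task that ends with an ExceptionFailure. Register it with
// sparkContext.addSparkListener(new TaskFailureLogger).
class TaskFailureLogger extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    taskEnd.reason match {
      case ef: ExceptionFailure =>
        // fullStackTrace carries the whole chain, including causes.
        println(s"Task failed with ${ef.className}: ${ef.description}")
        println(ef.fullStackTrace)
      case Success => // task finished normally; nothing to do
      case other   => println(s"Task ended with reason: $other")
    }
  }
}
```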
+
+See Also:
+Serialized
 Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+ExceptionFailure(String className,
+String description,
+StackTraceElement[] stackTrace,
+String fullStackTrace,
+
scala.Option exceptionWrapper,
+scala.collection.Seq accumUpdates,
+scala.collection.Seq> accums) 
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Abstract Methods Concrete Methods 
+
+Modifier and Type
+Method and Description
+
+
+scala.collection.Seq
+accumUpdates() 
+
+
+abstract static boolean
+canEqual(Object that) 
+
+
+String
+className() 
+
+
+static boolean
+countTowardsTaskFailures() 
+
+
+String
+description() 
+
+
+abstract static boolean
+equals(Object that) 
+
+
+scala.Option
+exception() 
+
+
+String
+fullStackTrace() 
+
+
+abstract static int
+productArity() 
+
+
+abstract static Object
+productElement(int n) 
+
+
+static 
scala.collection.Iterator
+productIterator() 
+
+
+static String
+productPrefix() 
+
+
+StackTraceElement[]
+stackTrace() 
+
+
+String
+toErrorString()
+Error message displayed in the web UI.
+
+
+
+
+
+
+
+Methods inherited from class Object
+equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+Methods inherited from interface org.apache.spark.TaskFailedReason
+countTowardsTaskFailures
+
+
+
+
+
+Methods inherited from interface scala.Product
+productArity, productElement, productIterator, productPrefix
+
+
+
+
+
+Methods inherited from interface scala.Equals
+canEqual, equals
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+ExceptionFailure
+public ExceptionFailure(String className,
+String description,
+StackTraceElement[] stackTrace,
+String fullStackTrace,
+
scala.Option

[19/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SparkContext.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/SparkContext.html 
b/site/docs/2.3.2/api/java/org/apache/spark/SparkContext.html
new file mode 100644
index 000..3ffb13a
--- /dev/null
+++ b/site/docs/2.3.2/api/java/org/apache/spark/SparkContext.html
@@ -0,0 +1,3117 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+SparkContext (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+
+var methods = 
{"i0":42,"i1":42,"i2":42,"i3":42,"i4":42,"i5":10,"i6":10,"i7":10,"i8":10,"i9":10,"i10":10,"i11":10,"i12":10,"i13":10,"i14":10,"i15":10,"i16":10,"i17":10,"i18":10,"i19":10,"i20":10,"i21":10,"i22":10,"i23":10,"i24":10,"i25":10,"i26":10,"i27":10,"i28":10,"i29":10,"i30":10,"i31":10,"i32":10,"i33":10,"i34":10,"i35":10,"i36":42,"i37":10,"i38":9,"i39":9,"i40":10,"i41":10,"i42":10,"i43":10,"i44":10,"i45":10,"i46":10,"i47":10,"i48":10,"i49":10,"i50":10,"i51":9,"i52":9,"i53":10,"i54":10,"i55":10,"i56":10,"i57":10,"i58":10,"i59":10,"i60":10,"i61":10,"i62":10,"i63":10,"i64":10,"i65":10,"i66":10,"i67":10,"i68":10,"i69":10,"i70":10,"i71":10,"i72":10,"i73":10,"i74":10,"i75":10,"i76":10,"i77":10,"i78":10,"i79":10,"i80":10,"i81":10,"i82":10,"i83":10,"i84":10,"i85":10,"i86":10,"i87":10,"i88":10,"i89":10,"i90":10,"i91":10,"i92":10,"i93":10,"i94":10,"i95":10,"i96":10,"i97":10,"i98":10,"i99":10,"i100":10,"i101":10,"i102":10};
+var tabs = {65535:["t0","All Methods"],1:["t1","Static 
Methods"],2:["t2","Instance Methods"],8:["t4","Concrete 
Methods"],32:["t6","Deprecated Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+Prev Class
+Next Class
+
+
+Frames
+No Frames
+
+
+All Classes
+
+
+
+
+
+
+
+Summary: 
+Nested | 
+Field | 
+Constr | 
+Method
+
+
+Detail: 
+Field | 
+Constr | 
+Method
+
+
+
+
+
+
+
+
+org.apache.spark
+Class SparkContext
+
+
+
+Object
+
+
+org.apache.spark.SparkContext
+
+
+
+
+
+
+
+All Implemented Interfaces:
+Logging
+
+
+
+public class SparkContext
+extends Object
+implements Logging
+Main entry point for Spark functionality. A SparkContext represents the connection to a Spark cluster, and can be used to create RDDs, accumulators and broadcast variables on that cluster.
+ 
+ Only one SparkContext may be active per JVM. You must stop() the active SparkContext before creating a new one. This limitation may eventually be removed; see SPARK-2243 for more details.
+ 
+ param:  config a Spark Config object describing the application configuration. Any settings in this config override the default configs as well as system properties.
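
For illustration only (not part of this docs commit): a minimal, hedged example of creating and stopping a SparkContext; the app name and master URL are placeholders.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Only one SparkContext may be active per JVM; stop() it before creating another.
val conf = new SparkConf()
  .setAppName("example-app")    // placeholder app name
  .setMaster("local[2]")        // placeholder master; spark-submit normally supplies this
val sc = new SparkContext(conf) // settings in conf override defaults and system properties

val rdd = sc.parallelize(1 to 100)
println(rdd.sum())

sc.stop()
```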
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors 
+
+Constructor and Description
+
+
+SparkContext()
+Create a SparkContext that loads settings from system 
properties (for instance, when
+ launching with ./bin/spark-submit).
+
+
+
+SparkContext(SparkConf config) 
+
+
+SparkContext(String master,
+String appName,
+SparkConf conf)
+Alternative constructor that allows setting common Spark 
properties directly
+
+
+
+SparkContext(String master,
+String appName,
+String sparkHome,
+scala.collection.Seq jars,
+scala.collection.Map environment)
+Alternative constructor that allows setting common Spark 
properties directly
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All Methods Static Methods Instance Methods Concrete Methods Deprecated Methods 
+
+Modifier and Type
+Method and Description
+
+
+ Accumulable
+accumulable(R initialValue,
+   AccumulableParam param)
+Deprecated. 
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+
+ Accumulable
+accumulable(R initialValue,
+   String name,
+   AccumulableParam param)
+Deprecated. 
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+
+ Accumulable
+accumulableCollection(R initialValue,
+ 
scala.Function1> evidence$9,
+ scala.reflect.ClassTag evidence$10)
+Deprecated. 
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+
+ Accumulator
+accumulator(T initialValue,
+   AccumulatorParam param)
+Deprecated. 
+use AccumulatorV2. Since 
2.0.0.
+
+
+
+
+ Accumulator
+accumulator(T ini

[35/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/allclasses-frame.html
--
diff --git a/site/docs/2.3.2/api/java/allclasses-frame.html 
b/site/docs/2.3.2/api/java/allclasses-frame.html
new file mode 100644
index 000..ec0215c
--- /dev/null
+++ b/site/docs/2.3.2/api/java/allclasses-frame.html
@@ -0,0 +1,1300 @@
+http://www.w3.org/TR/html4/loose.dtd";>
+
+
+
+
+All Classes (Spark 2.3.2 JavaDoc)
+
+
+
+
+
+All Classes
+
+
+AbsoluteError
+AbstractLauncher
+Accumulable
+AccumulableInfo
+AccumulableInfo
+AccumulableParam
+Accumulator
+AccumulatorContext
+AccumulatorParam
+AccumulatorParam.DoubleAccumulatorParam$
+AccumulatorParam.FloatAccumulatorParam$
+AccumulatorParam.IntAccumulatorParam$
+AccumulatorParam.LongAccumulatorParam$
+AccumulatorParam.StringAccumulatorParam$
+AccumulatorV2
+AFTAggregator
+AFTCostFun
+AFTSurvivalRegression
+AFTSurvivalRegressionModel
+AggregatedDialect
+AggregatingEdgeContext
+Aggregator
+Aggregator
+Algo
+AllJobsCancelled
+AllReceiverIds
+ALS
+ALS
+ALS.InBlock$
+ALS.Rating
+ALS.Rating$
+ALS.RatingBlock$
+ALSModel
+AnalysisException
+And
+AnyDataType
+ApiHelper
+ApplicationAttemptInfo
+ApplicationEnvironmentInfo
+ApplicationInfo
+ApplicationStatus
+ApplyInPlace
+AppStatusUtils
+AreaUnderCurve
+ArrayType
+ArrowColumnVector
+AskPermissionToCommitOutput
+AssociationRules
+AssociationRules
+AssociationRules.Rule
+AsyncEventQueue
+AsyncRDDActions
+Attribute
+AttributeGroup
+AttributeKeys
+AttributeType
+BaseRelation
+BaseRRDD
+BasicBlockReplicationPolicy
+BatchInfo
+BatchInfo
+BatchStatus
+BernoulliCellSampler
+BernoulliSampler
+Binarizer
+BinaryAttribute
+BinaryClassificationEvaluator
+BinaryClassificationMetrics
+BinaryLogisticRegressionSummary
+BinaryLogisticRegressionSummaryImpl
+BinaryLogisticRegressionTrainingSummary
+BinaryLogisticRegressionTrainingSummaryImpl
+BinarySample
+BinaryType
+BinomialBounds
+BisectingKMeans
+BisectingKMeans
+BisectingKMeansModel
+BisectingKMeansModel
+BisectingKMeansModel.SaveLoadV1_0$
+BisectingKMeansSummary
+BlacklistedExecutor
+BLAS
+BLAS
+BlockId
+BlockManagerId
+BlockManagerMessages
+BlockManagerMessages.BlockLocationsAndStatus
+BlockManagerMessages.BlockLocationsAndStatus$
+BlockManagerMessages.BlockManagerHeartbeat
+BlockManagerMessages.BlockManagerHeartbeat$
+BlockManagerMessages.GetBlockStatus
+BlockManagerMessages.GetBlockStatus$
+BlockManagerMessages.GetExecutorEndpointRef
+BlockManagerMessages.GetExecutorEndpointRef$
+BlockManagerMessages.GetLocations
+BlockManagerMessages.GetLocations$
+BlockManagerMessages.GetLocationsAndStatus
+BlockManagerMessages.GetLocationsAndStatus$
+BlockManagerMessages.GetLocationsMultipleBlockIds
+BlockManagerMessages.GetLocationsMultipleBlockIds$
+BlockManagerMessages.GetMatchingBlockIds
+BlockManagerMessages.GetMatchingBlockIds$
+BlockManagerMessages.GetMemoryStatus$
+BlockManagerMessages.GetPeers
+BlockManagerMessages.GetPeers$
+BlockManagerMessages.GetStorageStatus$
+BlockManagerMessages.HasCachedBlocks
+BlockManagerMessages.HasCachedBlocks$
+BlockManagerMessages.RegisterBlockManager
+BlockManagerMessages.RegisterBlockManager$
+BlockManagerMessages.RemoveBlock
+BlockManagerMessages.RemoveBlock$
+BlockManagerMessages.RemoveBroadcast
+BlockManagerMessages.RemoveBroadcast$
+BlockManagerMessages.RemoveExecutor
+BlockManagerMessages.RemoveExecutor$
+BlockManagerMessages.RemoveRdd
+BlockManagerMessages.RemoveRdd$
+BlockManagerMessages.RemoveShuffle
+BlockManagerMessages.RemoveShuffle$
+BlockManagerMessages.ReplicateBlock
+BlockManagerMessages.ReplicateBlock$
+BlockManagerMessages.StopBlockManagerMaster$
+BlockManagerMessages.ToBlockManagerMaster
+BlockManagerMessages.ToBlockManagerSlave
+BlockManagerMessages.TriggerThreadDump$
+BlockManagerMessages.UpdateBlockInfo
+BlockManagerMessages.UpdateBlockInfo$
+BlockMatrix
+BlockNotFoundException
+BlockReplicationPolicy
+BlockReplicationUtils
+BlockStatus
+BlockUpdatedInfo
+BloomFilter
+BloomFilter.Version
+BooleanParam
+BooleanType
+BoostingStrategy
+BoundedDouble
+BreezeUtil
+Broadcast
+BroadcastBlockId
+BucketedRandomProjectionLSH
+BucketedRandomProjectionLSHModel
+Bucketizer
+BufferReleasingInputStream
+BytecodeUtils
+ByteType
+CalendarIntervalType
+Catalog
+CatalystScan
+CategoricalSplit
+CausedBy
+CharType
+CheckpointReader
+CheckpointState
+ChiSqSelector
+ChiSqSelector
+ChiSqSelectorModel
+ChiSqSelectorModel
+ChiSqSelectorModel.SaveLoadV1_0$
+ChiSqTest
+ChiSqTest.Method
+ChiSqTest.Method$
+ChiSqTest.NullHypothesis$
+ChiSqTestResult
+ChiSquareTest
+CholeskyDecomposition
+ClassificationModel
+ClassificationModel
+Classifier
+CleanAccum
+CleanBroadcast
+CleanCheckpoint
+CleanRDD
+CleanShuffle
+CleanupTask
+CleanupTaskWeakReference
+ClosureCleaner
+ClusteredDistribution
+ClusteringEvaluator
+ClusteringSummary
+CoarseGrainedClusterMessages
+CoarseGrainedClusterMessages.AddWebUIFilter
+CoarseGrainedClusterMessages.AddWebUIFilter$
+CoarseGrainedClusterMessages.Get

[49/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/00Index.html
--
diff --git a/site/docs/2.3.2/api/R/00Index.html 
b/site/docs/2.3.2/api/R/00Index.html
new file mode 100644
index 000..ec589d2
--- /dev/null
+++ b/site/docs/2.3.2/api/R/00Index.html
@@ -0,0 +1,1865 @@
+http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd";>
+http://www.w3.org/1999/xhtml";>
+R: R Frontend for Apache Spark
+
+
+
+ R Frontend for Apache Spark
+http://stat.ethz.ch/R-manual/R-devel/doc/html/logo.jpg"; alt="[R logo]" />
+
+
+
+http://stat.ethz.ch/R-manual/R-devel/doc/html/packages.html";>http://stat.ethz.ch/R-manual/R-devel/doc/html/left.jpg"; 
alt="[Up]" />
+http://stat.ethz.ch/R-manual/R-devel/doc/html/index.html";>http://stat.ethz.ch/R-manual/R-devel/doc/html/up.jpg"; 
alt="[Top]" />
+Documentation for package ‘SparkR’ version 2.3.2
+
+DESCRIPTION file.
+
+
+Help Pages
+
+
+
+A
+B
+C
+D
+E
+F
+G
+H
+I
+J
+K
+L
+M
+N
+O
+P
+Q
+R
+S
+T
+U
+V
+W
+Y
+misc
+
+
+
+-- A --
+
+
+abs
+Math functions for Column operations
+abs-method
+Math functions for Column operations
+acos
+Math functions for Column operations
+acos-method
+Math functions for Column operations
+add_months
+Date time arithmetic functions for Column operations
+add_months-method
+Date time arithmetic functions for Column operations
+AFTSurvivalRegressionModel-class
+S4 class that represents a AFTSurvivalRegressionModel
+agg
+summarize
+agg-method
+summarize
+alias
+alias
+alias-method
+alias
+ALSModel-class
+S4 class that represents an ALSModel
+approxCountDistinct
+Aggregate functions for Column operations
+approxCountDistinct-method
+Aggregate functions for Column operations
+approxQuantile
+Calculates the approximate quantiles of numerical columns of a 
SparkDataFrame
+approxQuantile-method
+Calculates the approximate quantiles of numerical columns of a 
SparkDataFrame
+arrange
+Arrange Rows by Variables
+arrange-method
+Arrange Rows by Variables
+array_contains
+Collection functions for Column operations
+array_contains-method
+Collection functions for Column operations
+as.data.frame
+Download data from a SparkDataFrame into a R data.frame
+as.data.frame-method
+Download data from a SparkDataFrame into a R data.frame
+as.DataFrame
+Create a SparkDataFrame
+as.DataFrame.default
+Create a SparkDataFrame
+asc
+A set of operations working with SparkDataFrame columns
+ascii
+String functions for Column operations
+ascii-method
+String functions for Column operations
+asin
+Math functions for Column operations
+asin-method
+Math functions for Column operations
+associationRules-method
+FP-growth
+atan
+Math functions for Column operations
+atan-method
+Math functions for Column operations
+atan2
+Math functions for Column operations
+atan2-method
+Math functions for Column operations
+attach
+Attach SparkDataFrame to R search path
+attach-method
+Attach SparkDataFrame to R search path
+avg
+avg
+avg-method
+avg
+awaitTermination
+awaitTermination
+awaitTermination-method
+awaitTermination
+
+
+-- B --
+
+
+base64
+String functions for Column operations
+base64-method
+String functions for Column operations
+between
+between
+between-method
+between
+bin
+Math functions for Column operations
+bin-method
+Math functions for Column operations
+BisectingKMeansModel-class
+S4 class that represents a BisectingKMeansModel
+bitwiseNOT
+Non-aggregate functions for Column operations
+bitwiseNOT-method
+Non-aggregate functions for Column operations
+broadcast
+broadcast
+broadcast-method
+broadcast
+bround
+Math functions for Column operations
+bround-method
+Math functions for Column operations
+
+
+-- C --
+
+
+cache
+Cache
+cache-method
+Cache
+cacheTable
+Cache Table
+cacheTable.default
+Cache Table
+cancelJobGroup
+Cancel active jobs for the specified group
+cancelJobGroup.default
+Cancel active jobs for the specified group
+cast
+Casts the column to a different data type.
+cast-method
+Casts the column to a different data type.
+cbrt
+Math functions for Column operations
+cbrt-method
+Math functions for Column operations
+ceil
+Math functions for Column operations
+ceil-method
+Math functions for Column operations
+ceiling
+Math functions for Column operations
+ceiling-method
+Math functions for Column operations
+checkpoint
+checkpoint
+checkpoint-method
+checkpoint
+clearCache
+Clear Cache
+clearCache.default
+Clear Cache
+clearJobGroup
+Clear current job group ID and its description
+clearJobGroup.default
+Clear current job group ID and its description
+coalesce
+Coalesce
+coalesce-method
+Coalesce
+coalesce-method
+Non-aggregate functions for Column operations
+collect
+Collects all the elements of a SparkDataFrame and coerces them into an R 
data.frame.
+collect-method
+Collects all the elements of a SparkDataFrame and coerces them into an R 
data.frame.
+collect_list
+Aggregate functions for Column operations
+collect_list-method
+Aggregate functions for Column op

[45/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/cov.html
--
diff --git a/site/docs/2.3.2/api/R/cov.html b/site/docs/2.3.2/api/R/cov.html
new file mode 100644
index 000..ec96abb
--- /dev/null
+++ b/site/docs/2.3.2/api/R/cov.html
@@ -0,0 +1,137 @@
+http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd";>http://www.w3.org/1999/xhtml";>R: cov
+
+
+
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js";>
+hljs.initHighlightingOnLoad();
+
+
+cov 
{SparkR}R Documentation
+
+cov
+
+Description
+
+Compute the covariance between two expressions.
+
+
+
+Usage
+
+
+cov(x, ...)
+
+covar_samp(col1, col2)
+
+covar_pop(col1, col2)
+
+## S4 method for signature 'characterOrColumn'
+cov(x, col2)
+
+## S4 method for signature 'characterOrColumn,characterOrColumn'
+covar_samp(col1, col2)
+
+## S4 method for signature 'characterOrColumn,characterOrColumn'
+covar_pop(col1, col2)
+
+## S4 method for signature 'SparkDataFrame'
+cov(x, colName1, colName2)
+
+
+
+Arguments
+
+
+x
+
+a Column or a SparkDataFrame.
+
+...
+
+additional argument(s). If x is a Column, a Column
+should be provided. If x is a SparkDataFrame, two column names 
should
+be provided.
+
+col1
+
+the first Column.
+
+col2
+
+the second Column.
+
+colName1
+
+the name of the first column
+
+colName2
+
+the name of the second column
+
+
+
+
+Details
+
+cov: Compute the sample covariance between two expressions.
+
+covar_samp: Alias for cov.
+
+covar_pop: Computes the population covariance between two 
expressions.
+
+cov: When applied to SparkDataFrame, this calculates the 
sample covariance of two
+numerical columns of one SparkDataFrame.
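
For reference, the standard definitions behind the two variants (general statistics formulas, not quoted from the SparkR documentation), where n is the number of rows and x-bar, y-bar are the column means:

```latex
\operatorname{covar\_samp}(x, y) = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}),
\qquad
\operatorname{covar\_pop}(x, y) = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})
```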
+
+
+
+Value
+
+The covariance of the two columns.
+
+
+
+Note
+
+cov since 1.6.0
+
+covar_samp since 2.0.0
+
+covar_pop since 2.0.0
+
+cov since 1.6.0
+
+
+
+See Also
+
+Other aggregate functions: avg,
+column_aggregate_functions,
+corr, count,
+first, last
+
+Other stat functions: approxQuantile,
+corr, crosstab,
+freqItems, sampleBy
+
+
+
+Examples
+
+## Not run: 
+##D df <- createDataFrame(cbind(model = rownames(mtcars), mtcars))
+##D head(select(df, cov(df$mpg, df$hp), cov("mpg", "hp"),
+##D covar_samp(df$mpg, df$hp), covar_samp("mpg", 
"hp"),
+##D covar_pop(df$mpg, df$hp), covar_pop("mpg", 
"hp")))
+## End(Not run)
+
+## Not run: 
+##D cov(df, "mpg", "hp")
+##D cov(df, df$mpg, df$hp)
+## End(Not run)
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/createDataFrame.html
--
diff --git a/site/docs/2.3.2/api/R/createDataFrame.html 
b/site/docs/2.3.2/api/R/createDataFrame.html
new file mode 100644
index 000..0cf668e
--- /dev/null
+++ b/site/docs/2.3.2/api/R/createDataFrame.html
@@ -0,0 +1,90 @@
+http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd";>http://www.w3.org/1999/xhtml";>R: Create a 
SparkDataFrame
+
+
+
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/styles/github.min.css";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/highlight.min.js";>
+https://cdnjs.cloudflare.com/ajax/libs/highlight.js/8.3/languages/r.min.js";>
+hljs.initHighlightingOnLoad();
+
+
+createDataFrame {SparkR}R 
Documentation
+
+Create a SparkDataFrame
+
+Description
+
+Converts R data.frame or list into SparkDataFrame.
+
+
+
+Usage
+
+
+## Default S3 method:
+createDataFrame(data, schema = NULL,
+  samplingRatio = 1, numPartitions = NULL)
+
+## Default S3 method:
+as.DataFrame(data, schema = NULL, samplingRatio = 1,
+  numPartitions = NULL)
+
+as.DataFrame(data, ...)
+
+
+
+Arguments
+
+
+data
+
+a list or data.frame.
+
+schema
+
+a list of column names or named list (StructType), optional.
+
+samplingRatio
+
+Currently not used.
+
+numPartitions
+
+the number of partitions of the SparkDataFrame. Defaults to 1; this is
+limited by the length of the list or the number of rows of the data.frame
+
+...
+
+additional argument(s).
+
+
+
+
+Value
+
+A SparkDataFrame.
+
+
+
+Note
+
+createDataFrame since 1.4.0
+
+as.DataFrame since 1.6.0
+
+
+
+Examples
+
+## Not run: 
+##D sparkR.session()
+##D df1 <- as.DataFrame(iris)
+##D df2 <- as.DataFrame(list(3,4,5,6))
+##D df3 <- createDataFrame(iris)
+##D df4 <- createDataFrame(cars, numPartitions = 2)
+## End(Not run)
+
+
+
+[Package SparkR version 2.3.2 
Index]
+

http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/createExternalTable-deprecated.html
--
diff --git a/site/docs/2.3.2/api/R/createExternalTable-deprecated.html 
b/site/docs/2.3.2/api/R/createExternalTable-deprecated.html
new file mode 100644
ind

spark-website git commit: Empty commit to trigger asf to github sync

Repository: spark-website
Updated Branches:
  refs/heads/asf-site 04a27dbf1 -> 546f35143


Empty commit to trigger asf to github sync


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/546f3514
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/546f3514
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/546f3514

Branch: refs/heads/asf-site
Commit: 546f351430ed42aaee6c9c80a024af296d341997
Parents: 04a27db
Author: jerryshao 
Authored: Wed Sep 26 16:38:25 2018 +0800
Committer: jerryshao 
Committed: Wed Sep 26 16:38:25 2018 +0800

--

--






[1/3] spark-website git commit: Update some missing changes for 2.3.2 release

Repository: spark-website
Updated Branches:
  refs/heads/asf-site 546f35143 -> 74d902cdc


http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/releases/spark-release-1-3-0.html
--
diff --git a/site/releases/spark-release-1-3-0.html 
b/site/releases/spark-release-1-3-0.html
index c6e3340..ffedced 100644
--- a/site/releases/spark-release-1-3-0.html
+++ b/site/releases/spark-release-1-3-0.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/releases/spark-release-1-3-1.html
--
diff --git a/site/releases/spark-release-1-3-1.html 
b/site/releases/spark-release-1-3-1.html
index 2c45db7..1e001c0 100644
--- a/site/releases/spark-release-1-3-1.html
+++ b/site/releases/spark-release-1-3-1.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/releases/spark-release-1-4-0.html
--
diff --git a/site/releases/spark-release-1-4-0.html 
b/site/releases/spark-release-1-4-0.html
index 80dc23d..dfb4bcf 100644
--- a/site/releases/spark-release-1-4-0.html
+++ b/site/releases/spark-release-1-4-0.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/releases/spark-release-1-4-1.html
--
diff --git a/site/releases/spark-release-1-4-1.html 
b/site/releases/spark-release-1-4-1.html
index 2455361..1218be8 100644
--- a/site/releases/spark-release-1-4-1.html
+++ b/site/releases/spark-release-1-4-1.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/releases/spark-release-1-5-0.html
--
diff --git a/site/releases/spark-release-1-5-0.html 
b/site/releases/spark-release-1-5-0.html
index a8baf78..4adeaf2 100644
--- a/site/releases/spark-release-1-5-0.html
+++ b/site/releases/spark-release-1-5-0.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/releases/spark-release-1-5-1.html
--
diff --git a/site/releases/spark-release-1-5-1.html 
b/site/releases/spark-release-1-5-1.html
index dc73191..67abb35 100644
--- a/site/releases/spark-release-1-5-1.html
+++ b/site/releases/spark-release-1-5-1.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/r

[2/3] spark-website git commit: Update some missing changes for 2.3.2 release

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/news/spark-2-3-0-released.html
--
diff --git a/site/news/spark-2-3-0-released.html 
b/site/news/spark-2-3-0-released.html
index 022adba..e36b8b3 100644
--- a/site/news/spark-2-3-0-released.html
+++ b/site/news/spark-2-3-0-released.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/news/spark-2-3-1-released.html
--
diff --git a/site/news/spark-2-3-1-released.html 
b/site/news/spark-2-3-1-released.html
index 4103faa..e5ad8eb 100644
--- a/site/news/spark-2-3-1-released.html
+++ b/site/news/spark-2-3-1-released.html
@@ -162,6 +162,9 @@
   Latest News
   
 
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
   Spark+AI Summit (October 
2-4th, 2018, London) agenda posted
   (Jul 24, 2018)
 
@@ -171,9 +174,6 @@
   Spark 2.1.3 
released
   (Jun 29, 2018)
 
-  Spark 2.3.1 
released
-  (Jun 08, 2018)
-
   
   Archive
 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/74d902cd/site/news/spark-2-3-2-released.html
--
diff --git a/site/news/spark-2-3-2-released.html 
b/site/news/spark-2-3-2-released.html
new file mode 100644
index 000..e7d71b5
--- /dev/null
+++ b/site/news/spark-2-3-2-released.html
@@ -0,0 +1,232 @@
+
+
+
+  
+  
+  
+
+  
+ Spark 2.3.2 released | Apache Spark
+
+  
+
+  
+
+  
+
+  
+  
+  
+
+  
+  
+
+  
+  
+  var _gaq = _gaq || [];
+  _gaq.push(['_setAccount', 'UA-32518208-2']);
+  _gaq.push(['_trackPageview']);
+  (function() {
+var ga = document.createElement('script'); ga.type = 'text/javascript'; 
ga.async = true;
+ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 
'http://www') + '.google-analytics.com/ga.js';
+var s = document.getElementsByTagName('script')[0]; 
s.parentNode.insertBefore(ga, s);
+  })();
+
+  
+  function trackOutboundLink(link, category, action) {
+try {
+  _gaq.push(['_trackEvent', category , action]);
+} catch(err){}
+
+setTimeout(function() {
+  document.location.href = link.href;
+}, 100);
+  }
+  
+
+  
+  
+
+
+
+
+https://code.jquery.com/jquery.js";>
+https://netdna.bootstrapcdn.com/bootstrap/3.0.3/js/bootstrap.min.js";>
+
+
+
+
+
+
+  
+
+  
+  
+  Lightning-fast unified analytics engine
+  
+
+  
+
+
+
+  
+  
+
+  Toggle navigation
+  
+  
+  
+
+  
+
+  
+  
+
+  Download
+  
+
+  Libraries 
+
+
+  SQL and DataFrames
+  Spark Streaming
+  MLlib (machine learning)
+  GraphX (graph)
+  
+  Third-Party 
Projects
+
+  
+  
+
+  Documentation 
+
+
+  Latest Release (Spark 2.3.2)
+  Older Versions and Other 
Resources
+  Frequently Asked Questions
+
+  
+  Examples
+  
+
+  Community 
+
+
+  Mailing Lists & Resources
+  Contributing to Spark
+  Improvement Proposals 
(SPIP)
+  https://issues.apache.org/jira/browse/SPARK";>Issue 
Tracker
+  Powered By
+  Project Committers
+  Project History
+
+  
+  
+
+   Developers 
+
+
+  Useful Developer Tools
+  Versioning Policy
+  Release Process
+  Security
+
+  
+
+
+  
+https://www.apache.org/"; class="dropdown-toggle" 
data-toggle="dropdown">
+  Apache Software Foundation 
+
+  https://www.apache.org/";>Apache Homepage
+  https://www.apache.org/licenses/";>License
+  https://www.apache.org/foundation/sponsorship.html";>Sponsorship
+  https://www.apache.org/foundation/thanks.html";>Thanks
+  https://www.apache.org/security/";>Security
+
+  
+
+  
+  
+
+
+
+
+  
+
+  Latest News
+  
+
+  Spark 2.3.2 
released
+  (Sep 24, 2018)
+
+  Spark+AI Summit (October 
2-4th, 2018, London) agenda posted

[3/3] spark-website git commit: Update some missing changes for 2.3.2 release

Update some missing changes for 2.3.2 release

`downloads.md` and `_post` should be updated to use the new 2.3.2 release. Sorry 
about missing it.

Author: jerryshao 

Closes #150 from jerryshao/update-2.3.2.


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/74d902cd
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/74d902cd
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/74d902cd

Branch: refs/heads/asf-site
Commit: 74d902cdc9832d90bf5262481bf9867900ce7a2c
Parents: 546f351
Author: jerryshao 
Authored: Wed Sep 26 17:58:39 2018 +0800
Committer: jerryshao 
Committed: Wed Sep 26 17:58:39 2018 +0800

--
 downloads.md|   2 +-
 news/_posts/2018-09-24-spark-2-3-2-released.md  |  14 ++
 site/committers.html|   6 +-
 site/community.html |   6 +-
 site/contributing.html  |   6 +-
 site/developer-tools.html   |   6 +-
 site/documentation.html |   6 +-
 site/downloads.html |   8 +-
 site/examples.html  |   6 +-
 site/faq.html   |   6 +-
 site/graphx/index.html  |   6 +-
 site/history.html   |   6 +-
 site/improvement-proposals.html |   6 +-
 site/index.html |   6 +-
 site/mailing-lists.html |   6 +-
 site/mllib/index.html   |   6 +-
 site/news/amp-camp-2013-registration-ope.html   |   6 +-
 .../news/announcing-the-first-spark-summit.html |   6 +-
 .../news/fourth-spark-screencast-published.html |   6 +-
 site/news/index.html|  15 +-
 site/news/nsdi-paper.html   |   6 +-
 site/news/one-month-to-spark-summit-2015.html   |   6 +-
 .../proposals-open-for-spark-summit-east.html   |   6 +-
 ...registration-open-for-spark-summit-east.html |   6 +-
 .../news/run-spark-and-shark-on-amazon-emr.html |   6 +-
 site/news/spark-0-6-1-and-0-5-2-released.html   |   6 +-
 site/news/spark-0-6-2-released.html |   6 +-
 site/news/spark-0-7-0-released.html |   6 +-
 site/news/spark-0-7-2-released.html |   6 +-
 site/news/spark-0-7-3-released.html |   6 +-
 site/news/spark-0-8-0-released.html |   6 +-
 site/news/spark-0-8-1-released.html |   6 +-
 site/news/spark-0-9-0-released.html |   6 +-
 site/news/spark-0-9-1-released.html |   6 +-
 site/news/spark-0-9-2-released.html |   6 +-
 site/news/spark-1-0-0-released.html |   6 +-
 site/news/spark-1-0-1-released.html |   6 +-
 site/news/spark-1-0-2-released.html |   6 +-
 site/news/spark-1-1-0-released.html |   6 +-
 site/news/spark-1-1-1-released.html |   6 +-
 site/news/spark-1-2-0-released.html |   6 +-
 site/news/spark-1-2-1-released.html |   6 +-
 site/news/spark-1-2-2-released.html |   6 +-
 site/news/spark-1-3-0-released.html |   6 +-
 site/news/spark-1-4-0-released.html |   6 +-
 site/news/spark-1-4-1-released.html |   6 +-
 site/news/spark-1-5-0-released.html |   6 +-
 site/news/spark-1-5-1-released.html |   6 +-
 site/news/spark-1-5-2-released.html |   6 +-
 site/news/spark-1-6-0-released.html |   6 +-
 site/news/spark-1-6-1-released.html |   6 +-
 site/news/spark-1-6-2-released.html |   6 +-
 site/news/spark-1-6-3-released.html |   6 +-
 site/news/spark-2-0-0-released.html |   6 +-
 site/news/spark-2-0-1-released.html |   6 +-
 site/news/spark-2-0-2-released.html |   6 +-
 site/news/spark-2-1-0-released.html |   6 +-
 site/news/spark-2-1-1-released.html |   6 +-
 site/news/spark-2-1-2-released.html |   6 +-
 site/news/spark-2-1-3-released.html |   6 +-
 site/news/spark-2-2-0-released.html |   6 +-
 site/news/spark-2-2-1-released.html |   6 +-
 site/news/spark-2-2-2-released.html |   6 +-
 site/news/spark-2-3-0-released.html |   6 +-
 site/news/spark-2-3-1-released.html |   6 +-
 site/news/spark-2-3-2-released.html | 232 +++
 site/news/spark-2.0.0-preview.html  |   6 +-
 .../spark-accepted-into-apache-incubator.html   |   6 +-
 site/news/spark-and-shark-in-the-news.html  |   6 +-
 site/news/spark-becomes-tlp.html|   6 +-
 site/news/spark-featured-in-wired.html  |   6 +-
 .../spark-mailing-lists-moving-to-apache.html   |   6 +-
 site/news/spark-m

spark-website git commit: Update my affiliation

Repository: spark-website
Updated Branches:
  refs/heads/asf-site 74d902cdc -> 8b7444182


Update my affiliation


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/8b744418
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/8b744418
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/8b744418

Branch: refs/heads/asf-site
Commit: 8b7444182083e968e6dbfd1def2f5cb1635b2465
Parents: 74d902c
Author: jerryshao 
Authored: Thu Sep 27 19:42:30 2018 +0800
Committer: jerryshao 
Committed: Thu Sep 27 19:42:30 2018 +0800

--
 committers.md| 2 +-
 site/committers.html | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark-website/blob/8b744418/committers.md
--
diff --git a/committers.md b/committers.md
index b64e278..957ed4c 100644
--- a/committers.md
+++ b/committers.md
@@ -61,7 +61,7 @@ navigation:
 |Josh Rosen|Databricks|
 |Sandy Ryza|Remix|
 |Kousuke Saruta|NTT Data|
-|Saisai Shao|Hortonworks|
+|Saisai Shao|Tencent|
 |Prashant Sharma|IBM|
 |Ram Sriharsha|Databricks|
 |DB Tsai|Apple|

http://git-wip-us.apache.org/repos/asf/spark-website/blob/8b744418/site/committers.html
--
diff --git a/site/committers.html b/site/committers.html
index bc65924..7bc47a0 100644
--- a/site/committers.html
+++ b/site/committers.html
@@ -416,7 +416,7 @@
 
 
   Saisai Shao
-  Hortonworks
+  Tencent
 
 
   Prashant Sharma





svn commit: r27854 - /dev/spark/KEYS

Author: jshao
Date: Mon Jul  2 12:18:41 2018
New Revision: 27854

Log:
Update KEYS

Modified:
dev/spark/KEYS

Modified: dev/spark/KEYS
==
--- dev/spark/KEYS (original)
+++ dev/spark/KEYS Mon Jul  2 12:18:41 2018
@@ -646,3 +646,60 @@ UiZuJIMHLPJK4sbOj5nMZE2163zUXz+gOVMeLqVx
 qG1EJoF+tteqemi1ZwYipD06wA==
 =cA5d
 -END PGP PUBLIC KEY BLOCK-
+pub   4096R/12973FD0 2017-08-07
+uid  Saisai Shao (CODE SIGNING KEY) 
+sig 312973FD0 2017-08-07  Saisai Shao (CODE SIGNING KEY) 

+sub   4096R/A0A12D58 2017-08-07
+sig  12973FD0 2017-08-07  Saisai Shao (CODE SIGNING KEY) 

+
+-BEGIN PGP PUBLIC KEY BLOCK-
+
+mQINBFmH6vwBEADMnQZREQFXHjAcJxKh9o2rcgDJRQAZkE5kPVTFtkYskaNSDGP2
+A1TdKvOHRG79/9hLz3Ijvh6whwQ5a5JUiiCT06xeKLmJOQDOwYhEnUnTjxkNzAj0
+47Btno7vD1GKY7Yd67QortEXQxiWMoxDDYEDiHepuAZ+YgPOK0j0FBHyTHtikWLW
+dDEJGWl5KGM3y1xxASGEUCaYpM/cpN+fP3rrvKoQ6H4/HlgSdGwGivnIhZSaZ9rM
+/Uj5aWnBxhTOm4mkZDL26mtXH8UQsRA72kz1SC0wTQLl097VNmh626Hmhk7eUz/d
+bW3fe2fL2ouXbyToOu8UV9+KPXJmZg6bLxejRo9CKLAtOMd+cxymhJYfzSUNh6vJ
+VfStLYqySy8Oq6swe8M6W/2Asw6nsUdbItyReDvnGyEmxIuoiRoqtIo5hY5GT6WI
+wwLn7wPIkozIyEd9GDbYGzDclff3i3lOXEfZ6lOwtYNSOZlPYW2/ZyxAV+sa/itd
+fZJ8x1TH3jnNg/yELAVTaLKZ+XbDut3NDSjh8zzgwr8EYKr8UaZhzULfwDoGL6AJ
+88aJqEUHKD0z/wgvow1Ts6mW5l2ego3235oeNMd/hJXbbGFn7EFNRxqF0xFcRwP0
+kfUC6twcENJ+jYD/bUQ0lt4PLHX6ziShAJynTdu1OabHSEsFa19XUpHWZwARAQAB
+tDFTYWlzYWkgU2hhbyAoQ09ERSBTSUdOSU5HIEtFWSkgPGpzaGFvQGFwYWNoZS5v
+cmc+iQI3BBMBCgAhBQJZh+r8AhsDBQsJCAcDBRUKCQgLBRYCAwEAAh4BAheAAAoJ
+ENsLIaASlz/Q9uYP+wQ9jn5OqMHSv38HSWxBfuLbI1Jb90Ayx9Lg5KqEazBy5+Xc
+2o5hKV7t5LrRY0ZW5P/mLibJC8tdakZCjMoSFbefVFUQzMw5ACHWXL1wYNcrlIcc
+EA2muMRS8v6o1TYfLTMgP6QanAGseY7DFrk3lc38IA5KBJ5g7Q80I+pZswQOdiM6
+vsWskkRkdaoS5Ku1i8DENJgx9bJUrreGtrzRdVX2JfqRuufozTH5OHOFc2WHwosR
++scfmaAmScRxt1Xi6Vrm9NhoynWZro4XfhNzfk81CWMy2DT90CMTuK/DMCE3TH9n
+Y0HG6gPL1gDJiGhhnhWK1k2RpCts0fOt81zNikdty5nhHnjCojaDdK79kYgPsWwT
+0mBAMArKbHD650dpwS4wV/kqc0CFolL8Haj5f2caNuqAH5HH3yp2Qgdd5bHbUS/0
+CdTskLdCDPcJpsGGXjnLdBA6XOppjvaZtuaMS5SXJZ/mt6YwKzgasI6SrVkI4oEd
+/o+YHm8HP0Xm/S5mURSflJGDRyFrkuHoKhID4LjLKOeTlP4NRLCy8TAYvVJ91BNL
+u9fyvez9fNzdlWLRoBnt3grQRmf9dWxm6f1ZQVpVTYMls+ACwiHFzneNfli2LjGK
+clsG2VyGRVaPykhvSCFuhQpaSpaSKiZK4Ux9AjsJnOVT9heUBJBDSuTOn5GRuQIN
+BFmH6vwBEAC7jbxZY0TqNFNzxpx3BucUXOOe2Mhi52/6uPvXetFguRj12NkC5/vx
+s2EpjyZR37YG9WXoHd9UZ0PI1FbeH0xBPF6yy7zNALxiSOpEciFumQMetzlWFHzC
+yCUuo1vt9/Mmy0meSxnwSPq5juSBYGll2r7FZ8dpmJNUWE8njJttYDhse/b9WfGF
+BOFckR1k3xnMNjTm8JXyVQC48HvZVUZLWWQnIp4lUD0UwXqjYT9LljNG3kRbP4H2
+SbW7ijL0ioeDUeGM2pavaW5C8nEONgEwcS4yi1uCSVm8dzRe4ZY/jw4NehkV1SSB
+l51NB74WoGJ0tcjHs2+2Gvzr2DKxEqJNePdx1jQncnSnTrvlKAAnR4HbS61d7K/T
+eI3DdQ5YSmM7Dyi/Bg/FJ3pWLpNMGKB1vX7yK2gguj4MoMvm9mYZf9vR6tIO1ZBt
+YUKZjn9s0Q6aLXTk173u1gqfpzYigc5F5c+eGhRgZlmve0N1zT3I68buIzF/VO1g
+AvOCwxC7qSqu07M7AUvwJwTzBplZNNj7QvoKqqSY0pT7pNwTMH1/jWHOi1jb/Oj2
+Y6DxVr6mDjsZd4/2nPZ4Nv3D+KP8q41tX3tThQ+prDUVSFXWWDxgU27EJLS9FnkU
+2UY+FuxW34svUEpGHrISUbxDIzP2HL4hm1mSIk3ua4RqGQYcFR3ElwARAQABiQIf
+BBgBCgAJBQJZh+r8AhsMAAoJENsLIaASlz/QmXMP/jLtuJhbFgIpztMa+Hv6Isxv
+QQ2/6JdKLUCFLHMHAPoh1R97WnEr2l3DJCkwGNW2al3S75Zgj1KC0H/Nm5gxrLYp
+1ne3aY7U90RwcXu5oqABdGQvLZmmo7O3BNZDbTf22XvSndS0sjE/+lJXzAUfpx2E
+qu/0URnHxawOI8MPitoeszGrN1zJ9uTFhVvhLUMJ+JxlBmPMr1rUhuUddaQdldhk
+JopwxcTQN4S+tCbFVz/PNcJUTL7VRABh5wHY5mOmvjxnG/2MNN6zmSfX5pCl6pgf
+SliQ40wV2Sf4L+r/M/DvzHpycuzzLkLmcO9y5+/AtM9JrPRxNg32ycX8GuXnhUTZ
+FlxfLw2nGCMRb5a9u15VfGvh037WYt7CmDUGh64f41m/wRangrPdicz3sy9TEKjs
+OMA1OAVFovv5Nd3QS4IrM2lpVJSaBRvcL67NqJ+fE9CAsBUHxAjU/5wBAuT8xxiF
+5r27fHM4Fq/Hy3aT16PJblVMbmf2V4z3Q7Je2xP0IP1V249a4nyXKPH+6FksUiG3
+Yv4vngYdI7Kw5/5kogYIVT1Es4cRW/9NxiK4360uIsecILpaDPtf7+EQH7DU5f2l
+iI08G08QlT8LxLCvMSn81txtKAdPssT+pnMpTQo2vvrssG5ArImZc5InZDy+EutS
+kyHyHY5kPG9HfDOSahPz
+=SDAz
+-END PGP PUBLIC KEY BLOCK-






[spark] Git Push Summary

Repository: spark
Updated Tags:  refs/tags/v2.3.2-rc1 [created] 4df06b451




[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

Preparing development version 2.3.3-SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/72eb97ce
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/72eb97ce
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/72eb97ce

Branch: refs/heads/branch-2.3
Commit: 72eb97ce96c27e296aa6ed8198246000a78fb10b
Parents: 4df06b4
Author: Saisai Shao 
Authored: Sun Jul 8 01:24:55 2018 +
Committer: Saisai Shao 
Committed: Sun Jul 8 01:24:55 2018 +

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/72eb97ce/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 8df2635..6ec4966 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.2
+Version: 2.3.3
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/72eb97ce/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 57485fc..f8b15cc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/72eb97ce/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 53e58c2..e412a47 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/72eb97ce/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d05647c..d8f9a3d 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/72eb97ce/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index 8d46761..a1a4f87 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2

[1/2] spark git commit: Preparing Spark release v2.3.2-rc1

Repository: spark
Updated Branches:
  refs/heads/branch-2.3 64c72b4de -> 72eb97ce9


Preparing Spark release v2.3.2-rc1


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4df06b45
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/4df06b45
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/4df06b45

Branch: refs/heads/branch-2.3
Commit: 4df06b45160241dbb331153efbb25703f913c192
Parents: 64c72b4
Author: Saisai Shao 
Authored: Sun Jul 8 01:24:42 2018 +
Committer: Saisai Shao 
Committed: Sun Jul 8 01:24:42 2018 +

--
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 2 +-
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 40 files changed, 40 insertions(+), 40 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/4df06b45/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 02bf39b..57485fc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2-SNAPSHOT
+2.3.2
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/4df06b45/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 646fdfb..53e58c2 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/4df06b45/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index 76c7dcf..d05647c 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/4df06b45/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index f2661fe..8d46761 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/4df06b45/common/network-yarn/pom.xml
--
diff --git a/common/network-yarn/pom.xml b/common/network-yarn/pom.xml
index 229d466..11c290d 100644
--- a/common/network-yarn/pom.xml
+++ b/common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

h

svn commit: r27981 - /dev/spark/v2.3.2-rc1-bin/

Author: jshao
Date: Sun Jul  8 04:41:12 2018
New Revision: 27981

Log:
Apache Spark 2.3.2

Added:
dev/spark/v2.3.2-rc1-bin/
dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz   (with props)
dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.sha512
dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz   (with props)
dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.sha512
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-hadoop2.6.tgz   (with props)
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-hadoop2.6.tgz.asc
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-hadoop2.6.tgz.sha512
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-hadoop2.7.tgz   (with props)
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-hadoop2.7.tgz.asc
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-hadoop2.7.tgz.sha512
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-without-hadoop.tgz   (with props)
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-without-hadoop.tgz.asc
dev/spark/v2.3.2-rc1-bin/spark-2.3.2-bin-without-hadoop.tgz.sha512
dev/spark/v2.3.2-rc1-bin/spark-2.3.2.tgz   (with props)
dev/spark/v2.3.2-rc1-bin/spark-2.3.2.tgz.asc
dev/spark/v2.3.2-rc1-bin/spark-2.3.2.tgz.sha512

Added: dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.asc Sun Jul  8 04:41:12 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIcBAABCgAGBQJbQYuSAAoJENsLIaASlz/QmO4QALqSjYMcRMPssb5VNctoIxS6
+E57s9FRJjOHmxAzTp/5re44crpeWiZVNtMesT/BCg5clS53R/YAjAJOxivx8sV6s
+l2cwIt/xZKsQF3PYQnv1WVaGN9So5dlSYZGPyAm9bOp1es5MEAvAn7FBms2ttGsv
+M6KYiWPF2DLzo/OVGb7YMH6e9TkjUt/aFcE5ll3UQ4vRNcvxOEO/zpc1FbOEh3sF
+cLNRCHV3LWILSatpHznsEg3IkfE35rj09qYt7Ia2ECpCvzK6vbJMRy27duz64yUI
+wGD1Yt6CBl1F0BkV/Gh0Vm0x1OnXLNSJzuNh9P3xwD7Dmi251lBasKQr8bNb3Mqs
+jsFVGnn8QEKegDkHdfuMwZBaZCeCqA/QhxGSCp20r8p7JwecCFJJEhakR3kgoQ9T
+HZ+vIyQpziW78WabHpwEhGGaygWENGhI3TVQpXTGm0JOiJv9RAmvHMZ7tn7yyXI9
+CnFR73xrk4OGjODSdVc+4nS+lB088+j0An1V0c6/3GP/D80MQzj7TSMADYlq/PwO
+9mJw0fllpJPWErlDey9WSYhyaLqo3YzrtoB7r+C9ENm6on70muJ4TBMF7Vz77BXi
+4IKsB8RE4TFETYhzPW+gkwKu0skM2QfWq4dpLmTBKGefpnEW0hZ93/GzTzeaZ+N0
+ifR8izt4HDcMRD8YSG6t
+=iiTK
+-END PGP SIGNATURE-

Added: dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.sha512 Sun Jul  8 04:41:12 2018
@@ -0,0 +1,3 @@
+SparkR_2.3.2.tar.gz: 87320EB2 61E1637A 4099ABE8 FAB19E69 729F9D84 4696838C
+ 6F3CFB5B 75607A71 3E45080E 2ABCBFF7 5552D248 0D5078B2
+ 366E2F44 B291B98C 26820CD2 E94E0D3B

Added: dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.asc Sun Jul  8 04:41:12 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIcBAABCgAGBQJbQY82AAoJENsLIaASlz/Qj0YQAMOpAIzO/TkDEguG1sPW9aRs
+XzdjA0jx/FWYRUFuMwpmGd+AXusw0MScc1tBDOesgGjzgMr20O7a5O+8DEXDTjrr
+qgwBY+jn3l/nQRM3UxoMKWLYgrjTCIeXQwER4hKspiElx3BIRs0NdBmUtafauc9l
+E0I0Ysy5s5Easpi+vCqT7iEbG+1uHaQVtvJAArbJLv2CgCfRyLw+WjV01tmeG9+h
+0Wh/im+nKQphqXd3yhIXNNq43wTFQpVrALcO5F5BhAmXoVr8S+fYbtk09NKiLubZ
+/0lMJmN/HIbABxcGWpmqzWmmgI5vr8NxKq1sZ8MyJ1WPUjtJ2VQEYY68pY03sBFi
+BxwFr09oboyRIezHQAcc9ptWvupbxcmZWtRUaBVriGoacFavGbpHCKXAlVsPyl0a
+ikUGijvaWWPDu8R29HYyqFpQxguSJWD2kdjVzEyqke/m/5X45lmdwnu9wHj9NUa6
+jHDiAi+D3FptBW8SMVAfeqit423Vv1JDOzI+6IrWEtJIYFvYGn+lliRtzaG0GWUJ
+IgCK08huzG22pWVvx0m/2NcHMiddPbbNpx/nQ5KvKflVsNyScQD1yggEcW9pFZqC
+J0gg5eGidTpb98DSuvaIK+Ucv6fMQF+3CbS6YVgOh6wrB2FXSXlO5ZOeS1WoHvF4
+5Anwy5d6AD+WuWOG4XEG
+=Ez4U
+-END PGP SIGNATURE-

Added: dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc1-bin/pyspark-2.3.2.tar.gz.sha512 Sun Jul  8 04:41:12 
2018
@@ -0,0 +1,3

svn commit: r27983 - in /dev/spark/v2.3.2-rc1-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

Author: jshao
Date: Sun Jul  8 07:45:04 2018
New Revision: 27983

Log:
Apache Spark v2.3.2-rc1 docs


[This commit notification would consist of 1446 parts, 
which exceeds the limit of 50, so it was shortened to the summary.]




spark git commit: [SPARK-24646][CORE] Minor change to spark.yarn.dist.forceDownloadSchemes to support wildcard '*'

Repository: spark
Updated Branches:
  refs/heads/master 79c668942 -> e2c7e09f7


[SPARK-24646][CORE] Minor change to spark.yarn.dist.forceDownloadSchemes to 
support wildcard '*'

## What changes were proposed in this pull request?

When obtaining tokens via a customized `ServiceCredentialProvider`, the 
`ServiceCredentialProvider` must be available on the classpath of the local 
spark-submit process. In that case, all of the configured remote resources 
should be forced to download to the local disk.

To make this configuration easier to use, this change adds wildcard '*' support 
to `spark.yarn.dist.forceDownloadSchemes` and clarifies the usage of this 
configuration.
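
For illustration, a hedged sketch of how the new wildcard value could be set from code (the same value can also be passed on the spark-submit command line with --conf); the commented-out explicit list shows the pre-existing form:

```scala
import org.apache.spark.SparkConf

// Sketch only: force resources of every scheme to be downloaded to the local
// disk before they are added to YARN's distributed cache, so that jars such as
// a custom ServiceCredentialProvider end up on the local client classpath.
val conf = new SparkConf()
  .set("spark.yarn.dist.forceDownloadSchemes", "*")
  // Explicit scheme list, as required before this change:
  // .set("spark.yarn.dist.forceDownloadSchemes", "http,https,ftp")
```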

## How was this patch tested?

New UT added.

Author: jerryshao 

Closes #21633 from jerryshao/SPARK-21917-followup.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e2c7e09f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e2c7e09f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/e2c7e09f

Branch: refs/heads/master
Commit: e2c7e09f742a7e522efd74fe8e14c2620afdb522
Parents: 79c6689
Author: jerryshao 
Authored: Mon Jul 9 10:21:40 2018 +0800
Committer: jerryshao 
Committed: Mon Jul 9 10:21:40 2018 +0800

--
 .../org/apache/spark/deploy/SparkSubmit.scala   |  5 ++--
 .../apache/spark/internal/config/package.scala  |  5 ++--
 .../apache/spark/deploy/SparkSubmitSuite.scala  | 29 +---
 docs/running-on-yarn.md |  5 ++--
 4 files changed, 28 insertions(+), 16 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/e2c7e09f/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
--
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala 
b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
index 2da778a..e7310ee 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
@@ -385,7 +385,7 @@ private[spark] class SparkSubmit extends Logging {
   val forceDownloadSchemes = sparkConf.get(FORCE_DOWNLOAD_SCHEMES)
 
   def shouldDownload(scheme: String): Boolean = {
-forceDownloadSchemes.contains(scheme) ||
+forceDownloadSchemes.contains("*") || 
forceDownloadSchemes.contains(scheme) ||
   Try { FileSystem.getFileSystemClass(scheme, hadoopConf) }.isFailure
   }
 
@@ -578,7 +578,8 @@ private[spark] class SparkSubmit extends Logging {
 }
 // Add the main application jar and any added jars to classpath in case 
YARN client
 // requires these jars.
-// This assumes both primaryResource and user jars are local jars, otherwise it will not be
+// This assumes both primaryResource and user jars are local jars, or already downloaded
+// to local by configuring "spark.yarn.dist.forceDownloadSchemes", otherwise it will not be
 // added to the classpath of YARN client.
 if (isYarnCluster) {
   if (isUserJar(args.primaryResource)) {

http://git-wip-us.apache.org/repos/asf/spark/blob/e2c7e09f/core/src/main/scala/org/apache/spark/internal/config/package.scala
--
diff --git a/core/src/main/scala/org/apache/spark/internal/config/package.scala 
b/core/src/main/scala/org/apache/spark/internal/config/package.scala
index bda9795..ba892bf 100644
--- a/core/src/main/scala/org/apache/spark/internal/config/package.scala
+++ b/core/src/main/scala/org/apache/spark/internal/config/package.scala
@@ -486,10 +486,11 @@ package object config {
 
   private[spark] val FORCE_DOWNLOAD_SCHEMES =
 ConfigBuilder("spark.yarn.dist.forceDownloadSchemes")
-  .doc("Comma-separated list of schemes for which files will be downloaded 
to the " +
+  .doc("Comma-separated list of schemes for which resources will be 
downloaded to the " +
 "local disk prior to being added to YARN's distributed cache. For use 
in cases " +
 "where the YARN service does not support schemes that are supported by 
Spark, like http, " +
-"https and ftp.")
+"https and ftp, or jars required to be in the local YARN client's 
classpath. Wildcard " +
+"'*' is denoted to download resources for all the schemes.")
   .stringConf
   .toSequence
   .createWithDefault(Nil)

http://git-wip-us.apache.org/repos/asf/spark/blob/e2c7e09f/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
--
diff --git a/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala 
b/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
index 545c8d0..f829fec 100644
--- a/core/src/test/scala/org/apache/spark/

spark git commit: [SPARK-24678][SPARK-STREAMING] Give priority in use of 'PROCESS_LOCAL' for spark-streaming

Repository: spark
Updated Branches:
  refs/heads/master a28900956 -> 6fe32869c


[SPARK-24678][SPARK-STREAMING] Give priority in use of 'PROCESS_LOCAL' for 
spark-streaming

## What changes were proposed in this pull request?

Currently, `BlockRDD.getPreferredLocations` only returns the host information 
of blocks, so the subsequent scheduling locality level is never better than 
'NODE_LOCAL'. With a small change that also records the executor holding each 
block, the locality level can be improved to 'PROCESS_LOCAL'.
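
As an illustrative sketch (not the patch itself), the idea is to expose the 
executor identity, not just the host, in the preferred locations; the 
`BlockLoc` case class and the location string format are assumptions modeled 
on `ExecutorCacheTaskLocation`:

```scala
// Sketch only: a self-contained stand-in for a block's location info.
case class BlockLoc(host: String, executorId: String)

// Host-only preferred locations: the scheduler can achieve NODE_LOCAL at best.
def hostsOnly(locs: Seq[BlockLoc]): Seq[String] =
  locs.map(_.host)

// Executor-aware preferred locations: the scheduler can recognize the exact
// executor caching the block and schedule the task PROCESS_LOCAL.
def executorAware(locs: Seq[BlockLoc]): Seq[String] =
  locs.map(l => s"executor_${l.host}_${l.executorId}")

// hostsOnly(Seq(BlockLoc("host1", "3")))      == Seq("host1")
// executorAware(Seq(BlockLoc("host1", "3")))  == Seq("executor_host1_3")
```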

## How was this patch tested?

manual test

Author: sharkdtu 

Closes #21658 from sharkdtu/master.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6fe32869
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/6fe32869
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/6fe32869

Branch: refs/heads/master
Commit: 6fe32869ccb17933e77a4dbe883e36d382fbeeec
Parents: a289009
Author: sharkdtu 
Authored: Tue Jul 10 20:18:34 2018 +0800
Committer: jerryshao 
Committed: Tue Jul 10 20:18:34 2018 +0800

--
 .../src/main/scala/org/apache/spark/rdd/BlockRDD.scala |  2 +-
 .../scala/org/apache/spark/storage/BlockManager.scala  |  7 +--
 .../org/apache/spark/storage/BlockManagerSuite.scala   | 13 +
 3 files changed, 19 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/6fe32869/core/src/main/scala/org/apache/spark/rdd/BlockRDD.scala
--
diff --git a/core/src/main/scala/org/apache/spark/rdd/BlockRDD.scala 
b/core/src/main/scala/org/apache/spark/rdd/BlockRDD.scala
index 4e036c2..23cf19d 100644
--- a/core/src/main/scala/org/apache/spark/rdd/BlockRDD.scala
+++ b/core/src/main/scala/org/apache/spark/rdd/BlockRDD.scala
@@ -30,7 +30,7 @@ private[spark]
 class BlockRDD[T: ClassTag](sc: SparkContext, @transient val blockIds: Array[BlockId])
   extends RDD[T](sc, Nil) {
 
-  @transient lazy val _locations = BlockManager.blockIdsToHosts(blockIds, SparkEnv.get)
+  @transient lazy val _locations = BlockManager.blockIdsToLocations(blockIds, SparkEnv.get)
   @volatile private var _isValid = true
 
   override def getPartitions: Array[Partition] = {

http://git-wip-us.apache.org/repos/asf/spark/blob/6fe32869/core/src/main/scala/org/apache/spark/storage/BlockManager.scala
--
diff --git a/core/src/main/scala/org/apache/spark/storage/BlockManager.scala 
b/core/src/main/scala/org/apache/spark/storage/BlockManager.scala
index df1a4be..0e1c7d5 100644
--- a/core/src/main/scala/org/apache/spark/storage/BlockManager.scala
+++ b/core/src/main/scala/org/apache/spark/storage/BlockManager.scala
@@ -45,6 +45,7 @@ import org.apache.spark.network.netty.SparkTransportConf
 import org.apache.spark.network.shuffle.{ExternalShuffleClient, TempFileManager}
 import org.apache.spark.network.shuffle.protocol.ExecutorShuffleInfo
 import org.apache.spark.rpc.RpcEnv
+import org.apache.spark.scheduler.ExecutorCacheTaskLocation
 import org.apache.spark.serializer.{SerializerInstance, SerializerManager}
 import org.apache.spark.shuffle.ShuffleManager
 import org.apache.spark.storage.memory._
@@ -1554,7 +1555,7 @@ private[spark] class BlockManager(
 private[spark] object BlockManager {
   private val ID_GENERATOR = new IdGenerator
 
-  def blockIdsToHosts(
+  def blockIdsToLocations(
   blockIds: Array[BlockId],
   env: SparkEnv,
    blockManagerMaster: BlockManagerMaster = null): Map[BlockId, Seq[String]] = {
@@ -1569,7 +1570,9 @@ private[spark] object BlockManager {
 
 val blockManagers = new HashMap[BlockId, Seq[String]]
 for (i <- 0 until blockIds.length) {
-  blockManagers(blockIds(i)) = blockLocations(i).map(_.host)
+  blockManagers(blockIds(i)) = blockLocations(i).map { loc =>
+ExecutorCacheTaskLocation(loc.host, loc.executorId).toString
+  }
 }
 blockManagers.toMap
   }

http://git-wip-us.apache.org/repos/asf/spark/blob/6fe32869/core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala
--
diff --git 
a/core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala 
b/core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala
index b19d8eb..08172f0 100644
--- a/core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala
+++ b/core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala
@@ -1422,6 +1422,19 @@ class BlockManagerSuite extends SparkFunSuite with Matchers with BeforeAndAfterE
 assert(mockBlockTransferService.tempFileManager === store.remoteBlockTempFileManager)
   }
 
+  test("query locations of blockIds") {
+val mockBlockManagerMaster = mock(classOf[BlockManagerMaster])
+val blockLocations = Seq(BlockManagerId("1", "hos

[1/2] spark git commit: Preparing Spark release v2.3.2-rc2

Repository: spark
Updated Branches:
  refs/heads/branch-2.3 19542f5de -> 86457a16d


Preparing Spark release v2.3.2-rc2


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/307499e1
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/307499e1
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/307499e1

Branch: refs/heads/branch-2.3
Commit: 307499e1a99c6ad3ce0b978626894ea2c1e3807e
Parents: 19542f5
Author: Saisai Shao 
Authored: Wed Jul 11 05:27:02 2018 +
Committer: Saisai Shao 
Committed: Wed Jul 11 05:27:02 2018 +

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/307499e1/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 6ec4966..8df2635 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.3
+Version: 2.3.2
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/307499e1/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index f8b15cc..57485fc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/307499e1/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index e412a47..53e58c2 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/307499e1/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d8f9a3d..d05647c 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/307499e1/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index a1a4f87..8d46761 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml

[spark] Git Push Summary

Repository: spark
Updated Tags:  refs/tags/v2.3.2-rc2 [created] 307499e1a

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

Preparing development version 2.3.3-SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/86457a16
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/86457a16
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/86457a16

Branch: refs/heads/branch-2.3
Commit: 86457a16de20eb126d0569d12f51b2e427fd03c3
Parents: 307499e
Author: Saisai Shao 
Authored: Wed Jul 11 05:27:12 2018 +
Committer: Saisai Shao 
Committed: Wed Jul 11 05:27:12 2018 +

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/86457a16/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 8df2635..6ec4966 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.2
+Version: 2.3.3
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/86457a16/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 57485fc..f8b15cc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/86457a16/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 53e58c2..e412a47 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/86457a16/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d05647c..d8f9a3d 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/86457a16/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index 8d46761..a1a4f87 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3

[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

Preparing development version 2.3.3-SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f9a2b0a8
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f9a2b0a8
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f9a2b0a8

Branch: refs/heads/branch-2.3
Commit: f9a2b0a878f05131d76959236243e7f5caffeb96
Parents: b3726da
Author: Saisai Shao 
Authored: Sun Jul 15 01:56:15 2018 +
Committer: Saisai Shao 
Committed: Sun Jul 15 01:56:15 2018 +

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/f9a2b0a8/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 8df2635..6ec4966 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.2
+Version: 2.3.3
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/f9a2b0a8/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 57485fc..f8b15cc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/f9a2b0a8/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index 53e58c2..e412a47 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/f9a2b0a8/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d05647c..d8f9a3d 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.2
+2.3.3-SNAPSHOT
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/f9a2b0a8/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index 8d46761..a1a4f87 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3

[spark] Git Push Summary

Repository: spark
Updated Tags:  refs/tags/v2.3.2-rc3 [created] b3726dadc

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[1/2] spark git commit: Preparing Spark release v2.3.2-rc3

Repository: spark
Updated Branches:
  refs/heads/branch-2.3 9cf375f5b -> f9a2b0a87


Preparing Spark release v2.3.2-rc3


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b3726dad
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/b3726dad
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/b3726dad

Branch: refs/heads/branch-2.3
Commit: b3726dadcf2997f20231873ec6e057dba433ae64
Parents: 9cf375f
Author: Saisai Shao 
Authored: Sun Jul 15 01:56:00 2018 +
Committer: Saisai Shao 
Committed: Sun Jul 15 01:56:00 2018 +

--
 R/pkg/DESCRIPTION | 2 +-
 assembly/pom.xml  | 2 +-
 common/kvstore/pom.xml| 2 +-
 common/network-common/pom.xml | 2 +-
 common/network-shuffle/pom.xml| 2 +-
 common/network-yarn/pom.xml   | 2 +-
 common/sketch/pom.xml | 2 +-
 common/tags/pom.xml   | 2 +-
 common/unsafe/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 docs/_config.yml  | 4 ++--
 examples/pom.xml  | 2 +-
 external/docker-integration-tests/pom.xml | 2 +-
 external/flume-assembly/pom.xml   | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-0-10-assembly/pom.xml  | 2 +-
 external/kafka-0-10-sql/pom.xml   | 2 +-
 external/kafka-0-10/pom.xml   | 2 +-
 external/kafka-0-8-assembly/pom.xml   | 2 +-
 external/kafka-0-8/pom.xml| 2 +-
 external/kinesis-asl-assembly/pom.xml | 2 +-
 external/kinesis-asl/pom.xml  | 2 +-
 external/spark-ganglia-lgpl/pom.xml   | 2 +-
 graphx/pom.xml| 2 +-
 hadoop-cloud/pom.xml  | 2 +-
 launcher/pom.xml  | 2 +-
 mllib-local/pom.xml   | 2 +-
 mllib/pom.xml | 2 +-
 pom.xml   | 2 +-
 python/pyspark/version.py | 2 +-
 repl/pom.xml  | 2 +-
 resource-managers/kubernetes/core/pom.xml | 2 +-
 resource-managers/mesos/pom.xml   | 2 +-
 resource-managers/yarn/pom.xml| 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 41 files changed, 42 insertions(+), 42 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/b3726dad/R/pkg/DESCRIPTION
--
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 6ec4966..8df2635 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 2.3.3
+Version: 2.3.2
 Title: R Frontend for Apache Spark
 Description: Provides an R Frontend for Apache Spark.
 Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "cre"),

http://git-wip-us.apache.org/repos/asf/spark/blob/b3726dad/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index f8b15cc..57485fc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/b3726dad/common/kvstore/pom.xml
--
diff --git a/common/kvstore/pom.xml b/common/kvstore/pom.xml
index e412a47..53e58c2 100644
--- a/common/kvstore/pom.xml
+++ b/common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/b3726dad/common/network-common/pom.xml
--
diff --git a/common/network-common/pom.xml b/common/network-common/pom.xml
index d8f9a3d..d05647c 100644
--- a/common/network-common/pom.xml
+++ b/common/network-common/pom.xml
@@ -22,7 +22,7 @@
   
 org.apache.spark
 spark-parent_2.11
-2.3.3-SNAPSHOT
+2.3.2
 ../../pom.xml
   
 

http://git-wip-us.apache.org/repos/asf/spark/blob/b3726dad/common/network-shuffle/pom.xml
--
diff --git a/common/network-shuffle/pom.xml b/common/network-shuffle/pom.xml
index a1a4f87..8d46761 100644
--- a/common/network-shuffle/pom.xml
+++ b/common/network-shuffle/pom.xml

svn commit: r28118 - /dev/spark/v2.3.2-rc3-bin/

Author: jshao
Date: Sun Jul 15 03:04:30 2018
New Revision: 28118

Log:
Apache Spark v2.3.2-rc3

Added:
dev/spark/v2.3.2-rc3-bin/
dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz   (with props)
dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.sha512
dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz   (with props)
dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.sha512
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-hadoop2.6.tgz   (with props)
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-hadoop2.6.tgz.asc
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-hadoop2.6.tgz.sha512
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-hadoop2.7.tgz   (with props)
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-hadoop2.7.tgz.asc
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-hadoop2.7.tgz.sha512
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-without-hadoop.tgz   (with props)
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-without-hadoop.tgz.asc
dev/spark/v2.3.2-rc3-bin/spark-2.3.2-bin-without-hadoop.tgz.sha512
dev/spark/v2.3.2-rc3-bin/spark-2.3.2.tgz   (with props)
dev/spark/v2.3.2-rc3-bin/spark-2.3.2.tgz.asc
dev/spark/v2.3.2-rc3-bin/spark-2.3.2.tgz.sha512

Added: dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.asc Sun Jul 15 03:04:30 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIcBAABCgAGBQJbSrCkAAoJENsLIaASlz/QxHIP/2+V6MiBci4mepIYNEA5M4A8
+n5hRYXbPDkK6i/tPlCtvdeW8XkcSJejzk+lPJdjQCxfqOsWCiGal42siV7XY/x96
++08b2XXzLOZ65RHSMJdE3M+qXgEs+kthCSg2Q/mZcssu83BqwNah0JTUIKi3oSmz
+11EnY1Pie4VCUn/ASdUPvmWeDTYpuziZnekMjI9B6WFx/gXHBOz8+6gJTpq6Eyq1
+VKrYCMtMtN6mYXh0yYqtYIXTQgA4/DJsmt2BVDrKOWweHkua3hBDNNBQWxB3kR2l
+tXawlYtIUxOERKL0lwatDqMoXIj7euEs0EfEPaZrYuulGN3s4yrwcOAuOo2Am5uA
+ltCFxlDwVulXPPMbkhv2RIQ4wGjSuMdW6mq94DJCG2SaE5HYgI7yh1MC+iyRu3Ib
+Y4xyQPApEj5HcoL0N/HX2FtcZAcf13CFqsfc6jZ+CWLT6xW57LO/mupn84jPHgao
+3s2d5l6c5uc3b5vZCmcpI9uy2B4Ts2W4Q39xKqlm6BEARqXDhYKQH09mLTsgLb9K
+xdFNKjrb99nSu1yqkJXrw9B95oaPPIGiIPmklfcxLcJ1sHiej7qmmuw511MJOhlJ
+Czngh9HPepjxIO0j3LaH0yT18gj7qy+Y9cqd2YD1LPnaYu3kIjgp8oP2draMwlxa
+/Z+tLGY+16MsFghl3uQe
+=FJmQ
+-END PGP SIGNATURE-

Added: dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.sha512 Sun Jul 15 03:04:30 2018
@@ -0,0 +1,3 @@
+SparkR_2.3.2.tar.gz: 5D5225F0 8C8E27C8 579DABC7 5CDF37C8 024F2DE2 069583E9
+ 843781A9 30B501AF C72924AD C82DA242 2017D86A 26D0CE9C
+ 4F1BFDDB B35D7FB2 42F2A6C1 055EA0E8

Added: dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.asc
==
--- dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.asc (added)
+++ dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.asc Sun Jul 15 03:04:30 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIcBAABCgAGBQJbSrUeAAoJENsLIaASlz/QqWAP/i1Z/e5HINjqYcVYOjMHFVv2
+h50ezR7aL4sKFvqh5POaNDoXv3GP+4cRsdlOziLBEV2JxnE+Dnu8H4L8Y+SdH52x
+sC+8XbNzBGOJSh0XYQ4Ez8LnMlhCro4n5RgaJJIsRRzbuFvSKakNGL7lW1kxUHQX
+DiqXJ+wA5oRLYeITGE8YLjLgYgPwE8oC92WHmi/RWg5ES6dGmzF09X+7ccAmfHxt
+zOE6ARtLRJ2aeMw9s0t2DLfSznP8dsNXDz0xPHggWdJNmhLrkQfeBN5AZCPq8hwS
+manPzxX9Gb8UFjkRnljds+rGuVW29zVAmWL7rfi8Uv3QSv9oP5ZHyWbiYMDEI2/v
+R/EzwT/Gjk1NWk+W5RGVevNMJ/xMy0XfVdlzrkc8Svi8m93ojhxyJMNhVQdFN2PJ
+rJoqHHSH8ev4/W1GnW4oUwr06dAewTsmLOa/tSvdVjEk0BwNrsM6GwPOTZB3tPB6
+unfDdgOEvlAq2CRN9GnKftAruprk77fmE4frE10sg5Jms1ohLN503NTJx4gTBW7Y
+qCkdYrusPgmW8F6aA865jUylPg/BhIRXE5H4kPnnzgwNYzPjGYdMnuksb3Ls9Mr9
+BxvbkLToe9goSIjGbj+iy/5VGem7tKsKwiDlUN5StAqH+M3ZlEW2hpcILY4cM/VH
+KthInMLfa7Ofa9bEZMb7
+=jTWe
+-END PGP SIGNATURE-

Added: dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.sha512
==
--- dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.sha512 (added)
+++ dev/spark/v2.3.2-rc3-bin/pyspark-2.3.2.tar.gz.sha512 Sun Jul 15 03:04:30 
2018
@@ -0,0 +1,3
