Repository: spark
Updated Branches:
refs/heads/branch-2.0 7d63c36e1 -> ccb53a20e
Fix reference to external metrics documentation
Author: Ben McCann
Closes #12833 from benmccann/patch-1.
(cherry picked from commit 214d1be4fd4a34399b6a2adb2618784de459a48d)
Repository: spark
Updated Branches:
refs/heads/branch-2.0 705172202 -> 7d63c36e1
[SPARK-15049] Rename NewAccumulator to AccumulatorV2
## What changes were proposed in this pull request?
NewAccumulator isn't the best name if we ever come up with v3 of the API.
## How was this patch tested?
Repository: spark
Updated Branches:
refs/heads/master a832cef11 -> 44da8d8ea
[SPARK-15049] Rename NewAccumulator to AccumulatorV2
## What changes were proposed in this pull request?
NewAccumulator isn't the best name if we ever come up with v3 of the API.
## How was this patch tested?
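For context on the renamed API: AccumulatorV2 is the abstract contract an accumulator implements (zero check, copy, reset, per-executor add, driver-side merge, value read). The sketch below is a standalone, hypothetical illustration of that contract's shape, not the real `org.apache.spark.util.AccumulatorV2` class, which is registered with and managed by a SparkContext.

```scala
// Minimal standalone sketch of the AccumulatorV2 contract (hypothetical;
// the real class lives in org.apache.spark.util).
abstract class AccumulatorV2[IN, OUT] {
  def isZero: Boolean                             // holds the zero value?
  def copy(): AccumulatorV2[IN, OUT]              // independent copy for a task
  def reset(): Unit                               // return to the zero value
  def add(v: IN): Unit                            // accumulate one input on an executor
  def merge(other: AccumulatorV2[IN, OUT]): Unit  // combine partial results on the driver
  def value: OUT                                  // read the current result
}

// A LongAccumulator-style implementation that sums Long inputs.
class SumAccumulator extends AccumulatorV2[Long, Long] {
  private var sum = 0L
  def isZero: Boolean = sum == 0L
  def copy(): AccumulatorV2[Long, Long] = { val a = new SumAccumulator; a.sum = sum; a }
  def reset(): Unit = sum = 0L
  def add(v: Long): Unit = sum += v
  def merge(other: AccumulatorV2[Long, Long]): Unit = sum += other.value
  def value: Long = sum
}
```

The versioned name leaves room for future revisions of this contract without another disruptive rename.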
Repository: spark
Updated Branches:
refs/heads/branch-2.0 a6428292f -> 705172202
[SPARK-13425][SQL] Documentation for CSV datasource options
## What changes were proposed in this pull request?
This PR adds the explanation and documentation for CSV options for reading and
writing.
## How was this patch tested?
Repository: spark
Updated Branches:
refs/heads/master a6428292f -> a832cef11
[SPARK-13425][SQL] Documentation for CSV datasource options
## What changes were proposed in this pull request?
This PR adds the explanation and documentation for CSV options for reading and
writing.
## How was this patch tested?
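To make the kind of options being documented concrete, the standalone sketch below shows how a few read options with the same names as Spark's documented CSV options (`sep`, `header`, `nullValue`) could affect parsing. It is a hypothetical illustration only; the real handling lives in Spark's DataFrameReader and CSV datasource, not here.

```scala
// Hypothetical sketch: parse CSV lines under a subset of read options.
// "sep" is the field delimiter, "header" skips the first line,
// and fields equal to "nullValue" are read as missing.
def parseCsv(lines: Seq[String], options: Map[String, String]): Seq[Seq[Option[String]]] = {
  val sep = options.getOrElse("sep", ",")
  val nullValue = options.getOrElse("nullValue", "")
  val header = options.get("header").contains("true")
  val rows = if (header) lines.drop(1) else lines
  rows.map { line =>
    line.split(java.util.regex.Pattern.quote(sep), -1).toSeq
      .map(field => if (field == nullValue) None else Some(field))
  }
}
```

In Spark itself these options are passed as key/value pairs on the reader (and a corresponding set on the writer), which is what this PR documents.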
Repository: spark
Updated Branches:
refs/heads/branch-2.0 [created] a6428292f
Repository: spark
Updated Branches:
refs/heads/master cdf9e9753 -> a6428292f
[SPARK-14931][ML][PYTHON] Mismatched default values between pipelines in Spark
and PySpark - update
## What changes were proposed in this pull request?
This PR is an update for
Repository: spark
Updated Branches:
refs/heads/master 90787de86 -> cdf9e9753
[SPARK-14505][CORE] Fix bug: after creating two SparkContext objects in the
same JVM, the first one cannot run any task
After creating two SparkContext objects in the same JVM (the second one cannot
be created
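The fix amounts to enforcing one active context per JVM: a second construction attempt must fail fast instead of partially initializing and breaking the first. The sketch below is a hypothetical, standalone illustration of that guard pattern; names like `ContextRegistry` and `markActive` are invented for illustration and are not Spark's internals.

```scala
// Hypothetical sketch of a one-active-context-per-JVM guard.
object ContextRegistry {
  private var active: Option[String] = None // stands in for the live SparkContext

  // Called at the start of construction; rejects a second concurrent context.
  def markActive(name: String): Unit = synchronized {
    if (active.isDefined) {
      throw new IllegalStateException(
        s"Only one context may be active in this JVM (active: ${active.get})")
    }
    active = Some(name)
  }

  // Called on stop(), allowing a new context to be created afterwards.
  def clear(): Unit = synchronized { active = None }
}
```

Failing fast in `markActive` keeps the already-running context's state untouched, which is the behavior the reported bug violated.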