Repository: spark
Updated Branches:
refs/heads/branch-1.1 40bce6350 -> 03d4097bc
[SPARK-5691] Fixing wrong data structure lookup for dupe app registration
In Master's registerApplication method, it checks whether the application has
already been registered by examining the addressToWorker hash map. In r
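A minimal sketch of the corrected lookup (names and types are hypothetical, not Spark's actual code): the duplicate-registration check must consult the map that indexes applications by driver address, not the worker map.

```scala
// Hypothetical, simplified sketch of a dupe-registration guard.
import scala.collection.mutable

case class AppInfo(id: String, address: String)

val addressToApp = mutable.HashMap.empty[String, AppInfo]

// Returns true if the app was newly registered, false for a duplicate.
def registerApplication(app: AppInfo): Boolean = {
  if (addressToApp.contains(app.address)) {
    false // duplicate registration from the same driver address
  } else {
    addressToApp(app.address) = app
    true
  }
}
```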
ant in your patch #4168.
Author: Andrew Or
Closes #4483 from andrewor14/da-negative and squashes the following commits:
53ed955 [Andrew Or] Throw IllegalArgumentException instead
0e89fd5 [Andrew Or] Check against negative requests
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Com
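The "check against negative requests" fix above can be sketched as a fail-fast validation (hypothetical method name; a real implementation would forward the request to the master):

```scala
// Sketch: reject a negative executor count up front with
// IllegalArgumentException rather than passing it downstream.
def requestExecutors(numAdditional: Int): Int = {
  require(numAdditional >= 0,
    s"Attempted to request a negative number of executor(s): $numAdditional")
  numAdditional
}
```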
Repository: spark
Updated Branches:
refs/heads/master 36c4e1d75 -> 20a601310
http://git-wip-us.apache.org/repos/asf/spark/blob/20a60131/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
--
diff --git
a/ya
[SPARK-2996] Implement userClassPathFirst for driver, yarn.
Yarn's config option `spark.yarn.user.classpath.first` does not work the same
way as
`spark.files.userClassPathFirst`; Yarn's version is a lot more dangerous, in
that it
modifies the system classpath, instead of restricting the changes
Repository: spark
Updated Branches:
refs/heads/branch-1.3 ebf1df03d -> 6a1e0f967
http://git-wip-us.apache.org/repos/asf/spark/blob/6a1e0f96/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
--
diff --git
[SPARK-2996] Implement userClassPathFirst for driver, yarn.
Yarn's config option `spark.yarn.user.classpath.first` does not work the same
way as
`spark.files.userClassPathFirst`; Yarn's version is a lot more dangerous, in
that it
modifies the system classpath, instead of restricting the changes
ata` with an empty `stageIds` list. However, later in `AllJobsPage` we
call `stageIds.max`. If this is empty, it will throw an exception.
This crashed my history server.
Author: Andrew Or
Closes #4490 from andrewor14/jobs-page-max and squashes the following commits:
21797d3 [Andrew Or] Check nonEm
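The failure mode and its guard can be sketched as follows: Scala's `Seq.max` throws on an empty sequence, so check `nonEmpty` (or fold with a default) before calling it. The helper name here is illustrative, not Spark's.

```scala
// Sketch: guard stageIds.max against the empty case instead of crashing.
def lastStageId(stageIds: Seq[Int]): Option[Int] =
  if (stageIds.nonEmpty) Some(stageIds.max) else None
```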
Repository: spark
Updated Branches:
refs/heads/branch-1.3 832625509 -> 6ddbca494
[SPARK-5701] Only set ShuffleReadMetrics when task has shuffle deps
The updateShuffleReadMetrics method in TaskMetrics (called by the executor
heartbeater) will currently always add a ShuffleReadMetrics to TaskMe
Repository: spark
Updated Branches:
refs/heads/master a95ed5215 -> a2d33d0b0
[SPARK-5701] Only set ShuffleReadMetrics when task has shuffle deps
The updateShuffleReadMetrics method in TaskMetrics (called by the executor
heartbeater) will currently always add a ShuffleReadMetrics to TaskMetric
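The idea can be sketched with a simplified, hypothetical version of the types: keep the shuffle-read metrics as an `Option` so tasks without shuffle dependencies never report a spurious all-zero `ShuffleReadMetrics`.

```scala
// Hypothetical, simplified sketch: shuffle-read metrics are only present
// when a task actually has shuffle dependencies.
case class ShuffleReadMetrics(remoteBytesRead: Long)

class TaskMetrics {
  private var shuffleRead: Option[ShuffleReadMetrics] = None

  def updateShuffleReadMetrics(m: ShuffleReadMetrics): Unit = {
    shuffleRead = Some(m)
  }

  def shuffleReadMetrics: Option[ShuffleReadMetrics] = shuffleRead
}
```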
Repository: spark
Updated Branches:
refs/heads/master c7ad80ae4 -> 69bc3bb6c
SPARK-4136. Under dynamic allocation, cancel outstanding executor requests when
no longer needed
This takes advantage of the changes made in SPARK-4337 to cancel pending
requests to YARN when they are no longer need
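The bookkeeping behind cancelling outstanding requests can be sketched roughly as follows (hypothetical helper, not Spark's actual code): when the target number of executors drops, pending (not-yet-granted) requests are trimmed first.

```scala
// Sketch: how many pending executor requests should remain after the
// target changes. Pending requests beyond (target - running) get cancelled.
def pendingAfterUpdate(target: Int, running: Int, pending: Int): Int =
  math.max(0, math.min(pending, target - running))
```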
Repository: spark
Updated Branches:
refs/heads/branch-1.3 e508237f3 -> e53da215c
SPARK-4136. Under dynamic allocation, cancel outstanding executor requests when
no longer needed
This takes advantage of the changes made in SPARK-4337 to cancel pending
requests to YARN when they are no longer
Repository: spark
Updated Branches:
refs/heads/master 69bc3bb6c -> b640c841f
[HOTFIX][SPARK-4136] Fix compilation and tests
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b640c841
Tree: http://git-wip-us.apache.org/repos/
Repository: spark
Updated Branches:
refs/heads/branch-1.3 8b7587af8 -> 4e3aa680b
[HOTFIX][SPARK-4136] Fix compilation and tests
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4e3aa680
Tree: http://git-wip-us.apache.org/re
Repository: spark
Updated Branches:
refs/heads/master 7e24249af -> 1cb377007
[SPARK-4879] Use driver to coordinate Hadoop output committing for speculative
tasks
Previously, SparkHadoopWriter always committed its tasks without question. The
problem is that when speculation is enabled sometim
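The coordination idea can be sketched with a simplified, hypothetical coordinator: for each (stage, partition), only the first task attempt to ask is authorized to commit, and later speculative attempts are denied.

```scala
// Hypothetical, simplified sketch of driver-side commit arbitration.
import scala.collection.mutable

class OutputCommitCoordinator {
  // (stage, partition) -> attempt id of the winning task
  private val authorized = mutable.HashMap.empty[(Int, Int), Long]

  def canCommit(stage: Int, partition: Int, attempt: Long): Boolean =
    synchronized {
      authorized.get((stage, partition)) match {
        case Some(winner) => winner == attempt // only the winner may commit
        case None =>
          authorized((stage, partition)) = attempt
          true
      }
    }
}
```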
Repository: spark
Updated Branches:
refs/heads/branch-1.3 e477e91e3 -> 79cd59cde
[SPARK-4879] Use driver to coordinate Hadoop output committing for speculative
tasks
Previously, SparkHadoopWriter always committed its tasks without question. The
problem is that when speculation is enabled som
the
wrong prefix. This is a one-line fix. I will add more comprehensive tests in
a separate patch.
Author: Andrew Or
Closes #4518 from andrewor14/rest-npe and squashes the following commits:
16b15bc [Andrew Or] Correct ErrorServlet context prefix
(cherry picked from com
ong prefix. This is a one-line fix. I will add more comprehensive tests in
a separate patch.
Author: Andrew Or
Closes #4518 from andrewor14/rest-npe and squashes the following commits:
16b15bc [Andrew Or] Correct ErrorServlet context prefix
Project: http://git-wip-us.apache.org/repos/asf/sp
Repository: spark
Updated Branches:
refs/heads/master aa4ca8b87 -> 893d6fd70
[SPARK-5645] Added local read bytes/time to task metrics
ksakellis I stumbled on your JIRA for this yesterday; I know it's assigned to
you but I'd already done this for my own uses a while ago so thought I could
hel
Repository: spark
Updated Branches:
refs/heads/branch-1.3 e3a975d45 -> 74f34bb8b
[SPARK-5645] Added local read bytes/time to task metrics
ksakellis I stumbled on your JIRA for this yesterday; I know it's assigned to
you but I'd already done this for my own uses a while ago so thought I could
Repository: spark
Updated Branches:
refs/heads/master 893d6fd70 -> 9c8076502
[EC2] Update default Spark version to 1.2.1
Author: Katsunori Kanda
Closes #4566 from potix2/ec2-update-version-1-2-1 and squashes the following
commits:
77e7840 [Katsunori Kanda] [EC2] Update default Spark versio
Repository: spark
Updated Branches:
refs/heads/branch-1.3 74f34bb8b -> 9a1de4b20
[SPARK-5765][Examples]Fixed word split problem in run-example and
compute-classpath
Author: Venkata Ramana G
Author: Venkata Ramana Gollamudi
Closes #4561 from gvramana/word_split and squashes the following c
Repository: spark
Updated Branches:
refs/heads/branch-1.2 64254eeec -> b78a686eb
[SPARK-5765][Examples]Fixed word split problem in run-example and
compute-classpath
Author: Venkata Ramana G
Author: Venkata Ramana Gollamudi
Closes #4561 from gvramana/word_split and squashes the following c
Repository: spark
Updated Branches:
refs/heads/master 9c8076502 -> 629d0143e
[SPARK-5765][Examples]Fixed word split problem in run-example and
compute-classpath
Author: Venkata Ramana G
Author: Venkata Ramana Gollamudi
Closes #4561 from gvramana/word_split and squashes the following commi
Repository: spark
Updated Branches:
refs/heads/branch-1.3 9a1de4b20 -> 0040fc509
[SPARK-5762] Fix shuffle write time for sort-based shuffle
mateiz was excluding the time to write this final file from the shuffle write
time intentional?
Author: Kay Ousterhout
Closes #4559 from kayousterhout
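The gist of the fix is to include the final file write inside the timed region so shuffle write time is not undercounted. A generic timing helper (hypothetical, not Spark's code) illustrates the pattern:

```scala
// Sketch: time a block of work in nanoseconds; the final file write
// belongs inside the timed body, not after it.
def timedNanos[T](body: => T): (T, Long) = {
  val start = System.nanoTime()
  val result = body
  (result, System.nanoTime() - start)
}
```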
Repository: spark
Updated Branches:
refs/heads/master 629d0143e -> 47c73d410
[SPARK-5762] Fix shuffle write time for sort-based shuffle
mateiz was excluding the time to write this final file from the shuffle write
time intentional?
Author: Kay Ousterhout
Closes #4559 from kayousterhout/SPA
Repository: spark
Updated Branches:
refs/heads/branch-1.2 b78a686eb -> 9c5454d06
[SPARK-5762] Fix shuffle write time for sort-based shuffle
mateiz was excluding the time to write this final file from the shuffle write
time intentional?
Author: Kay Ousterhout
Closes #4559 from kayousterhout
hor: Andrew Or
Closes #4557 from andrewor14/rest-tests and squashes the following commits:
b4dc980 [Andrew Or] Merge branch 'master' of github.com:apache/spark into
rest-tests
b55e40f [Andrew Or] Add test for unknown fields
cc96993 [Andrew Or] private[spark] -> private[rest]
578cf45 [And
rew Or
Closes #4557 from andrewor14/rest-tests and squashes the following commits:
b4dc980 [Andrew Or] Merge branch 'master' of github.com:apache/spark into
rest-tests
b55e40f [Andrew Or] Add test for unknown fields
cc96993 [Andrew Or] private[spark] -> private[rest]
578cf45 [Andrew O
Repository: spark
Updated Branches:
refs/heads/branch-1.3 11d108030 -> 02d5b32bb
[SPARK-5759][Yarn]ExecutorRunnable should catch YarnException while NMClient
start contain...
Sometimes, for various reasons, an exception is thrown while NMClient starts
some containers. Example: we do not config
Repository: spark
Updated Branches:
refs/heads/master 1d5663e92 -> 947b8bd82
[SPARK-5759][Yarn]ExecutorRunnable should catch YarnException while NMClient
start contain...
Sometimes, for various reasons, an exception is thrown while NMClient starts
some containers. Example: we do not config spa
Repository: spark
Updated Branches:
refs/heads/master 947b8bd82 -> 26c816e73
SPARK-5747: Fix wordsplitting bugs in make-distribution.sh
The `$MVN` command variable may have spaces, so when referring to it, must wrap
in quotes.
Author: David Y. Ross
Closes #4540 from dyross/dyr-fix-make-dis
Repository: spark
Updated Branches:
refs/heads/branch-1.3 02d5b32bb -> 11a0d5b6d
SPARK-5747: Fix wordsplitting bugs in make-distribution.sh
The `$MVN` command variable may have spaces, so when referring to it, must wrap
in quotes.
Author: David Y. Ross
Closes #4540 from dyross/dyr-fix-make
Repository: spark
Updated Branches:
refs/heads/branch-1.3 11a0d5b6d -> bf0d15c52
[SPARK-5780] [PySpark] Mute the logging during unit tests
There is a bunch of logging coming from the driver and workers; it's noisy and
scary, with a lot of exceptions in it, so people are confused about whether the tests are fa
Repository: spark
Updated Branches:
refs/heads/branch-1.2 9c5454d06 -> c7bac577a
[SPARK-5780] [PySpark] Mute the logging during unit tests
There is a bunch of logging coming from the driver and workers; it's noisy and
scary, with a lot of exceptions in it, so people are confused about whether the tests are fa
Repository: spark
Updated Branches:
refs/heads/master 26c816e73 -> 0bf031582
[SPARK-5780] [PySpark] Mute the logging during unit tests
There is a bunch of logging coming from the driver and workers; it's noisy and
scary, with a lot of exceptions in it, so people are confused about whether the tests are failin
Repository: spark
Updated Branches:
refs/heads/branch-1.2 d24971a62 -> 0ba065f0a
Revert "[SPARK-5762] Fix shuffle write time for sort-based shuffle"
This reverts commit 9c5454d06e56917521a15697c36f76a33a94dd1e.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip
Repository: spark
Updated Branches:
refs/heads/master 1768bd514 -> c0ccd2564
[SPARK-5732][CORE]:Add an option to print the spark version in spark script.
Naturally, we may need to add an option to print the Spark version in the spark
script. It is a pretty common feature in script tools.
![9](https://cloud.
Repository: spark
Updated Branches:
refs/heads/branch-1.3 1255e83f8 -> 5c883df09
[SPARK-5732][CORE]:Add an option to print the spark version in spark script.
Naturally, we may need to add an option to print the Spark version in the spark
script. It is a pretty common feature in script tools.
![9](https://cl
Repository: spark
Updated Branches:
refs/heads/master e1a1ff810 -> fc6d3e796
[SPARK-5783] Better eventlog-parsing error messages
Author: Ryan Williams
Closes #4573 from ryan-williams/history and squashes the following commits:
a8647ec [Ryan Williams] fix test calls to .replay()
98aa3fe [Rya
Repository: spark
Updated Branches:
refs/heads/branch-1.3 5e6394222 -> e5690a502
[SPARK-5783] Better eventlog-parsing error messages
Author: Ryan Williams
Closes #4573 from ryan-williams/history and squashes the following commits:
a8647ec [Ryan Williams] fix test calls to .replay()
98aa3fe
Repository: spark
Updated Branches:
refs/heads/branch-1.3 e5690a502 -> cc9eec1a0
[SPARK-5735] Replace uses of EasyMock with Mockito
This patch replaces all uses of EasyMock with Mockito. There are two
motivations for this:
1. We should use a single mocking framework in our tests in order to
Repository: spark
Updated Branches:
refs/heads/master fc6d3e796 -> 077eec2d9
[SPARK-5735] Replace uses of EasyMock with Mockito
This patch replaces all uses of EasyMock with Mockito. There are two
motivations for this:
1. We should use a single mocking framework in our tests in order to kee
Repository: spark
Updated Branches:
refs/heads/master 378c7eb0d -> 5d3cc6b3d
[HOTFIX] Fix build break in MesosSchedulerBackendSuite
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5d3cc6b3
Tree: http://git-wip-us.apache.or
Repository: spark
Updated Branches:
refs/heads/branch-1.3 ad731897b -> 41603717a
[HOTFIX] Fix build break in MesosSchedulerBackendSuite
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/41603717
Tree: http://git-wip-us.apach
Repository: spark
Updated Branches:
refs/heads/master b1bd1dd32 -> 16687651f
[SPARK-3340] Deprecate ADD_JARS and ADD_FILES
I created a patch that disables the environment variables.
Thereby the Scala or Python shell logs a warning message to notify the user
about the deprecation
with the following mes
Repository: spark
Updated Branches:
refs/heads/branch-1.3 c2a9a6176 -> d8c70fb6d
[SPARK-3340] Deprecate ADD_JARS and ADD_FILES
I created a patch that disables the environment variables.
Thereby the Scala or Python shell logs a warning message to notify the user
about the deprecation
with the following
Repository: spark
Updated Branches:
refs/heads/branch-1.3 d8c70fb6d -> 385a339a2
[SPARK-5849] Handle more types of invalid JSON requests in
SubmitRestProtocolMessage.parseAction
This patch improves SubmitRestProtocol's handling of invalid JSON requests in
cases where those requests were pars
Repository: spark
Updated Branches:
refs/heads/master 16687651f -> 58a82a788
[SPARK-5849] Handle more types of invalid JSON requests in
SubmitRestProtocolMessage.parseAction
This patch improves SubmitRestProtocol's handling of invalid JSON requests in
cases where those requests were parsable
Repository: spark
Updated Branches:
refs/heads/master e945aa613 -> fb87f4492
SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2
Author: Jacek Lewandowski
Closes #4653 from jacek-lewandowski/SPARK-5548-2-master and squashes the
following commits:
0e199b6 [Jacek Lewandowski] SPARK-5548:
Repository: spark
Updated Branches:
refs/heads/branch-1.3 092b45f69 -> fbcb949c5
SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2
Author: Jacek Lewandowski
Closes #4653 from jacek-lewandowski/SPARK-5548-2-master and squashes the
following commits:
0e199b6 [Jacek Lewandowski] SPARK-55
the very least add a huge
warning where appropriate.
Author: Andrew Or
Closes #4687 from andrewor14/driver-wrapper-warning and squashes the following
commits:
7989b56 [Andrew Or] Add huge compatibility warning
(cherry picked from commit 38e624a732b18e01ad2e7a499ce0bb0d7acdcdf6)
Signed-off
ery least add a huge
warning where appropriate.
Author: Andrew Or
Closes #4687 from andrewor14/driver-wrapper-warning and squashes the following
commits:
7989b56 [Andrew Or] Add huge compatibility warning
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-
Repository: spark
Updated Branches:
refs/heads/master 38e624a73 -> 90095bf3c
[SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure
deleting the temp file
This PR adds a `finalize` method in DiskMapIterator to clean up the resources
even if some exception happens during p
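The pattern can be sketched with a simplified, hypothetical class: the iterator cleans up its temp file eagerly when possible, and `finalize` acts as a last-resort safety net if the iterator is abandoned mid-iteration.

```scala
// Hypothetical, simplified sketch of the cleanup-in-finalize pattern.
import java.io.File

class DiskBackedIterator(file: File) {
  private var cleaned = false

  def cleanup(): Unit = if (!cleaned) {
    file.delete()
    cleaned = true
  }

  // Last resort only: GC timing is nondeterministic, so callers should
  // still invoke cleanup() explicitly whenever they can.
  override def finalize(): Unit = {
    try cleanup() finally super.finalize()
  }
}
```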
Repository: spark
Updated Branches:
refs/heads/branch-1.3 f93d4d992 -> 25fae8e7e
[SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure
deleting the temp file
This PR adds a `finalize` method in DiskMapIterator to clean up the resources
even if some exception happens duri
Repository: spark
Updated Branches:
refs/heads/branch-1.1 651ceaeb3 -> 36f3c499f
[SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure
deleting the temp file
This PR adds a `finalize` method in DiskMapIterator to clean up the resources
even if some exception happens duri
Repository: spark
Updated Branches:
refs/heads/branch-1.2 f6ee80b18 -> 61bde0049
[SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure
deleting the temp file
This PR adds a `finalize` method in DiskMapIterator to clean up the resources
even if some exception happens duri
Repository: spark
Updated Branches:
refs/heads/branch-1.3 25fae8e7e -> fe00eb66e
[SPARK-5825] [Spark Submit] Remove the double checking instance name when
stopping the service
`spark-daemon.sh` will confirm the process id by fuzzy matching the class name
while stopping the service, however,
Repository: spark
Updated Branches:
refs/heads/branch-1.2 61bde0049 -> 856fdcb65
[SPARK-5825] [Spark Submit] Remove the double checking instance name when
stopping the service
`spark-daemon.sh` will confirm the process id by fuzzy matching the class name
while stopping the service, however,
Repository: spark
Updated Branches:
refs/heads/master 90095bf3c -> 94cdb05ff
[SPARK-5825] [Spark Submit] Remove the double checking instance name when
stopping the service
`spark-daemon.sh` will confirm the process id by fuzzy matching the class name
while stopping the service, however, it w
SPARK-4682 [CORE] Consolidate various 'Clock' classes
Another one from JoshRosen's wish list. The first commit is much smaller and
removes 2 of the 4 Clock classes. The second is much larger, necessary for
consolidating the streaming one. I put together implementations in the way that
seemed s
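The consolidated interface can be sketched roughly like this (simplified; method names follow the shape of Spark's `Clock`, but this is not the actual code): one trait, a real implementation backed by the system clock, and a manually advanced one for deterministic tests.

```scala
// Sketch of a consolidated Clock hierarchy.
trait Clock {
  def getTimeMillis(): Long
}

class SystemClock extends Clock {
  def getTimeMillis(): Long = System.currentTimeMillis()
}

// Advanced by hand in tests, so timing-dependent logic is deterministic.
class ManualClock(private var now: Long = 0L) extends Clock {
  def getTimeMillis(): Long = synchronized(now)
  def advance(millis: Long): Unit = synchronized { now += millis }
}
```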
Repository: spark
Updated Branches:
refs/heads/master ad6b169de -> 34b7c3538
http://git-wip-us.apache.org/repos/asf/spark/blob/34b7c353/streaming/src/test/scala/org/apache/spark/streaming/ReceivedBlockHandlerSuite.scala
--
diff
Repository: spark
Updated Branches:
refs/heads/branch-1.3 ff8976ec7 -> bd49e8b96
http://git-wip-us.apache.org/repos/asf/spark/blob/bd49e8b9/streaming/src/test/scala/org/apache/spark/streaming/ReceivedBlockHandlerSuite.scala
--
Repository: spark
Updated Branches:
refs/heads/branch-1.3 bd49e8b96 -> c5f3b9e02
SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory",
...)` will not work
I've updated documentation to reflect true behavior of this setting in client
vs. cluster mode.
Author: Ilya Ganel
Repository: spark
Updated Branches:
refs/heads/master 34b7c3538 -> 6bddc4035
SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory",
...)` will not work
I've updated documentation to reflect true behavior of this setting in client
vs. cluster mode.
Author: Ilya Ganelin
Repository: spark
Updated Branches:
refs/heads/branch-1.3 ba941ceb1 -> 0382dcc0a
[SPARK-4808] Removing minimum number of elements read before spill check
In the general case, Spillable's heuristic of checking for memory stress
on every 32nd item after 1000 items are read is good enough. In gen
Repository: spark
Updated Branches:
refs/heads/master 0cfd2cebd -> 3be92cdac
[SPARK-4808] Removing minimum number of elements read before spill check
In the general case, Spillable's heuristic of checking for memory stress
on every 32nd item after 1000 items are read is good enough. In general
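The pre-patch heuristic can be sketched as follows (hypothetical shape; the patch removes the 1000-element floor so memory pressure is detected sooner):

```scala
// Sketch of the old spill-check gate: only look at memory on every 32nd
// element, and only after at least 1000 elements have been read.
def shouldCheckMemory(elementsRead: Long): Boolean =
  elementsRead >= 1000 && elementsRead % 32 == 0
```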
Repository: spark
Updated Branches:
refs/heads/branch-1.2 18fbed5b5 -> 5cea859fd
[SPARK-4808] Removing minimum number of elements read before spill check
In the general case, Spillable's heuristic of checking for memory stress
on every 32nd item after 1000 items are read is good enough. In gen
Repository: spark
Updated Branches:
refs/heads/master e15532471 -> d3cbd38c3
SPARK-5841 [CORE] [HOTFIX 2] Memory leak in DiskBlockManager
Continue to see IllegalStateException in YARN cluster mode. Adding a simple
workaround for now.
Author: Nishkam Ravi
Author: nishkamravi2
Author: nravi
Repository: spark
Updated Branches:
refs/heads/branch-1.3 b9a6c5c84 -> 932338eda
SPARK-5841 [CORE] [HOTFIX 2] Memory leak in DiskBlockManager
Continue to see IllegalStateException in YARN cluster mode. Adding a simple
workaround for now.
Author: Nishkam Ravi
Author: nishkamravi2
Author: nr
Repository: spark
Updated Branches:
refs/heads/master d3cbd38c3 -> 7138816ab
[SPARK-5937][YARN] Fix ClientSuite to set YARN mode, so that the correct class
is used in t...
...ests.
Without this SparkHadoopUtil is used by the Client instead of
YarnSparkHadoopUtil.
Author: Hari Shreedharan
Repository: spark
Updated Branches:
refs/heads/branch-1.3 932338eda -> 76e3e6527
[SPARK-5937][YARN] Fix ClientSuite to set YARN mode, so that the correct class
is used in t...
...ests.
Without this SparkHadoopUtil is used by the Client instead of
YarnSparkHadoopUtil.
Author: Hari Shreedhar
Repository: spark
Updated Branches:
refs/heads/branch-1.3 eed7389cf -> 4186dd3dd
Revert "[SPARK-4808] Removing minimum number of elements read before spill
check"
This reverts commit 0382dcc0a94f8e619fd11ec2cc0b18459a690c2b.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit:
Repository: spark
Updated Branches:
refs/heads/branch-1.2 71173de7a -> 2c9d9659d
Revert "[SPARK-4808] Removing minimum number of elements read before spill
check"
This reverts commit 5cea859fd27dc6a216fa9d31d293c93407fbff01.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit:
Repository: spark
Updated Branches:
refs/heads/master 201236628 -> 64d2c01ff
[Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds
Patch should be self-explanatory
pwendell JoshRosen
Author: Tathagata Das
Closes #4741 from tdas/SPARK-5967 and squashes the following com
Repository: spark
Updated Branches:
refs/heads/branch-1.2 2c9d9659d -> 3ad00ee1c
[Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds
Patch should be self-explanatory
pwendell JoshRosen
Author: Tathagata Das
Closes #4741 from tdas/SPARK-5967 and squashes the following
Repository: spark
Updated Branches:
refs/heads/branch-1.3 e46096b1e -> 28dd53b1b
[Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds
Patch should be self-explanatory
pwendell JoshRosen
Author: Tathagata Das
Closes #4741 from tdas/SPARK-5967 and squashes the following
ses #4739 from andrewor14/user-jar-blocker and squashes the following
commits:
23c4a9e [Andrew Or] Use right argument
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6d2caa57
Tree: http://git-wip-us.apache.org/repos/asf/spark/t
Or
Closes #4739 from andrewor14/user-jar-blocker and squashes the following
commits:
23c4a9e [Andrew Or] Use right argument
(cherry picked from commit 6d2caa576fcdc5c848d1472b09c685b3871e220e)
Signed-off-by: Andrew Or
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-
Repository: spark
Updated Branches:
refs/heads/master 7fa960e65 -> cd5c8d7bb
SPARK-4704 [CORE] SparkSubmitDriverBootstrap doesn't flush output
Join on output threads to make sure any lingering output from process reaches
stdout, stderr before exiting
CC andrewor14 since I believe he
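The drain-then-exit pattern can be sketched generically (hypothetical helper): join the threads copying the child process's stdout/stderr before returning, so any buffered output is flushed first.

```scala
// Sketch: start the output-drainer threads, then block until they finish
// so no lingering output is lost when the parent exits.
def drainAndWait(threads: Seq[Thread]): Unit = {
  threads.foreach(_.start())
  threads.foreach(_.join())
}
```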
Repository: spark
Updated Branches:
refs/heads/branch-1.2 cc7313d09 -> 602d5c1fc
SPARK-4704 [CORE] SparkSubmitDriverBootstrap doesn't flush output
Join on output threads to make sure any lingering output from process reaches
stdout, stderr before exiting
CC andrewor14 since I be
Repository: spark
Updated Branches:
refs/heads/master cd5c8d7bb -> 10094a523
Modify default value description for
spark.scheduler.minRegisteredResourcesRatio on docs.
The configuration is currently not supported in Mesos mode.
See https://github.com/apache/spark/pull/1462
Author: Li Zhihui
Close
Repository: spark
Updated Branches:
refs/heads/branch-1.2 602d5c1fc -> 94faf4c49
Modify default value description for
spark.scheduler.minRegisteredResourcesRatio on docs.
The configuration is currently not supported in Mesos mode.
See https://github.com/apache/spark/pull/1462
Author: Li Zhihui
C
Repository: spark
Updated Branches:
refs/heads/branch-1.3 5d309ad6c -> 62652dc5b
Modify default value description for
spark.scheduler.minRegisteredResourcesRatio on docs.
The configuration is currently not supported in Mesos mode.
See https://github.com/apache/spark/pull/1462
Author: Li Zhihui
C
Repository: spark
Updated Branches:
refs/heads/master 10094a523 -> 8942b522d
[SPARK-3562]Periodic cleanup event logs
Author: xukun 00228947
Closes #4214 from viper-kun/cleaneventlog and squashes the following commits:
7a5b9c5 [xukun 00228947] fix issue
31674ee [xukun 00228947] fix issue
6e3
Repository: spark
Updated Branches:
refs/heads/master 8942b522d -> aa63f633d
[SPARK-6027][SPARK-5546] Fixed --jar and --packages not working for KafkaUtils
and improved error message
The problem with SPARK-6027 in short is that JARs like the kafka-assembly.jar
does not work in python as the
Repository: spark
Updated Branches:
refs/heads/branch-1.3 62652dc5b -> 731a997db
[SPARK-6027][SPARK-5546] Fixed --jar and --packages not working for KafkaUtils
and improved error message
The problem with SPARK-6027 in short is that JARs like the kafka-assembly.jar
does not work in python as
Repository: spark
Updated Branches:
refs/heads/master aa63f633d -> 5f3238b3b
[SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM
Author: Cheolsoo Park
Closes #4773 from piaozhexiu/SPARK-6018 and squashes the following commits:
2a919d5 [Cheolsoo Park] Rename e with cau
Repository: spark
Updated Branches:
refs/heads/branch-1.3 731a997db -> fe7967483
[SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM
Author: Cheolsoo Park
Closes #4773 from piaozhexiu/SPARK-6018 and squashes the following commits:
2a919d5 [Cheolsoo Park] Rename e with
Repository: spark
Updated Branches:
refs/heads/branch-1.2 94faf4c49 -> e21475d16
[SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM
Author: Cheolsoo Park
Closes #4773 from piaozhexiu/SPARK-6018 and squashes the following commits:
2a919d5 [Cheolsoo Park] Rename e with
Repository: spark
Updated Branches:
refs/heads/branch-1.2 2d83442f2 -> 64e0cbc73
SPARK-4300 [CORE] Race condition during SparkWorker shutdown
Close appender saving stdout/stderr before destroying process to avoid
exception on reading closed input stream.
(This also removes a redundant `waitFo
Repository: spark
Updated Branches:
refs/heads/master 5f3238b3b -> 3fb53c029
SPARK-4300 [CORE] Race condition during SparkWorker shutdown
Close appender saving stdout/stderr before destroying process to avoid
exception on reading closed input stream.
(This also removes a redundant `waitFor()`
Repository: spark
Updated Branches:
refs/heads/branch-1.3 fe7967483 -> 297c3ef82
Add a note for context termination for History server on Yarn
The history server on Yarn only shows completed jobs. This adds a note
concerning the needed explicit context termination at the end of a spark job
w
Repository: spark
Updated Branches:
refs/heads/branch-1.2 64e0cbc73 -> 58b3aa692
Add a note for context termination for History server on Yarn
The history server on Yarn only shows completed jobs. This adds a note
concerning the needed explicit context termination at the end of a spark job
w
Repository: spark
Updated Branches:
refs/heads/branch-1.0 f74bccbe3 -> 14e042b65
Add a note for context termination for History server on Yarn
The history server on Yarn only shows completed jobs. This adds a note
concerning the needed explicit context termination at the end of a spark job
w
Repository: spark
Updated Branches:
refs/heads/branch-1.1 36f3c499f -> 2785210fa
Add a note for context termination for History server on Yarn
The history server on Yarn only shows completed jobs. This adds a note
concerning the needed explicit context termination at the end of a spark job
w