spark git commit: [SPARK-5691] Fixing wrong data structure lookup for dupe app registration

2015-02-09 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.1 40bce6350 -> 03d4097bc [SPARK-5691] Fixing wrong data structure lookup for dupe app registration In Master's registerApplication method, it checks whether the application has already registered by examining the addressToWorker hash map. In r
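A minimal sketch of the guard this fix implies: duplicate registrations must be detected via the map keyed by application address, not the worker map. Names here are illustrative, not Spark's actual Master code.

```scala
import scala.collection.mutable

// Map keyed by the *application's* address -- the map the lookup must use.
val addressToApp = mutable.HashMap.empty[String, String]

def registerApplication(appAddress: String, appId: String): Boolean = {
  if (addressToApp.contains(appAddress)) {
    // Duplicate registration attempt from the same address: ignore it.
    false
  } else {
    addressToApp(appAddress) = appId
    true
  }
}
```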

spark git commit: [SPARK-5698] Do not let user request negative # of executors

2015-02-09 Thread andrewor14
relevant in your patch #4168. Author: Andrew Or Closes #4483 from andrewor14/da-negative and squashes the following commits: 53ed955 [Andrew Or] Throw IllegalArgumentException instead 0e89fd5 [Andrew Or] Check against negative requests Project: http://git-wip-us.apache.org/repos/asf/spark/repo Com
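The shape of the check this patch adds, as a hedged sketch (hypothetical signature, not Spark's actual method): reject a negative request up front with IllegalArgumentException rather than forwarding it to the cluster manager.

```scala
def requestExecutors(currentTotal: Int, numAdditional: Int): Int = {
  if (numAdditional < 0) {
    // Fail fast instead of sending a nonsensical request downstream.
    throw new IllegalArgumentException(
      s"Attempted to request a negative number of executor(s): $numAdditional")
  }
  currentTotal + numAdditional
}
```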

spark git commit: [SPARK-5698] Do not let user request negative # of executors

2015-02-09 Thread andrewor14
lso relevant in your patch #4168. Author: Andrew Or Closes #4483 from andrewor14/da-negative and squashes the following commits: 53ed955 [Andrew Or] Throw IllegalArgumentException instead 0e89fd5 [Andrew Or] Check against negative requests Project: http://git-wip-us.apache.org/repos/asf/spark/r

spark git commit: [SPARK-5698] Do not let user request negative # of executors

2015-02-09 Thread andrewor14
lso relevant in your patch #4168. Author: Andrew Or Closes #4483 from andrewor14/da-negative and squashes the following commits: 53ed955 [Andrew Or] Throw IllegalArgumentException instead 0e89fd5 [Andrew Or] Check against negative requests (cherry picked from com

[1/2] spark git commit: [SPARK-2996] Implement userClassPathFirst for driver, yarn.

2015-02-09 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 36c4e1d75 -> 20a601310 http://git-wip-us.apache.org/repos/asf/spark/blob/20a60131/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala -- diff --git a/ya

[2/2] spark git commit: [SPARK-2996] Implement userClassPathFirst for driver, yarn.

2015-02-09 Thread andrewor14
[SPARK-2996] Implement userClassPathFirst for driver, yarn. Yarn's config option `spark.yarn.user.classpath.first` does not work the same way as `spark.files.userClassPathFirst`; Yarn's version is a lot more dangerous, in that it modifies the system classpath, instead of restricting the changes
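An illustrative sketch (not Spark's actual class) of the safer approach described above: a classloader that tries the user's URLs before delegating to the real parent, instead of mutating the system classpath itself.

```scala
import java.net.{URL, URLClassLoader}

// Passing null as the URLClassLoader parent means only the bootstrap
// classloader and the user URLs are consulted first; everything else
// falls back to the real parent.
class ChildFirstClassLoader(userUrls: Array[URL], realParent: ClassLoader)
    extends URLClassLoader(userUrls, null) {
  override def loadClass(name: String, resolve: Boolean): Class[_] = {
    try {
      super.loadClass(name, resolve) // bootstrap + user URLs first
    } catch {
      case _: ClassNotFoundException => realParent.loadClass(name)
    }
  }
}
```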

[1/2] spark git commit: [SPARK-2996] Implement userClassPathFirst for driver, yarn.

2015-02-09 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 ebf1df03d -> 6a1e0f967 http://git-wip-us.apache.org/repos/asf/spark/blob/6a1e0f96/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala -- diff --git

[2/2] spark git commit: [SPARK-2996] Implement userClassPathFirst for driver, yarn.

2015-02-09 Thread andrewor14
[SPARK-2996] Implement userClassPathFirst for driver, yarn. Yarn's config option `spark.yarn.user.classpath.first` does not work the same way as `spark.files.userClassPathFirst`; Yarn's version is a lot more dangerous, in that it modifies the system classpath, instead of restricting the changes

spark git commit: [SPARK-5703] AllJobsPage throws empty.max exception

2015-02-09 Thread andrewor14
ata` with an empty `stageIds` list. However, later in `AllJobsPage` we call `stageIds.max`. If this is empty, it will throw an exception. This crashed my history server. Author: Andrew Or Closes #4490 from andrewor14/jobs-page-max and squashes the following commits: 21797d3 [Andrew Or] Check nonEm
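The crash class described above: calling `.max` on an empty collection throws UnsupportedOperationException. A sketch of one safe alternative (not necessarily the exact code from the patch) is `reduceOption`:

```scala
// Returns None for an empty stageIds list instead of throwing.
def maxStageId(stageIds: Seq[Int]): Option[Int] =
  stageIds.reduceOption(_ max _)
```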

spark git commit: [SPARK-5703] AllJobsPage throws empty.max exception

2015-02-09 Thread andrewor14
ata` with an empty `stageIds` list. However, later in `AllJobsPage` we call `stageIds.max`. If this is empty, it will throw an exception. This crashed my history server. Author: Andrew Or Closes #4490 from andrewor14/jobs-page-max and squashes the following commits: 21797d3 [Andrew Or] Ch

spark git commit: [SPARK-5701] Only set ShuffleReadMetrics when task has shuffle deps

2015-02-09 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 832625509 -> 6ddbca494 [SPARK-5701] Only set ShuffleReadMetrics when task has shuffle deps The updateShuffleReadMetrics method in TaskMetrics (called by the executor heartbeater) will currently always add a ShuffleReadMetrics to TaskMe

spark git commit: [SPARK-5701] Only set ShuffleReadMetrics when task has shuffle deps

2015-02-09 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master a95ed5215 -> a2d33d0b0 [SPARK-5701] Only set ShuffleReadMetrics when task has shuffle deps The updateShuffleReadMetrics method in TaskMetrics (called by the executor heartbeater) will currently always add a ShuffleReadMetrics to TaskMetric
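A sketch of the fix's intent with hypothetical types: aggregate shuffle read metrics only when the task actually has shuffle dependencies, so a task with none reports no ShuffleReadMetrics at all rather than an empty one.

```scala
final case class ShuffleReadMetrics(remoteBytesRead: Long, localBlocksFetched: Long)

// None when there are no shuffle dependencies; a merged total otherwise.
def mergedShuffleReadMetrics(perDep: Seq[ShuffleReadMetrics]): Option[ShuffleReadMetrics] =
  if (perDep.isEmpty) None
  else Some(ShuffleReadMetrics(
    perDep.map(_.remoteBytesRead).sum,
    perDep.map(_.localBlocksFetched).sum))
```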

spark git commit: SPARK-4136. Under dynamic allocation, cancel outstanding executor requests when no longer needed

2015-02-10 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master c7ad80ae4 -> 69bc3bb6c SPARK-4136. Under dynamic allocation, cancel outstanding executor requests when no longer needed This takes advantage of the changes made in SPARK-4337 to cancel pending requests to YARN when they are no longer need

spark git commit: SPARK-4136. Under dynamic allocation, cancel outstanding executor requests when no longer needed

2015-02-10 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 e508237f3 -> e53da215c SPARK-4136. Under dynamic allocation, cancel outstanding executor requests when no longer needed This takes advantage of the changes made in SPARK-4337 to cancel pending requests to YARN when they are no longer

spark git commit: [HOTFIX][SPARK-4136] Fix compilation and tests

2015-02-10 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 69bc3bb6c -> b640c841f [HOTFIX][SPARK-4136] Fix compilation and tests Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b640c841 Tree: http://git-wip-us.apache.org/repos/

spark git commit: [HOTFIX][SPARK-4136] Fix compilation and tests

2015-02-10 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 8b7587af8 -> 4e3aa680b [HOTFIX][SPARK-4136] Fix compilation and tests Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4e3aa680 Tree: http://git-wip-us.apache.org/re

spark git commit: [SPARK-4879] Use driver to coordinate Hadoop output committing for speculative tasks

2015-02-10 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 7e24249af -> 1cb377007 [SPARK-4879] Use driver to coordinate Hadoop output committing for speculative tasks Previously, SparkHadoopWriter always committed its tasks without question. The problem is that when speculation is enabled sometim
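A hypothetical sketch of driver-side commit coordination as described: for each (stage, partition), the first task attempt to ask is authorized to commit, and any other attempt (such as a speculative duplicate) is denied. Names are illustrative.

```scala
import scala.collection.mutable

object CommitCoordinatorSketch {
  // (stage, partition) -> winning attempt id
  private val authorized = mutable.Map.empty[(Int, Int), Int]

  def canCommit(stage: Int, partition: Int, attempt: Int): Boolean = synchronized {
    authorized.get((stage, partition)) match {
      case Some(winner) => winner == attempt // only the winner may commit
      case None =>
        authorized((stage, partition)) = attempt // first asker wins
        true
    }
  }
}
```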

spark git commit: [SPARK-4879] Use driver to coordinate Hadoop output committing for speculative tasks

2015-02-10 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 e477e91e3 -> 79cd59cde [SPARK-4879] Use driver to coordinate Hadoop output committing for speculative tasks Previously, SparkHadoopWriter always committed its tasks without question. The problem is that when speculation is enabled som

spark git commit: [SPARK-5729] Potential NPE in standalone REST API

2015-02-10 Thread andrewor14
the wrong prefix. This is a one-line fix. I will add more comprehensive tests in a separate patch. Author: Andrew Or Closes #4518 from andrewor14/rest-npe and squashes the following commits: 16b15bc [Andrew Or] Correct ErrorServlet context prefix (cherry picked from com

spark git commit: [SPARK-5729] Potential NPE in standalone REST API

2015-02-10 Thread andrewor14
wrong prefix. This is a one-line fix. I will add more comprehensive tests in a separate patch. Author: Andrew Or Closes #4518 from andrewor14/rest-npe and squashes the following commits: 16b15bc [Andrew Or] Correct ErrorServlet context prefix Project: http://git-wip-us.apache.org/repos/asf/sp

spark git commit: [SPARK-5645] Added local read bytes/time to task metrics

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master aa4ca8b87 -> 893d6fd70 [SPARK-5645] Added local read bytes/time to task metrics ksakellis, I stumbled on your JIRA for this yesterday; I know it's assigned to you but I'd already done this for my own uses a while ago so thought I could hel

spark git commit: [SPARK-5645] Added local read bytes/time to task metrics

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 e3a975d45 -> 74f34bb8b [SPARK-5645] Added local read bytes/time to task metrics ksakellis, I stumbled on your JIRA for this yesterday; I know it's assigned to you but I'd already done this for my own uses a while ago so thought I could

spark git commit: [EC2] Update default Spark version to 1.2.1

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 893d6fd70 -> 9c8076502 [EC2] Update default Spark version to 1.2.1 Author: Katsunori Kanda Closes #4566 from potix2/ec2-update-version-1-2-1 and squashes the following commits: 77e7840 [Katsunori Kanda] [EC2] Update default Spark versio

spark git commit: [SPARK-5765][Examples]Fixed word split problem in run-example and compute-classpath

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 74f34bb8b -> 9a1de4b20 [SPARK-5765][Examples]Fixed word split problem in run-example and compute-classpath Author: Venkata Ramana G Author: Venkata Ramana Gollamudi Closes #4561 from gvramana/word_split and squashes the following c

spark git commit: [SPARK-5765][Examples]Fixed word split problem in run-example and compute-classpath

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 64254eeec -> b78a686eb [SPARK-5765][Examples]Fixed word split problem in run-example and compute-classpath Author: Venkata Ramana G Author: Venkata Ramana Gollamudi Closes #4561 from gvramana/word_split and squashes the following c

spark git commit: [SPARK-5765][Examples]Fixed word split problem in run-example and compute-classpath

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 9c8076502 -> 629d0143e [SPARK-5765][Examples]Fixed word split problem in run-example and compute-classpath Author: Venkata Ramana G Author: Venkata Ramana Gollamudi Closes #4561 from gvramana/word_split and squashes the following commi

spark git commit: [SPARK-5762] Fix shuffle write time for sort-based shuffle

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 9a1de4b20 -> 0040fc509 [SPARK-5762] Fix shuffle write time for sort-based shuffle mateiz, was excluding the time to write this final file from the shuffle write time intentional? Author: Kay Ousterhout Closes #4559 from kayousterhout

spark git commit: [SPARK-5762] Fix shuffle write time for sort-based shuffle

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 629d0143e -> 47c73d410 [SPARK-5762] Fix shuffle write time for sort-based shuffle mateiz, was excluding the time to write this final file from the shuffle write time intentional? Author: Kay Ousterhout Closes #4559 from kayousterhout/SPA

spark git commit: [SPARK-5762] Fix shuffle write time for sort-based shuffle

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 b78a686eb -> 9c5454d06 [SPARK-5762] Fix shuffle write time for sort-based shuffle mateiz, was excluding the time to write this final file from the shuffle write time intentional? Author: Kay Ousterhout Closes #4559 from kayousterhout

spark git commit: [SPARK-5760][SPARK-5761] Fix standalone rest protocol corner cases + revamp tests

2015-02-12 Thread andrewor14
hor: Andrew Or Closes #4557 from andrewor14/rest-tests and squashes the following commits: b4dc980 [Andrew Or] Merge branch 'master' of github.com:apache/spark into rest-tests b55e40f [Andrew Or] Add test for unknown fields cc96993 [Andrew Or] private[spark] -> private[rest] 578cf45 [And

spark git commit: [SPARK-5760][SPARK-5761] Fix standalone rest protocol corner cases + revamp tests

2015-02-12 Thread andrewor14
rew Or Closes #4557 from andrewor14/rest-tests and squashes the following commits: b4dc980 [Andrew Or] Merge branch 'master' of github.com:apache/spark into rest-tests b55e40f [Andrew Or] Add test for unknown fields cc96993 [Andrew Or] private[spark] -> private[rest] 578cf45 [Andrew O

spark git commit: [SPARK-5759][Yarn]ExecutorRunnable should catch YarnException while NMClient start contain...

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 11d108030 -> 02d5b32bb [SPARK-5759][Yarn]ExecutorRunnable should catch YarnException while NMClient start contain... Sometimes, for various reasons, NMClient throws an exception while starting some containers. For example: we do not config

spark git commit: [SPARK-5759][Yarn]ExecutorRunnable should catch YarnException while NMClient start contain...

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 1d5663e92 -> 947b8bd82 [SPARK-5759][Yarn]ExecutorRunnable should catch YarnException while NMClient start contain... Sometimes, for various reasons, NMClient throws an exception while starting some containers. For example: we do not config spa

spark git commit: SPARK-5747: Fix wordsplitting bugs in make-distribution.sh

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 947b8bd82 -> 26c816e73 SPARK-5747: Fix wordsplitting bugs in make-distribution.sh The `$MVN` command variable may contain spaces, so references to it must be wrapped in quotes. Author: David Y. Ross Closes #4540 from dyross/dyr-fix-make-dis
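A small demonstration of the word-splitting bug class this fixes (the path below is hypothetical): an unquoted `$MVN` splits on spaces, while the quoted form stays a single word.

```shell
MVN="/opt/apache maven/bin/mvn"   # value containing a space

set -- $MVN            # unquoted: word splitting applies
unquoted_words=$#

set -- "$MVN"          # quoted: the value stays one word
quoted_words=$#

echo "unquoted=$unquoted_words quoted=$quoted_words"
```

With the default IFS the unquoted expansion yields two words and the quoted one yields one, which is why make-distribution.sh must quote `"$MVN"` everywhere it is referenced.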

spark git commit: SPARK-5747: Fix wordsplitting bugs in make-distribution.sh

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 02d5b32bb -> 11a0d5b6d SPARK-5747: Fix wordsplitting bugs in make-distribution.sh The `$MVN` command variable may contain spaces, so references to it must be wrapped in quotes. Author: David Y. Ross Closes #4540 from dyross/dyr-fix-make

spark git commit: [SPARK-5780] [PySpark] Mute the logging during unit tests

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 11a0d5b6d -> bf0d15c52 [SPARK-5780] [PySpark] Mute the logging during unit tests There is a lot of logging coming from the driver and worker; it's noisy and scary, with many exceptions in it, leaving people confused about whether the tests are failing

spark git commit: [SPARK-5780] [PySpark] Mute the logging during unit tests

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 9c5454d06 -> c7bac577a [SPARK-5780] [PySpark] Mute the logging during unit tests There is a lot of logging coming from the driver and worker; it's noisy and scary, with many exceptions in it, leaving people confused about whether the tests are failing

spark git commit: [SPARK-5780] [PySpark] Mute the logging during unit tests

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 26c816e73 -> 0bf031582 [SPARK-5780] [PySpark] Mute the logging during unit tests There is a lot of logging coming from the driver and worker; it's noisy and scary, with many exceptions in it, leaving people confused about whether the tests are failing

spark git commit: Revert "[SPARK-5762] Fix shuffle write time for sort-based shuffle"

2015-02-12 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 d24971a62 -> 0ba065f0a Revert "[SPARK-5762] Fix shuffle write time for sort-based shuffle" This reverts commit 9c5454d06e56917521a15697c36f76a33a94dd1e. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip

spark git commit: [SPARK-5732][CORE]:Add an option to print the spark version in spark script.

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 1768bd514 -> c0ccd2564 [SPARK-5732][CORE]:Add an option to print the spark version in spark script. Naturally, we may need an option to print the Spark version in the spark script; it is a pretty common feature in command-line tools. ![9](https://cloud.

spark git commit: [SPARK-5732][CORE]:Add an option to print the spark version in spark script.

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 1255e83f8 -> 5c883df09 [SPARK-5732][CORE]:Add an option to print the spark version in spark script. Naturally, we may need an option to print the Spark version in the spark script; it is a pretty common feature in command-line tools. ![9](https://cl

spark git commit: [SPARK-5783] Better eventlog-parsing error messages

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master e1a1ff810 -> fc6d3e796 [SPARK-5783] Better eventlog-parsing error messages Author: Ryan Williams Closes #4573 from ryan-williams/history and squashes the following commits: a8647ec [Ryan Williams] fix test calls to .replay() 98aa3fe [Rya

spark git commit: [SPARK-5783] Better eventlog-parsing error messages

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 5e6394222 -> e5690a502 [SPARK-5783] Better eventlog-parsing error messages Author: Ryan Williams Closes #4573 from ryan-williams/history and squashes the following commits: a8647ec [Ryan Williams] fix test calls to .replay() 98aa3fe

spark git commit: [SPARK-5735] Replace uses of EasyMock with Mockito

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 e5690a502 -> cc9eec1a0 [SPARK-5735] Replace uses of EasyMock with Mockito This patch replaces all uses of EasyMock with Mockito. There are two motivations for this: 1. We should use a single mocking framework in our tests in order to

spark git commit: [SPARK-5735] Replace uses of EasyMock with Mockito

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master fc6d3e796 -> 077eec2d9 [SPARK-5735] Replace uses of EasyMock with Mockito This patch replaces all uses of EasyMock with Mockito. There are two motivations for this: 1. We should use a single mocking framework in our tests in order to kee

spark git commit: [HOTFIX] Fix build break in MesosSchedulerBackendSuite

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 378c7eb0d -> 5d3cc6b3d [HOTFIX] Fix build break in MesosSchedulerBackendSuite Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5d3cc6b3 Tree: http://git-wip-us.apache.or

spark git commit: [HOTFIX] Fix build break in MesosSchedulerBackendSuite

2015-02-13 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 ad731897b -> 41603717a [HOTFIX] Fix build break in MesosSchedulerBackendSuite Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/41603717 Tree: http://git-wip-us.apach

spark git commit: [SPARK-3340] Deprecate ADD_JARS and ADD_FILES

2015-02-16 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master b1bd1dd32 -> 16687651f [SPARK-3340] Deprecate ADD_JARS and ADD_FILES I created a patch that disables the environment variables. The Scala or Python shell then logs a warning message notifying the user about the deprecation with the following mes
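A sketch of the deprecation behavior described (illustrative only; the warning text and helper name are assumptions, and the snippet's actual message is elided): the old variable is still honored, but using it triggers a warning pointing at the replacement flag.

```scala
// Returns the jars that ADD_JARS would contribute, plus an optional
// deprecation warning to log. Hypothetical helper, not the shell's code.
def resolveShellJars(env: Map[String, String]): (Seq[String], Option[String]) =
  env.get("ADD_JARS") match {
    case Some(value) =>
      (value.split(",").toSeq, Some("Warning: ADD_JARS is deprecated, use --jars instead"))
    case None =>
      (Seq.empty, None)
  }
```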

spark git commit: [SPARK-3340] Deprecate ADD_JARS and ADD_FILES

2015-02-16 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 c2a9a6176 -> d8c70fb6d [SPARK-3340] Deprecate ADD_JARS and ADD_FILES I created a patch that disables the environment variables. The Scala or Python shell then logs a warning message notifying the user about the deprecation with the following

spark git commit: [SPARK-5849] Handle more types of invalid JSON requests in SubmitRestProtocolMessage.parseAction

2015-02-16 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 d8c70fb6d -> 385a339a2 [SPARK-5849] Handle more types of invalid JSON requests in SubmitRestProtocolMessage.parseAction This patch improves SubmitRestProtocol's handling of invalid JSON requests in cases where those requests were pars

spark git commit: [SPARK-5849] Handle more types of invalid JSON requests in SubmitRestProtocolMessage.parseAction

2015-02-16 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 16687651f -> 58a82a788 [SPARK-5849] Handle more types of invalid JSON requests in SubmitRestProtocolMessage.parseAction This patch improves SubmitRestProtocol's handling of invalid JSON requests in cases where those requests were parsable

spark git commit: SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master e945aa613 -> fb87f4492 SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2 Author: Jacek Lewandowski Closes #4653 from jacek-lewandowski/SPARK-5548-2-master and squashes the following commits: 0e199b6 [Jacek Lewandowski] SPARK-5548:

spark git commit: SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 092b45f69 -> fbcb949c5 SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2 Author: Jacek Lewandowski Closes #4653 from jacek-lewandowski/SPARK-5548-2-master and squashes the following commits: 0e199b6 [Jacek Lewandowski] SPARK-55

spark git commit: [SPARK-5816] Add huge compatibility warning in DriverWrapper

2015-02-19 Thread andrewor14
the very least add a huge warning where appropriate. Author: Andrew Or Closes #4687 from andrewor14/driver-wrapper-warning and squashes the following commits: 7989b56 [Andrew Or] Add huge compatibility warning (cherry picked from commit 38e624a732b18e01ad2e7a499ce0bb0d7acdcdf6) Signed-off

spark git commit: [SPARK-5816] Add huge compatibility warning in DriverWrapper

2015-02-19 Thread andrewor14
ery least add a huge warning where appropriate. Author: Andrew Or Closes #4687 from andrewor14/driver-wrapper-warning and squashes the following commits: 7989b56 [Andrew Or] Add huge compatibility warning Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-

spark git commit: [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 38e624a73 -> 90095bf3c [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file This PR adds a `finalize` method in DiskMapIterator to clean up the resources even if some exception happens during p
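An illustrative sketch of the pattern the PR describes (hypothetical class, not DiskMapIterator itself): an iterator that owns a temp file deletes it when exhausted, and overrides `finalize` as a safety net in case iteration stops early because of an exception.

```scala
import java.io.File

class TempFileBackedIterator(file: File, underlying: Iterator[Int]) extends Iterator[Int] {
  private var cleaned = false

  def hasNext: Boolean = {
    val more = underlying.hasNext
    if (!more) cleanup() // normal path: delete the file once exhausted
    more
  }

  def next(): Int = underlying.next()

  def cleanup(): Unit = if (!cleaned) {
    cleaned = true
    file.delete()
  }

  // Safety net: if the iterator is abandoned mid-stream, GC finalization
  // still deletes the temp file.
  override def finalize(): Unit =
    try cleanup() finally super.finalize()
}
```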

spark git commit: [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 f93d4d992 -> 25fae8e7e [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file This PR adds a `finalize` method in DiskMapIterator to clean up the resources even if some exception happens duri

spark git commit: [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.1 651ceaeb3 -> 36f3c499f [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file This PR adds a `finalize` method in DiskMapIterator to clean up the resources even if some exception happens duri

spark git commit: [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 f6ee80b18 -> 61bde0049 [SPARK-5423][Core] Cleanup resources in DiskMapIterator.finalize to ensure deleting the temp file This PR adds a `finalize` method in DiskMapIterator to clean up the resources even if some exception happens duri

spark git commit: [SPARK-5825] [Spark Submit] Remove the double checking instance name when stopping the service

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 25fae8e7e -> fe00eb66e [SPARK-5825] [Spark Submit] Remove the double checking instance name when stopping the service `spark-daemon.sh` will confirm the process id by fuzzy matching the class name while stopping the service, however,

spark git commit: [SPARK-5825] [Spark Submit] Remove the double checking instance name when stopping the service

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 61bde0049 -> 856fdcb65 [SPARK-5825] [Spark Submit] Remove the double checking instance name when stopping the service `spark-daemon.sh` will confirm the process id by fuzzy matching the class name while stopping the service, however,

spark git commit: [SPARK-5825] [Spark Submit] Remove the double checking instance name when stopping the service

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 90095bf3c -> 94cdb05ff [SPARK-5825] [Spark Submit] Remove the double checking instance name when stopping the service `spark-daemon.sh` will confirm the process id by fuzzy matching the class name while stopping the service, however, it w

[2/2] spark git commit: SPARK-4682 [CORE] Consolidate various 'Clock' classes

2015-02-19 Thread andrewor14
SPARK-4682 [CORE] Consolidate various 'Clock' classes Another one from JoshRosen's wish list. The first commit is much smaller and removes 2 of the 4 Clock classes. The second is much larger, necessary for consolidating the streaming one. I put together implementations in the way that seemed s
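The consolidation target can be sketched as a single trait with a system-backed implementation plus a manually advanced one for deterministic tests (a sketch of the idea; details may differ from the merged code):

```scala
trait Clock {
  def getTimeMillis(): Long
}

// Wall-clock implementation for production use.
class SystemClock extends Clock {
  def getTimeMillis(): Long = System.currentTimeMillis()
}

// Test implementation: time only moves when advanced explicitly.
class ManualClock(private var time: Long = 0L) extends Clock {
  def getTimeMillis(): Long = synchronized(time)
  def advance(ms: Long): Unit = synchronized { time += ms }
}
```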

[2/2] spark git commit: SPARK-4682 [CORE] Consolidate various 'Clock' classes

2015-02-19 Thread andrewor14
SPARK-4682 [CORE] Consolidate various 'Clock' classes Another one from JoshRosen's wish list. The first commit is much smaller and removes 2 of the 4 Clock classes. The second is much larger, necessary for consolidating the streaming one. I put together implementations in the way that seemed s

[1/2] spark git commit: SPARK-4682 [CORE] Consolidate various 'Clock' classes

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master ad6b169de -> 34b7c3538 http://git-wip-us.apache.org/repos/asf/spark/blob/34b7c353/streaming/src/test/scala/org/apache/spark/streaming/ReceivedBlockHandlerSuite.scala -- diff

[1/2] spark git commit: SPARK-4682 [CORE] Consolidate various 'Clock' classes

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 ff8976ec7 -> bd49e8b96 http://git-wip-us.apache.org/repos/asf/spark/blob/bd49e8b9/streaming/src/test/scala/org/apache/spark/streaming/ReceivedBlockHandlerSuite.scala --

spark git commit: SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory", ...) will not work

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 bd49e8b96 -> c5f3b9e02 SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory", ...) will not work I've updated documentation to reflect true behavior of this setting in client vs. cluster mode. Author: Ilya Ganel

spark git commit: SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory", ...) will not work

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 34b7c3538 -> 6bddc4035 SPARK-5570: No docs stating that `new SparkConf().set("spark.driver.memory", ...) will not work I've updated documentation to reflect true behavior of this setting in client vs. cluster mode. Author: Ilya Ganelin

spark git commit: [SPARK-4808] Removing minimum number of elements read before spill check

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 ba941ceb1 -> 0382dcc0a [SPARK-4808] Removing minimum number of elements read before spill check In the general case, Spillable's heuristic of checking for memory stress on every 32nd item after 1000 items are read is good enough. In gen

spark git commit: [SPARK-4808] Removing minimum number of elements read before spill check

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 0cfd2cebd -> 3be92cdac [SPARK-4808] Removing minimum number of elements read before spill check In the general case, Spillable's heuristic of checking for memory stress on every 32nd item after 1000 items are read is good enough. In general
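The heuristic being removed can be sketched as follows (constants and names are illustrative): before this change, a spill check only ran after 1000 elements had been read, and then only on every 32nd element.

```scala
// True when the pre-SPARK-4808 heuristic would have checked memory pressure.
def shouldCheckSpill(elementsRead: Long, trackThreshold: Long = 1000L): Boolean =
  elementsRead > trackThreshold && elementsRead % 32 == 0
```

Removing the minimum means memory pressure can be detected even for small but very large-element collections, which is the motivation described above.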

spark git commit: [SPARK-4808] Removing minimum number of elements read before spill check

2015-02-19 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 18fbed5b5 -> 5cea859fd [SPARK-4808] Removing minimum number of elements read before spill check In the general case, Spillable's heuristic of checking for memory stress on every 32nd item after 1000 items are read is good enough. In gen

spark git commit: SPARK-5841 [CORE] [HOTFIX 2] Memory leak in DiskBlockManager

2015-02-21 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master e15532471 -> d3cbd38c3 SPARK-5841 [CORE] [HOTFIX 2] Memory leak in DiskBlockManager Continue to see IllegalStateException in YARN cluster mode. Adding a simple workaround for now. Author: Nishkam Ravi Author: nishkamravi2 Author: nravi

spark git commit: SPARK-5841 [CORE] [HOTFIX 2] Memory leak in DiskBlockManager

2015-02-21 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 b9a6c5c84 -> 932338eda SPARK-5841 [CORE] [HOTFIX 2] Memory leak in DiskBlockManager Continue to see IllegalStateException in YARN cluster mode. Adding a simple workaround for now. Author: Nishkam Ravi Author: nishkamravi2 Author: nr

spark git commit: [SPARK-5937][YARN] Fix ClientSuite to set YARN mode, so that the correct class is used in t...

2015-02-21 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master d3cbd38c3 -> 7138816ab [SPARK-5937][YARN] Fix ClientSuite to set YARN mode, so that the correct class is used in t... ...ests. Without this SparkHadoopUtil is used by the Client instead of YarnSparkHadoopUtil. Author: Hari Shreedharan

spark git commit: [SPARK-5937][YARN] Fix ClientSuite to set YARN mode, so that the correct class is used in t...

2015-02-21 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 932338eda -> 76e3e6527 [SPARK-5937][YARN] Fix ClientSuite to set YARN mode, so that the correct class is used in t... ...ests. Without this SparkHadoopUtil is used by the Client instead of YarnSparkHadoopUtil. Author: Hari Shreedhar

spark git commit: Revert "[SPARK-4808] Removing minimum number of elements read before spill check"

2015-02-22 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 eed7389cf -> 4186dd3dd Revert "[SPARK-4808] Removing minimum number of elements read before spill check" This reverts commit 0382dcc0a94f8e619fd11ec2cc0b18459a690c2b. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit:

spark git commit: Revert "[SPARK-4808] Removing minimum number of elements read before spill check"

2015-02-24 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 71173de7a -> 2c9d9659d Revert "[SPARK-4808] Removing minimum number of elements read before spill check" This reverts commit 5cea859fd27dc6a216fa9d31d293c93407fbff01. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit:

spark git commit: [Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds

2015-02-24 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 201236628 -> 64d2c01ff [Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds Patch should be self-explanatory pwendell JoshRosen Author: Tathagata Das Closes #4741 from tdas/SPARK-5967 and squashes the following com

spark git commit: [Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds

2015-02-24 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 2c9d9659d -> 3ad00ee1c [Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds Patch should be self-explanatory pwendell JoshRosen Author: Tathagata Das Closes #4741 from tdas/SPARK-5967 and squashes the following

spark git commit: [Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds

2015-02-24 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 e46096b1e -> 28dd53b1b [Spark-5967] [UI] Correctly clean JobProgressListener.stageIdToActiveJobIds Patch should be self-explanatory pwendell JoshRosen Author: Tathagata Das Closes #4741 from tdas/SPARK-5967 and squashes the following

spark git commit: [SPARK-5965] Standalone Worker UI displays {{USER_JAR}}

2015-02-24 Thread andrewor14
Closes #4739 from andrewor14/user-jar-blocker and squashes the following commits: 23c4a9e [Andrew Or] Use right argument Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6d2caa57 Tree: http://git-wip-us.apache.org/repos/asf/spark/t


spark git commit: [SPARK-5965] Standalone Worker UI displays {{USER_JAR}}

2015-02-24 Thread andrewor14
Author: Andrew Or Closes #4739 from andrewor14/user-jar-blocker and squashes the following commits: 23c4a9e [Andrew Or] Use right argument (cherry picked from commit 6d2caa576fcdc5c848d1472b09c685b3871e220e) Signed-off-by: Andrew Or Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-

spark git commit: SPARK-4704 [CORE] SparkSubmitDriverBootstrap doesn't flush output

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 7fa960e65 -> cd5c8d7bb SPARK-4704 [CORE] SparkSubmitDriverBootstrap doesn't flush output Join on output threads to make sure any lingering output from process reaches stdout, stderr before exiting CC andrewor14 since I believe he

spark git commit: SPARK-4704 [CORE] SparkSubmitDriverBootstrap doesn't flush output

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 cc7313d09 -> 602d5c1fc SPARK-4704 [CORE] SparkSubmitDriverBootstrap doesn't flush output Join on output threads to make sure any lingering output from process reaches stdout, stderr before exiting CC andrewor14 since I be
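The idea behind this fix — join the stdout/stderr redirector threads so any output still buffered in the pipes is fully drained before the parent exits — can be sketched in Python. This is a hypothetical helper for illustration, not the actual SparkSubmitDriverBootstrap code:

```python
import subprocess
import sys
import threading

def run_and_drain(cmd):
    """Run a child process, mirroring its output, and join the pump
    threads before returning so no lingering output is lost when the
    parent exits (the SPARK-4704 symptom)."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)

    def pump(src, dst):
        # Copy the child's output stream to ours, line by line.
        for line in src:
            dst.write(line.decode())

    t_out = threading.Thread(target=pump, args=(proc.stdout, sys.stdout))
    t_err = threading.Thread(target=pump, args=(proc.stderr, sys.stderr))
    t_out.start()
    t_err.start()

    code = proc.wait()
    # The crucial step: join the pump threads *before* exiting,
    # otherwise output still buffered in the pipes may never appear.
    t_out.join()
    t_err.join()
    return code
```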

spark git commit: Modify default value description for spark.scheduler.minRegisteredResourcesRatio on docs.

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master cd5c8d7bb -> 10094a523 Modify default value description for spark.scheduler.minRegisteredResourcesRatio on docs. The configuration is not supported in mesos mode now. See https://github.com/apache/spark/pull/1462 Author: Li Zhihui Close

spark git commit: Modify default value description for spark.scheduler.minRegisteredResourcesRatio on docs.

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 602d5c1fc -> 94faf4c49 Modify default value description for spark.scheduler.minRegisteredResourcesRatio on docs. The configuration is not supported in mesos mode now. See https://github.com/apache/spark/pull/1462 Author: Li Zhihui C

spark git commit: Modify default value description for spark.scheduler.minRegisteredResourcesRatio on docs.

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 5d309ad6c -> 62652dc5b Modify default value description for spark.scheduler.minRegisteredResourcesRatio on docs. The configuration is not supported in mesos mode now. See https://github.com/apache/spark/pull/1462 Author: Li Zhihui C

spark git commit: [SPARK-3562]Periodic cleanup event logs

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 10094a523 -> 8942b522d [SPARK-3562]Periodic cleanup event logs Author: xukun 00228947 Closes #4214 from viper-kun/cleaneventlog and squashes the following commits: 7a5b9c5 [xukun 00228947] fix issue 31674ee [xukun 00228947] fix issue 6e3
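Periodic cleanup of aged log files generally amounts to a scheduled scan that deletes entries older than a time-to-live. A minimal sketch of that scan, with illustrative names and parameters rather than the history server's actual implementation:

```python
import os
import time

def clean_event_logs(log_dir, max_age_seconds):
    """Delete event-log files whose modification time is older than
    the TTL and return the paths removed. A scheduler would invoke
    this periodically, e.g. once per configured cleaner interval."""
    now = time.time()
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if os.path.isfile(path) and now - os.path.getmtime(path) > max_age_seconds:
            os.remove(path)
            removed.append(path)
    return removed
```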

spark git commit: [SPARK-6027][SPARK-5546] Fixed --jar and --packages not working for KafkaUtils and improved error message

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 8942b522d -> aa63f633d [SPARK-6027][SPARK-5546] Fixed --jar and --packages not working for KafkaUtils and improved error message The problem with SPARK-6027 in short is that JARs like the kafka-assembly.jar does not work in python as the

spark git commit: [SPARK-6027][SPARK-5546] Fixed --jar and --packages not working for KafkaUtils and improved error message

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 62652dc5b -> 731a997db [SPARK-6027][SPARK-5546] Fixed --jar and --packages not working for KafkaUtils and improved error message The problem with SPARK-6027 in short is that JARs like the kafka-assembly.jar does not work in python as

spark git commit: [SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master aa63f633d -> 5f3238b3b [SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM Author: Cheolsoo Park Closes #4773 from piaozhexiu/SPARK-6018 and squashes the following commits: 2a919d5 [Cheolsoo Park] Rename e with cau

spark git commit: [SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 731a997db -> fe7967483 [SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM Author: Cheolsoo Park Closes #4773 from piaozhexiu/SPARK-6018 and squashes the following commits: 2a919d5 [Cheolsoo Park] Rename e with

spark git commit: [SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 94faf4c49 -> e21475d16 [SPARK-6018] [YARN] NoSuchMethodError in Spark app is swallowed by YARN AM Author: Cheolsoo Park Closes #4773 from piaozhexiu/SPARK-6018 and squashes the following commits: 2a919d5 [Cheolsoo Park] Rename e with
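The bug class here is catching too narrow an exception type: in the JVM, `NoSuchMethodError` extends `Error`, not `Exception`, so a `catch (Exception e)` in the application master never sees it and the failure is swallowed. Python has an analogous split between `Exception` and `BaseException`, which lets us sketch the same pattern (illustrative code, not the YARN AM's):

```python
def run_user_code(fn):
    """Run user code and report its outcome. Catching only `Exception`
    would miss non-Exception throwables -- in the JVM, Errors such as
    NoSuchMethodError; in Python, BaseException subclasses such as
    SystemExit. Catching the broader base type (analogous to catching
    Throwable) ensures the failure is reported, not swallowed."""
    try:
        fn()
        return "SUCCEEDED"
    except BaseException as cause:  # broad on purpose
        return "FAILED: " + type(cause).__name__

# A failure that a plain `except Exception` would not catch:
def bad():
    raise SystemExit(1)
```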

spark git commit: SPARK-4300 [CORE] Race condition during SparkWorker shutdown

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 2d83442f2 -> 64e0cbc73 SPARK-4300 [CORE] Race condition during SparkWorker shutdown Close appender saving stdout/stderr before destroying process to avoid exception on reading closed input stream. (This also removes a redundant `waitFo

spark git commit: SPARK-4300 [CORE] Race condition during SparkWorker shutdown

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/master 5f3238b3b -> 3fb53c029 SPARK-4300 [CORE] Race condition during SparkWorker shutdown Close appender saving stdout/stderr before destroying process to avoid exception on reading closed input stream. (This also removes a redundant `waitFor()`
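The ordering the fix establishes — finish the appender reading stdout/stderr before destroying the process, so the reader never finds its stream closed underneath it mid-read — can be sketched as follows. `FileAppender` here is a minimal stand-in for Spark's class of the same name, and the sketch assumes the child has finished writing when shutdown begins:

```python
import subprocess
import threading

class FileAppender:
    """Stand-in for Spark's FileAppender: a thread copying a child
    process's output stream to a file."""
    def __init__(self, stream, path):
        self._stream = stream
        self._file = open(path, "wb")
        self._thread = threading.Thread(target=self._pump)
        self._thread.start()

    def _pump(self):
        for chunk in self._stream:
            self._file.write(chunk)
        self._file.close()

    def await_termination(self):
        self._thread.join()

def shutdown(proc, appenders):
    # SPARK-4300 ordering: close the appenders saving stdout/stderr
    # *before* destroying the process, so no appender thread reads
    # from a stream that was closed underneath it.
    for a in appenders:
        a.await_termination()
    if proc.poll() is None:
        proc.terminate()
        proc.wait()
```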

spark git commit: Add a note for context termination for History server on Yarn

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.3 fe7967483 -> 297c3ef82 Add a note for context termination for History server on Yarn The history server on Yarn only shows completed jobs. This adds a note concerning the needed explicit context termination at the end of a spark job w

spark git commit: Add a note for context termination for History server on Yarn

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.2 64e0cbc73 -> 58b3aa692 Add a note for context termination for History server on Yarn The history server on Yarn only shows completed jobs. This adds a note concerning the needed explicit context termination at the end of a spark job w

spark git commit: Add a note for context termination for History server on Yarn

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.0 f74bccbe3 -> 14e042b65 Add a note for context termination for History server on Yarn The history server on Yarn only shows completed jobs. This adds a note concerning the needed explicit context termination at the end of a spark job w

spark git commit: Add a note for context termination for History server on Yarn

2015-02-26 Thread andrewor14
Repository: spark Updated Branches: refs/heads/branch-1.1 36f3c499f -> 2785210fa Add a note for context termination for History server on Yarn The history server on Yarn only shows completed jobs. This adds a note concerning the needed explicit context termination at the end of a spark job w
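The note being added boils down to: the history server on YARN lists only applications whose context was explicitly stopped, so a job should always end with `sc.stop()`, ideally in a `finally` block so it runs even if the job fails. A sketch of that pattern using a stub in place of the real `pyspark.SparkContext`:

```python
class SparkContextStub:
    """Stand-in for SparkContext, used only to illustrate the
    shutdown pattern; the real API is pyspark.SparkContext."""
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True

def run_job(sc):
    pass  # the application's actual work would go here

# Without an explicit stop, the YARN history server treats the
# application as in-progress and does not list it. Stopping in a
# finally block guarantees the context terminates even on failure:
sc = SparkContextStub()
try:
    run_job(sc)
finally:
    sc.stop()
```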
