Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2090#issuecomment-53143937
QA results for PR 2090:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds no public classes
For more information see test
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2091#issuecomment-53144107
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19094/consoleFull)
for PR 2091 at commit
Github user uncleGen commented on the pull request:
https://github.com/apache/spark/pull/2076#issuecomment-53144395
@pwendell Okay! I will add them as soon as possible and pay more attention.
---
If your project is set up for it, you can reply to this email and have your
reply
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2083#issuecomment-53144520
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19095/consoleFull)
for PR 2083 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1959#issuecomment-53144788
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19096/consoleFull)
for PR 1959 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2083#issuecomment-53145524
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19095/consoleFull)
for PR 2083 at commit
Github user davies commented on the pull request:
https://github.com/apache/spark/pull/2093#issuecomment-53145616
The doc tests should cover all the code paths; do we still need more
tests?
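The doctest-as-test pattern under discussion can be sketched as follows. This is a minimal standalone example of the idea (a hypothetical helper function, not the actual PySpark code under review): the examples in the docstring double as executable tests.

```python
# Hedged sketch: doctests exercising a function's code paths directly.
# `double_values` is a made-up illustration, not a PySpark API.
def double_values(pairs):
    """Double the value of each (key, value) pair.

    >>> double_values([(1, 2), (3, 4)])
    [(1, 4), (3, 8)]
    >>> double_values([])
    []
    """
    return [(k, 2 * v) for k, v in pairs]

if __name__ == "__main__":
    import doctest
    doctest.testmod()  # runs the docstring examples as tests
```

Running the module executes both examples, covering the non-empty and empty input paths without a separate test file.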
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2093#issuecomment-53145628
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19097/consoleFull)
for PR 2093 at commit
Github user davies commented on the pull request:
https://github.com/apache/spark/pull/2092#issuecomment-53145677
I think doc tests should be enough.
Github user davies commented on the pull request:
https://github.com/apache/spark/pull/2091#issuecomment-53145718
Jenkins, retest this please.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2091#issuecomment-53145884
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19098/consoleFull)
for PR 2091 at commit
GitHub user tnachen opened a pull request:
https://github.com/apache/spark/pull/2103
[SPARK-2608] Fix executor backend launch command in Mesos mode
Based on @scwf's patch, rebased on master, with a fix to actually get it
to work.
It failed to run with a single Mesos
Github user tnachen commented on the pull request:
https://github.com/apache/spark/pull/2103#issuecomment-53146097
@pwendell take a look at the new fix
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2103#issuecomment-53146211
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19099/consoleFull)
for PR 2103 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1959#issuecomment-53146504
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19096/consoleFull)
for PR 1959 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2093#issuecomment-53146874
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19097/consoleFull)
for PR 2093 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2091#issuecomment-53147036
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19098/consoleFull)
for PR 2091 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2019#issuecomment-53147147
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19100/consoleFull)
for PR 2019 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2103#issuecomment-53147354
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19099/consoleFull)
for PR 2103 at commit
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/2102#discussion_r16629723
--- Diff: docs/building-with-maven.md ---
@@ -156,4 +156,12 @@ then ship it over to the cluster. We are investigating
the exact cause for this.
The
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1886#issuecomment-53147680
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19101/consoleFull)
for PR 1886 at commit
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/1886#issuecomment-53147717
I rebased #1994 onto this PR for now, and renamed the title of this PR to
a proper one.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2014#issuecomment-53147889
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19102/consoleFull)
for PR 2014 at commit
GitHub user sarutak opened a pull request:
https://github.com/apache/spark/pull/2104
[SPARK-3192] Some scripts use 2-space indentation while other scripts use
4-space indentation.
You can merge this pull request into a Git repository by running:
$ git pull
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1726#issuecomment-53148099
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19105/consoleFull)
for PR 1726 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2014#issuecomment-53148101
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19103/consoleFull)
for PR 2014 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2104#issuecomment-53148100
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19104/consoleFull)
for PR 2104 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2019#issuecomment-53148302
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19100/consoleFull)
for PR 2019 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2014#issuecomment-53148996
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19102/consoleFull)
for PR 2014 at commit
Github user yu-iskw commented on the pull request:
https://github.com/apache/spark/pull/1964#issuecomment-53149119
BTW, I checked the performance of Math.abs() and breeze.numerics.abs.
It seems that Math.abs() performs better than breeze.numerics.abs;
that is, A performs better than B.
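As a rough illustration of how such a comparison can be made, here is a microbenchmark sketch in Python rather than on the JVM (the original compared `Math.abs()` against `breeze.numerics.abs` in Scala; the functions and timings below are stand-ins, not the measurement referred to above):

```python
import timeit

# Hedged sketch: time two equivalent abs implementations over many calls.
# Python's built-in abs vs math.fabs stand in for the JVM comparison above.
n = 100_000
t_builtin = timeit.timeit("abs(-1.5)", number=n)
t_fabs = timeit.timeit("fabs(-1.5)", setup="from math import fabs", number=n)
print(t_builtin, t_fabs)
```

The absolute numbers are machine-dependent; what matters is running both candidates under identical call counts, as the comment above describes.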
Github user witgo commented on the pull request:
https://github.com/apache/spark/pull/2056#issuecomment-53149146
In `removeShuffleBlocks`
```
for (mapId <- state.completedMapTasks; reduceId <- 0 until state.numBuckets) {
  val blockId = new
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2104#issuecomment-53149178
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19104/consoleFull)
for PR 2104 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1886#issuecomment-53149212
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19101/consoleFull)
for PR 1886 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2014#issuecomment-53149241
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19103/consoleFull)
for PR 2014 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1726#issuecomment-53149251
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19105/consoleFull)
for PR 1726 at commit
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16630100
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -118,14 +118,33 @@ abstract class Connection(val channel: SocketChannel,
val
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16630105
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16630107
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/2019#issuecomment-53149673
Handling TCP/IP events is by definition async, particularly when state
changes can happen orthogonally to the state held in Java variables,
so there is only so much you can
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2056#issuecomment-53149761
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19106/consoleFull)
for PR 2056 at commit
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53149962
Regarding logging -
host:port is insufficient to debug (and as I mentioned above, you will
get an NPE for the log messages in this PR depending on the key's state) -
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53150028
I removed the try / catch because selector.select never throws
CancelledKeyException.
If it can cause a regression, could you show me how CancelledKeyException
is thrown from
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53150053
If you do want to get the host:port in case logs are noisy (happens!),
ensure you retrieve it with robust code.
This PR can cause NPEs
On 23-Aug-2014 4:42 pm,
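The "retrieve it with robust code" advice can be sketched as follows, translated to Python for illustration (the original concerns `key.channel.asInstanceOf[SocketChannel].socket.getRemoteSocketAddress` in Scala; `peer_str` is a hypothetical helper, not Spark code): never assume the peer address is still retrievable once the connection may be closed or cancelled.

```python
import socket

def peer_str(sock):
    """Return 'host:port' for a connected socket, or a placeholder when the
    peer address can no longer be retrieved (closed / unconnected socket)."""
    try:
        host, port = sock.getpeername()[:2]
        return "%s:%d" % (host, port)
    except OSError:
        return "<unknown>"

s = socket.socket()   # unconnected: getpeername() raises OSError
print(peer_str(s))    # -> <unknown>, instead of crashing the log statement
s.close()
```

The point is that the address lookup is wrapped so a log message degrades to a placeholder rather than throwing, which is the failure mode being debated above.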
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53150077
It can and it does, if the key was cancelled
On 23-Aug-2014 4:46 pm, Kousuke Saruta notificati...@github.com wrote:
I removed try / catch because selector.select
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2056#issuecomment-53150684
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19107/consoleFull)
for PR 2056 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2056#issuecomment-53150822
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19106/consoleFull)
for PR 2056 at commit
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2092#issuecomment-53151265
fair enough
+1 lgtm
Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/2097#issuecomment-53151295
That's exactly the motivation for which I proposed this PR.
ContextCleaner sometimes mistakenly cleans my broadcast files in my test
cases... about one month
Github user mattf commented on a diff in the pull request:
https://github.com/apache/spark/pull/2094#discussion_r16630356
--- Diff: python/pyspark/rdd.py ---
@@ -810,23 +810,45 @@ def func(iterator):
return self.mapPartitions(func).fold(zeroValue, combOp)
Github user mattf commented on a diff in the pull request:
https://github.com/apache/spark/pull/2094#discussion_r16630361
--- Diff: python/pyspark/rdd.py ---
@@ -810,23 +810,45 @@ def func(iterator):
return self.mapPartitions(func).fold(zeroValue, combOp)
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2094#issuecomment-53151507
agreed re doctest. i forgot it was in use.
Github user scwf commented on the pull request:
https://github.com/apache/spark/pull/2103#issuecomment-53151568
What's the diff with my PR https://github.com/apache/spark/pull/1986?
Github user mattf commented on a diff in the pull request:
https://github.com/apache/spark/pull/2091#discussion_r16630390
--- Diff: python/pyspark/rdd.py ---
@@ -856,6 +856,104 @@ def redFunc(left_counter, right_counter):
return self.mapPartitions(lambda i:
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2093#issuecomment-53151737
> The doc tests should cover all the code paths, do we still need more tests?

it's worth including a lookup for 1000 or 1234, which won't be found
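The suggestion above, sketched in plain Python: when doctesting a lookup-style API, include a key that is absent (e.g. 1234) so the empty-result path is exercised too. `lookup` here is a made-up stand-in over a list of pairs, not the actual PySpark `RDD.lookup` implementation.

```python
def lookup(pairs, key):
    """Return all values for `key` in a list of (key, value) pairs.

    >>> pairs = [(0, "a"), (1, "b"), (1, "c")]
    >>> lookup(pairs, 1)
    ['b', 'c']
    >>> lookup(pairs, 1234)   # missing key: should come back empty
    []
    """
    return [v for k, v in pairs if k == key]
```

Without the missing-key example, a doctest suite can pass while never executing the not-found branch.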
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2056#issuecomment-53151891
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19107/consoleFull)
for PR 2056 at commit
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53151994
Thanks @mridulm .
The answer I want is why selector.select throws CancelledKeyException even
though the JavaDoc doesn't say select() throws that exception.
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53153441
Why it happens is anyone's guess. I have seen it since fairly early on,
when I started using NIO in 1.4.x, and it continues to this day. If you
search online,
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53154553
I don't mind handling the NPE, but I wonder why / where / how the NPE is
thrown.
At least, the JavaDoc doesn't say SelectionKey#channel or SocketChannel#socket
return null.
And
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53154886
I am not on a Mac, and have seen this on Solaris and Linux (not in the
context of Spark), so the issue is not Mac-specific unfortunately (though
it might also occur on a Mac!).
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/2102#issuecomment-53155103
Hey @loachli - thanks for looking into this. I don't think we can advise
users to disable security settings for their maven build. Does your proxy
support HTTPS?
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53155129
Looking at the code:
```
val remoteAddress =
  key.channel.asInstanceOf[SocketChannel].socket.getRemoteSocketAddress
```
I agree, key.channel is document
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/2078#issuecomment-53155180
/CC @JoshRosen since you looked at ConnectionManager and Connection
recently.
Github user sarutak commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16631095
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/2011#issuecomment-53155439
This seems like a reasonable change. One issue is that it does run the `java`
binary an extra time on task launch, but that seems fairly cheap when only
asking for the
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/2011
Github user tnachen commented on the pull request:
https://github.com/apache/spark/pull/2103#issuecomment-53156348
My last commit is the diff in this PR
Github user tnachen commented on the pull request:
https://github.com/apache/spark/pull/2103#issuecomment-53156411
Btw, this is still not ideal IMO, since it computes the class path on the
scheduler side and assumes all slave executors have the same setup after unzipping.
GitHub user viirya opened a pull request:
https://github.com/apache/spark/pull/2105
[Minor] fix typo
Fix a typo in comment.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/viirya/spark-1 fix_typo
Alternatively you can review and
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2105#issuecomment-53156760
Can one of the admins verify this patch?
Github user sarutak commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16631373
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -118,14 +118,33 @@ abstract class Connection(val channel: SocketChannel,
val
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2014#issuecomment-53157117
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19108/consoleFull)
for PR 2014 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2014#issuecomment-53158753
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19108/consoleFull)
for PR 2014 at commit
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2105#issuecomment-53158982
I merged this; thanks!
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/2105
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16631699
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -118,14 +118,33 @@ abstract class Connection(val channel: SocketChannel,
val
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16631703
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/2076#issuecomment-53159750
@andrewor14 - can you take a look at this patch? IIRC you worked on this
code most recently.
Github user davies commented on the pull request:
https://github.com/apache/spark/pull/2093#issuecomment-53159927
@mattf I have added a test case for it, thanks.
I have done a lot of refactoring in this PR; please re-review it, thanks.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2093#issuecomment-53159943
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19109/consoleFull)
for PR 2093 at commit
Github user sarutak commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16631901
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user davies commented on a diff in the pull request:
https://github.com/apache/spark/pull/2094#discussion_r16631909
--- Diff: python/pyspark/rdd.py ---
@@ -810,23 +810,45 @@ def func(iterator):
return self.mapPartitions(func).fold(zeroValue, combOp)
Github user sarutak commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16631910
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -118,14 +118,33 @@ abstract class Connection(val channel: SocketChannel,
val
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2094#issuecomment-53160372
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19110/consoleFull)
for PR 2094 at commit
Github user davies commented on a diff in the pull request:
https://github.com/apache/spark/pull/2094#discussion_r16631953
--- Diff: python/pyspark/rdd.py ---
@@ -810,23 +810,45 @@ def func(iterator):
return self.mapPartitions(func).fold(zeroValue, combOp)
Github user davies commented on a diff in the pull request:
https://github.com/apache/spark/pull/2094#discussion_r16631962
--- Diff: python/pyspark/rdd.py ---
@@ -810,23 +810,45 @@ def func(iterator):
return self.mapPartitions(func).fold(zeroValue, combOp)
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16631998
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16632025
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -118,14 +118,33 @@ abstract class Connection(val channel: SocketChannel,
val
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2093#issuecomment-53161474
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19109/consoleFull)
for PR 2093 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2094#issuecomment-53161691
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19110/consoleFull)
for PR 2094 at commit
Github user sarutak commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16632233
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user sarutak commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16632326
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -118,14 +118,33 @@ abstract class Connection(val channel: SocketChannel,
val
Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/2019#discussion_r16632362
--- Diff: core/src/main/scala/org/apache/spark/network/Connection.scala ---
@@ -263,14 +282,20 @@ class SendingConnection(val address:
InetSocketAddress,
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1935#issuecomment-53163399
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19111/consoleFull)
for PR 1935 at commit
Github user markhamstra commented on the pull request:
https://github.com/apache/spark/pull/1360#issuecomment-53163415
I'm not sure I'm following, @mridulm. The problem is not one of removing
Executors, but rather of removing Applications that could and should still be
left running
Github user marmbrus commented on the pull request:
https://github.com/apache/spark/pull/2074#issuecomment-53163432
Thanks! Merged to master and 1.1.
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/2074
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/1360#issuecomment-53163562
@markhamstra In our cluster, this usually happens due to one or more
executors being in a bad state: either insufficient disk to finish a
task, or it is in
Github user markhamstra commented on the pull request:
https://github.com/apache/spark/pull/1360#issuecomment-53164183
@mridulm Is this blacklisting behavior a customization that you have made
to Spark? If not, could you point me to where and how it is implemented?
What you
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/1360#issuecomment-53164487
Take a look at 'spark.scheduler.executorTaskBlacklistTime' in
TaskSetManager.
Since I run mostly in yarn-cluster mode, and there is only a single
application there; I
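For reference, a setting like the one named above is a plain key-value entry in Spark's configuration. A minimal sketch of enabling it (the value shown is an arbitrary illustrative number, not a recommendation, and the units are presumably milliseconds):

```
# spark-defaults.conf sketch; 10000 is an arbitrary example value
spark.scheduler.executorTaskBlacklistTime   10000
```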
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1935#issuecomment-53165766
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19111/consoleFull)
for PR 1935 at commit