Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18901
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80562/
Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18901
**[Test build #80562 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80562/testReport)**
for PR 18901 at commit
Github user tgravescs commented on the issue:
https://github.com/apache/spark/pull/18901
If this is a common problem for your users, why not just install the HBase jars on the launcher box?
---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well.
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18900
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80558/
Test FAILed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18900
Merged build finished. Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18810
**[Test build #80564 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80564/testReport)**
for PR 18810 at commit
Github user debugger87 commented on the issue:
https://github.com/apache/spark/pull/18900
`createTime` is set by HiveMetaStore#initializeAddedPartition
```java
private void initializeAddedPartition(Table tbl, PartitionIterator part,
    boolean madeDir) throws MetaException {
```
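For context, a minimal Scala sketch of the assumed behavior (the `Partition` type here is a hypothetical stand-in, not Hive's actual metastore API): when a partition is added without an explicit `createTime`, the metastore stamps it with the current epoch seconds.

```scala
// Toy model of the assumed createTime behavior; not Hive's real classes.
case class Partition(var createTime: Int = 0)

object MetastoreSketch {
  // Mirrors the assumed logic: only stamp createTime when it was never set.
  def initializeAddedPartition(part: Partition): Partition = {
    if (part.createTime == 0) {
      part.createTime = (System.currentTimeMillis() / 1000L).toInt
    }
    part
  }
}
```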
Github user debugger87 closed the pull request at:
https://github.com/apache/spark/pull/18900
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18901
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80559/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18901
Build finished. Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18901
**[Test build #80559 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80559/testReport)**
for PR 18901 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18810
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80563/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18810
Merged build finished. Test PASSed.
---
Github user joseph-torres commented on a diff in the pull request:
https://github.com/apache/spark/pull/18925#discussion_r132795536
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala
---
@@ -779,10 +780,16 @@ case object
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18700
**[Test build #80553 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80553/testReport)**
for PR 18700 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18700
Merged build finished. Test FAILed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18700
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80553/
Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18927
**[Test build #80550 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80550/testReport)**
for PR 18927 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18927
Merged build finished. Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18519
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80552/
Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18928
**[Test build #80557 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80557/testReport)**
for PR 18928 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18519
Merged build finished. Test PASSed.
---
Github user marmbrus commented on a diff in the pull request:
https://github.com/apache/spark/pull/18923#discussion_r132801760
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/console.scala
---
@@ -49,7 +49,7 @@ class ConsoleSink(options: Map[String,
Github user caneGuy commented on the issue:
https://github.com/apache/spark/pull/18901
But I think this case should be fixed, since many users of our internal branch have suffered from this problem.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18901
**[Test build #80562 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80562/testReport)**
for PR 18901 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18810
**[Test build #80563 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80563/testReport)**
for PR 18810 at commit
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/18915
ok to test
---
Github user wangyum commented on the issue:
https://github.com/apache/spark/pull/18853
Thanks @maropu. There are some problems:
```sql
spark-sql> select "20" > "100";
true
spark-sql>
```
So [`tmap.tkey <
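The underlying issue is plain lexicographic string comparison, which any Scala REPL reproduces without Spark; casting to a numeric type (an illustrative fix, not necessarily what this PR does) gives the intuitive ordering:

```scala
object StringCompareDemo {
  // String comparison is lexicographic: "2" sorts after "1", so "20" > "100".
  val asStrings: Boolean = "20" > "100"          // true, matching the SQL above
  // Comparing numerically gives the expected answer.
  val asInts: Boolean = "20".toInt > "100".toInt // false
}
```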
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18915
**[Test build #80567 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80567/testReport)**
for PR 18915 at commit
Github user gatorsmile commented on the issue:
https://github.com/apache/spark/pull/18907
```
[error] /home/jenkins/workspace/SparkPullRequestBuilder/sql/hive/target/java/org/apache/spark/sql/hive/FindHiveTable.java:3: error: reference not found
[error] * Replaces {@link
```
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18810
Merged build finished. Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18810
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80564/
Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18810
**[Test build #80564 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80564/testReport)**
for PR 18810 at commit
Github user heary-cao commented on the issue:
https://github.com/apache/spark/pull/18916
We can get the descriptions of these configuration parameters directly from the code rather than only from the documents, so it's always good to add these descriptions to the code.
---
Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/18927
Merged to master.
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18927
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18893
---
GitHub user pjfanning opened a pull request:
https://github.com/apache/spark/pull/18921
[SPARK-21709][Build] sbt 0.13.16 and some plugin updates
## What changes were proposed in this pull request?
Update sbt version to 0.13.16. I think this is a useful stepping stone to
GitHub user neoremind opened a pull request:
https://github.com/apache/spark/pull/18922
[SPARK-21701][CORE] Enable RPC client to use SO_RCVBUF, SO_SNDBUF and
SO_BACKLOG in SparkConf
## What changes were proposed in this pull request?
1. TCP parameters like SO_RCVBUF,
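For illustration, here is how those three options map onto plain `java.net` sockets; Spark's RPC layer is actually Netty-based, so the PR presumably wires them through Netty channel options instead. This sketch only shows the TCP knobs themselves:

```scala
import java.net.{ServerSocket, Socket}

object TcpOptionsSketch {
  // Returns a client socket with buffer-size hints applied and a server
  // socket whose accept queue is capped at `backlog`.
  def configure(rcvBuf: Int, sndBuf: Int, backlog: Int): (Socket, ServerSocket) = {
    val client = new Socket()           // unconnected; set options before connecting
    client.setReceiveBufferSize(rcvBuf) // SO_RCVBUF hint (the OS may round it)
    client.setSendBufferSize(sndBuf)    // SO_SNDBUF hint
    val server = new ServerSocket(0, backlog) // port 0 = ephemeral; backlog caps the accept queue
    (client, server)
  }
}
```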
Github user tejasapatil commented on the issue:
https://github.com/apache/spark/pull/18843
jenkins test this please
---
Github user jiangxb1987 commented on a diff in the pull request:
https://github.com/apache/spark/pull/18913#discussion_r132706353
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1792,6 +1796,9 @@ class SparkContext(config: SparkConf) extends Logging
{
Github user hvanhovell commented on the issue:
https://github.com/apache/spark/pull/18907
I think something is up with Jenkins. @shaneknapp could you take a look?
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18924
**[Test build #80524 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80524/testReport)**
for PR 18924 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18914
**[Test build #80525 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80525/testReport)**
for PR 18914 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18895
**[Test build #80530 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80530/testReport)**
for PR 18895 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18855
**[Test build #80534 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80534/testReport)**
for PR 18855 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18843
**[Test build #80536 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80536/testReport)**
for PR 18843 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18916
Can one of the admins verify this patch?
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18892
**[Test build #80531 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80531/testReport)**
for PR 18892 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18907
**[Test build #80526 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80526/testReport)**
for PR 18907 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18899
**[Test build #80529 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80529/testReport)**
for PR 18899 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18756
**[Test build #80538 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80538/testReport)**
for PR 18756 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18900
**[Test build #80528 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80528/testReport)**
for PR 18900 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18875
**[Test build #80532 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80532/testReport)**
for PR 18875 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18872
**[Test build #80533 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80533/testReport)**
for PR 18872 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18904
**[Test build #80527 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80527/testReport)**
for PR 18904 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18555
**[Test build #80540 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80540/testReport)**
for PR 18555 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18724
**[Test build #80539 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80539/testReport)**
for PR 18724 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18810
**[Test build #80537 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80537/testReport)**
for PR 18810 at commit
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18846
**[Test build #80535 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80535/testReport)**
for PR 18846 at commit
Github user sethah commented on the issue:
https://github.com/apache/spark/pull/18899
I think there _is_ new functionality: a new method whose behavior needs to be defined. As one specific example, we need a test like:
```scala
test("toSparseWithSize") {
  val dv
```
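A self-contained sketch of the kind of property such a test might assert, using a toy dense-to-sparse conversion rather than MLlib's actual `Vector` API (the names and semantics here are assumptions):

```scala
object VectorSketch {
  // Toy dense->sparse conversion: keep (index, value) pairs for non-zero entries.
  def toSparse(dense: Array[Double]): Seq[(Int, Double)] =
    dense.toSeq.zipWithIndex.collect { case (v, i) if v != 0.0 => (i, v) }
}
```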
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18907
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80526/
Test FAILed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18907
**[Test build #80526 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80526/testReport)**
for PR 18907 at commit
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18907
Merged build finished. Test FAILed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18924
Merged build finished. Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18907
**[Test build #80543 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80543/testReport)**
for PR 18907 at commit
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/18849#discussion_r132751607
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -1175,6 +1205,27 @@ private[spark] class
Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/18914#discussion_r132754247
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/JoinSuite.scala ---
@@ -141,6 +141,7 @@ class JoinSuite extends QueryTest with SharedSQLContext
{
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18921
**[Test build #3888 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3888/testReport)**
for PR 18921 at commit
GitHub user maasg opened a pull request:
https://github.com/apache/spark/pull/18923
[SPARK-21710][SS] Fix OOM on ConsoleSink with large inputs
## What changes were proposed in this pull request?
Replace a full `collect` with a `take` using the expected number of
elements
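The idea, sketched with a plain Scala iterator instead of Spark's actual Dataset API: `take(n)` materializes only the rows that will be printed, while a full `collect` would pull the entire batch into driver memory.

```scala
object TakeVsCollectSketch {
  // A lazily evaluated, effectively unbounded source standing in for a large
  // streaming batch (hypothetical; ConsoleSink really holds a DataFrame).
  def rows: Iterator[Long] = Iterator.iterate(0L)(_ + 1)

  // Console-style output only ever needs the first numRows elements.
  def firstRows(numRows: Int): Seq[Long] = rows.take(numRows).toSeq
}
```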
Github user ArtRand commented on the issue:
https://github.com/apache/spark/pull/18519
@vanzin Fixed this up. Please have a look. Thanks.
---
Github user tejasapatil commented on a diff in the pull request:
https://github.com/apache/spark/pull/16985#discussion_r132715845
--- Diff:
sql/core/src/test/scala/org/apache/spark/sql/sources/BucketedReadSuite.scala ---
@@ -543,6 +551,68 @@ abstract class BucketedReadSuite
Github user ArtRand commented on a diff in the pull request:
https://github.com/apache/spark/pull/18910#discussion_r132716461
--- Diff:
resource-managers/mesos/src/test/scala/org/apache/spark/scheduler/cluster/mesos/MesosCoarseGrainedSchedulerBackendSuite.scala
---
@@ -582,6
Github user ArtRand commented on a diff in the pull request:
https://github.com/apache/spark/pull/18910#discussion_r132715831
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackendUtil.scala
---
@@ -21,6 +21,7 @@ import
Github user ArtRand commented on a diff in the pull request:
https://github.com/apache/spark/pull/18910#discussion_r132715924
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackendUtil.scala
---
@@ -162,7 +163,11 @@
Github user ArtRand commented on a diff in the pull request:
https://github.com/apache/spark/pull/18910#discussion_r132715684
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/deploy/mesos/config.scala
---
@@ -70,4 +70,12 @@ package object config {
Github user tejasapatil commented on the issue:
https://github.com/apache/spark/pull/16985
jenkins test this please
---
Github user ArtRand commented on the issue:
https://github.com/apache/spark/pull/18837
Hello @srowen, could you have a look at this (and green-light the testing) when you have a chance?
---
Github user cloud-fan commented on the issue:
https://github.com/apache/spark/pull/18907
retest this please
---
Github user ash211 commented on a diff in the pull request:
https://github.com/apache/spark/pull/18913#discussion_r132721021
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1792,6 +1796,9 @@ class SparkContext(config: SparkConf) extends Logging
{
Github user tgravescs commented on the issue:
https://github.com/apache/spark/pull/18846
retest this please
---
Github user jiangxb1987 commented on the issue:
https://github.com/apache/spark/pull/18915
LGTM. Could you also post some screenshots to show the effect of these changes? Thanks!
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18893
Merged to master
---
Github user thide commented on the issue:
https://github.com/apache/spark/pull/18846
@zsxwing Thank you for the pointer. I tested manually; as far as I tested, Spark works as expected even with this patch applied. I was able to confirm that the driver/executor shut down when its
Github user tgravescs commented on the issue:
https://github.com/apache/spark/pull/18874
The minimum count is still needed; it's needed between stages when the number of tasks goes below the minimum count. It's either going to keep the minimum number of executors or enough executors to
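In other words (a simplified model of the described policy, not Spark's actual `ExecutorAllocationManager` code), the target is whichever is larger: the configured minimum, or the executors needed to cover the outstanding tasks.

```scala
object AllocationSketch {
  // Simplified target: never drop below minExecutors; otherwise request just
  // enough executors to run pendingTasks at tasksPerExecutor each.
  def targetExecutors(pendingTasks: Int, minExecutors: Int, tasksPerExecutor: Int): Int = {
    val needed = math.ceil(pendingTasks.toDouble / tasksPerExecutor).toInt
    math.max(minExecutors, needed)
  }
}
```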
Github user debugger87 commented on a diff in the pull request:
https://github.com/apache/spark/pull/18900#discussion_r132711854
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala
---
@@ -986,6 +986,7 @@ private[hive] object HiveClientImpl {