[GitHub] [spark] AmplabJenkins commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER DATABASE SET LOCATION

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER 
DATABASE SET LOCATION
URL: https://github.com/apache/spark/pull/25294#issuecomment-533768183
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER DATABASE SET LOCATION

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25294: [WIP][SPARK-28476][SQL] 
Support ALTER DATABASE SET LOCATION
URL: https://github.com/apache/spark/pull/25294#issuecomment-533768185
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16191/
   Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER DATABASE SET LOCATION

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER 
DATABASE SET LOCATION
URL: https://github.com/apache/spark/pull/25294#issuecomment-533768185
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16191/
   Test PASSed.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER DATABASE SET LOCATION

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25294: [WIP][SPARK-28476][SQL] 
Support ALTER DATABASE SET LOCATION
URL: https://github.com/apache/spark/pull/25294#issuecomment-533768183
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] SparkQA commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER DATABASE SET LOCATION

2019-09-20 Thread GitBox
SparkQA commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER 
DATABASE SET LOCATION
URL: https://github.com/apache/spark/pull/25294#issuecomment-533768110
 
 
   **[Test build #10 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/10/testReport)**
 for PR 25294 at commit 
[`2a95580`](https://github.com/apache/spark/commit/2a95580b985dfcf43e2c4d3336ce0286b7c245c0).





[GitHub] [spark] dongjoon-hyun commented on issue #25746: [WIP][SPARK-28292][SQL] Enable Injection of User-defined Hint

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25746: [WIP][SPARK-28292][SQL] Enable 
Injection of User-defined Hint 
URL: https://github.com/apache/spark/pull/25746#issuecomment-533768107
 
 
   Got it. Thank you for making this change, @gatorsmile !





[GitHub] [spark] wangyum commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER DATABASE SET LOCATION

2019-09-20 Thread GitBox
wangyum commented on issue #25294: [WIP][SPARK-28476][SQL] Support ALTER 
DATABASE SET LOCATION
URL: https://github.com/apache/spark/pull/25294#issuecomment-533768021
 
 
   retest this please





[GitHub] [spark] AmplabJenkins removed a comment on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static partition overwrite, spark may give duplicate result.

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25863: [WIP][SPARK-29037][CORE][SQL] 
For static partition overwrite, spark may give duplicate result.
URL: https://github.com/apache/spark/pull/25863#issuecomment-533767809
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16190/
   Test PASSed.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static partition overwrite, spark may give duplicate result.

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25863: [WIP][SPARK-29037][CORE][SQL] 
For static partition overwrite, spark may give duplicate result.
URL: https://github.com/apache/spark/pull/25863#issuecomment-533767808
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static partition overwrite, spark may give duplicate result.

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For 
static partition overwrite, spark may give duplicate result.
URL: https://github.com/apache/spark/pull/25863#issuecomment-533767809
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16190/
   Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static partition overwrite, spark may give duplicate result.

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For 
static partition overwrite, spark may give duplicate result.
URL: https://github.com/apache/spark/pull/25863#issuecomment-533767808
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] SparkQA commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static partition overwrite, spark may give duplicate result.

2019-09-20 Thread GitBox
SparkQA commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static 
partition overwrite, spark may give duplicate result.
URL: https://github.com/apache/spark/pull/25863#issuecomment-533767738
 
 
   **[Test build #09 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/09/testReport)**
 for PR 25863 at commit 
[`a4f33d9`](https://github.com/apache/spark/commit/a4f33d97aa2747973eb39e0218ac42b591e8b473).





[GitHub] [spark] dongjoon-hyun commented on issue #25877: [NO-NOT-MERGE][test-maven] Verify timeout for maven SparkPullRequestBuilder

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25877: [NO-NOT-MERGE][test-maven] Verify 
timeout for maven SparkPullRequestBuilder
URL: https://github.com/apache/spark/pull/25877#issuecomment-533767588
 
 
   @wangyum . Do you need to increase this to pass your 
https://github.com/apache/spark/pull/25690?





[GitHub] [spark] turboFei commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static partition overwrite, spark may give duplicate result.

2019-09-20 Thread GitBox
turboFei commented on issue #25863: [WIP][SPARK-29037][CORE][SQL] For static 
partition overwrite, spark may give duplicate result.
URL: https://github.com/apache/spark/pull/25863#issuecomment-533767506
 
 
   retest this please





[GitHub] [spark] dongjoon-hyun edited a comment on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
dongjoon-hyun edited a comment on issue #25879: [SPARK-29199][INFRA] Add 
linters and license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533767464
 
 
   The `Linters` job already passed in `9 min`, and this PR is unrelated to Jenkins.





[GitHub] [spark] dongjoon-hyun commented on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25879: [SPARK-29199][INFRA] Add linters and 
license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533767464
 
 
   The `Linters` job already passed in `9 min`.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add 
linters and license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533767093
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16189/
   Test PASSed.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add 
linters and license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533767092
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and 
license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533767092
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and 
license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533767093
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16189/
   Test PASSed.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25827: [SPARK-29128][SQL] Split predicate code in OR expressions

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25827: [SPARK-29128][SQL] Split 
predicate code in OR expressions
URL: https://github.com/apache/spark/pull/25827#issuecomment-533767013
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/111099/
   Test PASSed.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25827: [SPARK-29128][SQL] Split predicate code in OR expressions

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25827: [SPARK-29128][SQL] Split 
predicate code in OR expressions
URL: https://github.com/apache/spark/pull/25827#issuecomment-533767011
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] SparkQA commented on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
SparkQA commented on issue #25879: [SPARK-29199][INFRA] Add linters and 
license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533767026
 
 
   **[Test build #08 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/08/testReport)**
 for PR 25879 at commit 
[`e6318df`](https://github.com/apache/spark/commit/e6318dfbdae2898c4745fdc6fa8a30f02753ab9f).





[GitHub] [spark] AmplabJenkins commented on issue #25827: [SPARK-29128][SQL] Split predicate code in OR expressions

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25827: [SPARK-29128][SQL] Split predicate 
code in OR expressions
URL: https://github.com/apache/spark/pull/25827#issuecomment-533767011
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25827: [SPARK-29128][SQL] Split predicate code in OR expressions

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25827: [SPARK-29128][SQL] Split predicate 
code in OR expressions
URL: https://github.com/apache/spark/pull/25827#issuecomment-533767013
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/111099/
   Test PASSed.





[GitHub] [spark] SparkQA removed a comment on issue #25827: [SPARK-29128][SQL] Split predicate code in OR expressions

2019-09-20 Thread GitBox
SparkQA removed a comment on issue #25827: [SPARK-29128][SQL] Split predicate 
code in OR expressions
URL: https://github.com/apache/spark/pull/25827#issuecomment-533747209
 
 
   **[Test build #111099 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/111099/testReport)**
 for PR 25827 at commit 
[`ca6522a`](https://github.com/apache/spark/commit/ca6522ab1ed7edbeaf46a306de3de8d5828a528a).





[GitHub] [spark] SparkQA commented on issue #25827: [SPARK-29128][SQL] Split predicate code in OR expressions

2019-09-20 Thread GitBox
SparkQA commented on issue #25827: [SPARK-29128][SQL] Split predicate code in 
OR expressions
URL: https://github.com/apache/spark/pull/25827#issuecomment-533766923
 
 
   **[Test build #111099 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/111099/testReport)**
 for PR 25827 at commit 
[`ca6522a`](https://github.com/apache/spark/commit/ca6522ab1ed7edbeaf46a306de3de8d5828a528a).
* This patch passes all tests.
* This patch merges cleanly.
* This patch adds no public classes.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add 
linters and license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533766766
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16188/
   Test PASSed.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25879: [SPARK-29199][INFRA] Add 
linters and license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533766764
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and 
license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533766766
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16188/
   Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25879: [SPARK-29199][INFRA] Add linters and 
license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533766764
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] SparkQA commented on issue #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
SparkQA commented on issue #25879: [SPARK-29199][INFRA] Add linters and 
license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879#issuecomment-533766701
 
 
   **[Test build #07 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/07/testReport)**
 for PR 25879 at commit 
[`9a48846`](https://github.com/apache/spark/commit/9a4884674ea321af896985c8e9e34721ad989ec1).





[GitHub] [spark] AmplabJenkins removed a comment on issue #25878: [SPARK-29162][SQL]Simplify NOT(IsNull(x)) and NOT(IsNotNull(x))

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25878: [SPARK-29162][SQL]Simplify 
NOT(IsNull(x)) and NOT(IsNotNull(x))
URL: https://github.com/apache/spark/pull/25878#issuecomment-533766427
 
 
   Can one of the admins verify this patch?





[GitHub] [spark] AmplabJenkins commented on issue #25878: [SPARK-29162][SQL]Simplify NOT(IsNull(x)) and NOT(IsNotNull(x))

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25878: [SPARK-29162][SQL]Simplify 
NOT(IsNull(x)) and NOT(IsNotNull(x))
URL: https://github.com/apache/spark/pull/25878#issuecomment-533766680
 
 
   Can one of the admins verify this patch?





[GitHub] [spark] dongjoon-hyun opened a new pull request #25879: [SPARK-29199][INFRA] Add linters and license/dependency checkers to GitHub Action

2019-09-20 Thread GitBox
dongjoon-hyun opened a new pull request #25879: [SPARK-29199][INFRA] Add 
linters and license/dependency checkers to GitHub Action
URL: https://github.com/apache/spark/pull/25879
 
 
   ### What changes were proposed in this pull request?
   
   This PR aims to add linters and license/dependency checkers to GitHub Action.
   
   ### Why are the changes needed?
   
   This will help the PR reviews.
   
   ### Does this PR introduce any user-facing change?
   
   No.
   
   ### How was this patch tested?
   
   See the GitHub Action result on this PR.





[GitHub] [spark] AmplabJenkins commented on issue #25878: [SPARK-29162][SQL]Simplify NOT(IsNull(x)) and NOT(IsNotNull(x))

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25878: [SPARK-29162][SQL]Simplify 
NOT(IsNull(x)) and NOT(IsNotNull(x))
URL: https://github.com/apache/spark/pull/25878#issuecomment-533766427
 
 
   Can one of the admins verify this patch?





[GitHub] [spark] AngersZhuuuu commented on issue #25878: [SPARK-29162][SQL]Simplify NOT(IsNull(x)) and NOT(IsNotNull(x))

2019-09-20 Thread GitBox
AngersZhuuuu commented on issue #25878: [SPARK-29162][SQL]Simplify 
NOT(IsNull(x)) and NOT(IsNotNull(x))
URL: https://github.com/apache/spark/pull/25878#issuecomment-533766516
 
 
   Gentle ping @juliuszsompolski.
   I strongly agree that we should simplify these expressions. It's really useful 
for judging equality of canonicalized queries. I ran into this problem while 
building a caching framework.





[GitHub] [spark] AngersZhuuuu opened a new pull request #25878: [SPARK-29162][SQL]Simplify NOT(IsNull(x)) and NOT(IsNotNull(x))

2019-09-20 Thread GitBox
AngersZhuuuu opened a new pull request #25878: [SPARK-29162][SQL]Simplify 
NOT(IsNull(x)) and NOT(IsNotNull(x))
URL: https://github.com/apache/spark/pull/25878
 
 
   ### What changes were proposed in this pull request?
   Rewrite 
   ```
   NOT isnull(x) -> isnotnull(x)
   NOT isnotnull(x)  -> isnull(x)
   ```
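   In Catalyst terms, such a rewrite could be expressed as a small optimizer rule over the 
logical plan. The sketch below is illustrative only: the object name `SimplifyNotNullChecks` 
is made up and the PR's actual implementation may differ.
   ```
   import org.apache.spark.sql.catalyst.expressions.{IsNotNull, IsNull, Not}
   import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   import org.apache.spark.sql.catalyst.rules.Rule

   // Hypothetical rule name; shown only to illustrate the proposed rewrite.
   object SimplifyNotNullChecks extends Rule[LogicalPlan] {
     def apply(plan: LogicalPlan): LogicalPlan = plan transformAllExpressions {
       case Not(IsNull(child)) => IsNotNull(child)      // NOT isnull(x)    -> isnotnull(x)
       case Not(IsNotNull(child)) => IsNull(child)      // NOT isnotnull(x) -> isnull(x)
     }
   }
   ```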
   
   
   
   ### Why are the changes needed?
   Makes the LogicalPlan more readable and more useful for query canonicalization: 
equivalent conditions become literally equal when comparing canonicalized plans.
   
   
   ### Does this PR introduce any user-facing change?
   NO
   
   ### How was this patch tested?
   UT
   





[GitHub] [spark] AmplabJenkins removed a comment on issue #25811: [SPARK-29111][CORE] Support snapshot/restore on KVStore

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25811: [SPARK-29111][CORE] Support 
snapshot/restore on KVStore
URL: https://github.com/apache/spark/pull/25811#issuecomment-533764924
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25811: [SPARK-29111][CORE] Support snapshot/restore on KVStore

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25811: [SPARK-29111][CORE] Support 
snapshot/restore on KVStore
URL: https://github.com/apache/spark/pull/25811#issuecomment-533764926
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/02/
   Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #25811: [SPARK-29111][CORE] Support snapshot/restore on KVStore

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25811: [SPARK-29111][CORE] Support 
snapshot/restore on KVStore
URL: https://github.com/apache/spark/pull/25811#issuecomment-533764924
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins removed a comment on issue #25811: [SPARK-29111][CORE] Support snapshot/restore on KVStore

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25811: [SPARK-29111][CORE] Support 
snapshot/restore on KVStore
URL: https://github.com/apache/spark/pull/25811#issuecomment-533764926
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/02/
   Test PASSed.





[GitHub] [spark] SparkQA commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly for reading/writing event log file

2019-09-20 Thread GitBox
SparkQA commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly for 
reading/writing event log file
URL: https://github.com/apache/spark/pull/25845#issuecomment-533764872
 
 
   **[Test build #06 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/06/testReport)**
 for PR 25845 at commit 
[`c39b06f`](https://github.com/apache/spark/commit/c39b06f4b305a094b4c132b1f6dee5dc4332a1d8).





[GitHub] [spark] SparkQA removed a comment on issue #25811: [SPARK-29111][CORE] Support snapshot/restore on KVStore

2019-09-20 Thread GitBox
SparkQA removed a comment on issue #25811: [SPARK-29111][CORE] Support 
snapshot/restore on KVStore
URL: https://github.com/apache/spark/pull/25811#issuecomment-533756236
 
 
   **[Test build #02 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/02/testReport)**
 for PR 25811 at commit 
[`b6e65f8`](https://github.com/apache/spark/commit/b6e65f84eaba813217bbae1264d3560da7c677e9).





[GitHub] [spark] SparkQA commented on issue #25811: [SPARK-29111][CORE] Support snapshot/restore on KVStore

2019-09-20 Thread GitBox
SparkQA commented on issue #25811: [SPARK-29111][CORE] Support snapshot/restore 
on KVStore
URL: https://github.com/apache/spark/pull/25811#issuecomment-533764838
 
 
   **[Test build #02 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/02/testReport)**
 for PR 25811 at commit 
[`b6e65f8`](https://github.com/apache/spark/commit/b6e65f84eaba813217bbae1264d3560da7c677e9).
* This patch passes all tests.
* This patch merges cleanly.
* This patch adds no public classes.





[GitHub] [spark] dongjoon-hyun commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly for reading/writing event log file

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 
explicitly for reading/writing event log file
URL: https://github.com/apache/spark/pull/25845#issuecomment-533764555
 
 
   Ya. I agree with you.





[GitHub] [spark] HeartSaVioR edited a comment on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly for reading/writing event log file

2019-09-20 Thread GitBox
HeartSaVioR edited a comment on issue #25845: [SPARK-29160][CORE] Use UTF-8 
explicitly for reading/writing event log file
URL: https://github.com/apache/spark/pull/25845#issuecomment-533764134
 
 
   Yeah, I thought someone might argue it would be worth providing a 
backward-compatible option, so I waited a couple of days to hear more voices, but 
that doesn't seem to be the case. I'll mention the change in the migration guide. 
Thanks for the reminder!





[GitHub] [spark] HeartSaVioR commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly for reading/writing event log file

2019-09-20 Thread GitBox
HeartSaVioR commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly 
for reading/writing event log file
URL: https://github.com/apache/spark/pull/25845#issuecomment-533764134
 
 
   Yeah, I thought someone might argue it would be worth providing a 
backward-compatible option, so I waited a couple of days to hear more voices, but 
that doesn't seem to be the case. I'll mention the change in the migration guide.
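
   For context, the change is about spelling out the charset instead of relying on the 
platform default. A minimal sketch of that idea (not the actual Spark code; the path and 
sample line below are made up) of writing and reading an event log entry with an explicit 
UTF-8 charset:
   ```
   import java.io.{BufferedReader, InputStreamReader, OutputStreamWriter, PrintWriter}
   import java.nio.charset.StandardCharsets
   import java.nio.file.{Files, Paths}

   val path = Paths.get("/tmp/eventlog-sample")  // hypothetical path, for illustration only

   // Write with an explicit UTF-8 charset rather than the platform default.
   val writer = new PrintWriter(
     new OutputStreamWriter(Files.newOutputStream(path), StandardCharsets.UTF_8))
   try writer.println("""{"Event":"SparkListenerApplicationStart"}""") finally writer.close()

   // Read it back, again naming UTF-8 explicitly.
   val reader = new BufferedReader(
     new InputStreamReader(Files.newInputStream(path), StandardCharsets.UTF_8))
   try println(reader.readLine()) finally reader.close()
   ```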





[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326550365
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaProducer.scala
 ##
 @@ -18,111 +18,97 @@
 package org.apache.spark.sql.kafka010
 
 import java.{util => ju}
-import java.util.concurrent.{ConcurrentMap, ExecutionException, TimeUnit}
+import java.io.Closeable
+import java.util.concurrent.ExecutionException
 
-import com.google.common.cache._
-import com.google.common.util.concurrent.{ExecutionError, 
UncheckedExecutionException}
-import org.apache.kafka.clients.producer.KafkaProducer
 import scala.collection.JavaConverters._
 import scala.util.control.NonFatal
 
+import com.google.common.util.concurrent.{ExecutionError, 
UncheckedExecutionException}
 
 Review comment:
   We shouldn't need these anymore since we got rid of the Guava cache.





[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326844741
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/InternalKafkaConnectorPool.scala
 ##
 @@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.kafka010
+
+import java.{util => ju}
+import java.util.concurrent.ConcurrentHashMap
+
+import org.apache.commons.pool2.{BaseKeyedPooledObjectFactory, PooledObject, 
SwallowedExceptionListener}
+import org.apache.commons.pool2.impl.{DefaultEvictionPolicy, 
DefaultPooledObject, GenericKeyedObjectPool, GenericKeyedObjectPoolConfig}
+
+import org.apache.spark.internal.Logging
+
+/**
+ * Provides object pool for objects which is grouped by a key.
+ *
+ * This class leverages [[GenericKeyedObjectPool]] internally, hence providing 
methods based on
+ * the class, and same contract applies: after using the borrowed object, you 
must either call
+ * returnObject() if the object is healthy to return to pool, or 
invalidateObject() if the object
+ * should be destroyed.
+ *
+ * The soft capacity of pool is determined by "poolConfig.capacity" config 
value,
+ * and the pool will have reasonable default value if the value is not 
provided.
+ * (The instance will do its best effort to respect soft capacity but it can 
exceed when there's
+ * a borrowing request and there's neither free space nor idle object to 
clear.)
+ *
+ * This class guarantees that no caller will get pooled object once the object 
is borrowed and
+ * not yet returned, hence provide thread-safety usage of non-thread-safe 
objects unless caller
+ * shares the object to multiple threads.
+ */
+private[kafka010] abstract class InternalKafkaConnectorPool[K, V](
+objectFactory: ObjectFactory[K, V],
+poolConfig: PoolConfig[V],
+swallowedExceptionListener: SwallowedExceptionListener) extends Logging {
+
+  // the class is intended to have only soft capacity
+  assert(poolConfig.getMaxTotal < 0)
+
+  private val pool = {
+val internalPool = new GenericKeyedObjectPool[K, V](objectFactory, 
poolConfig)
+internalPool.setSwallowedExceptionListener(swallowedExceptionListener)
+internalPool
+  }
+
+  /**
+   * Borrows object from the pool. If there's no idle object for the key,
+   * the pool will create the object.
+   *
+   * If the pool doesn't have idle object for the key and also exceeds the 
soft capacity,
+   * pool will try to clear some of idle objects.
+   *
+   * Borrowed object must be returned by either calling returnObject or 
invalidateObject, otherwise
+   * the object will be kept in pool as active object.
+   */
+  def borrowObject(key: K, kafkaParams: ju.Map[String, Object]): V = {
+updateKafkaParamForKey(key, kafkaParams)
+
+if (size >= poolConfig.softMaxSize) {
+  logWarning("Pool exceeds its soft max size, cleaning up idle objects...")
+  pool.clearOldest()
+}
+
+pool.borrowObject(key)
+  }
+
+  /** Returns borrowed object to the pool. */
+  def returnObject(connector: V): Unit = {
+pool.returnObject(createKey(connector), connector)
+  }
+
+  /** Invalidates (destroy) borrowed object to the pool. */
+  def invalidateObject(connector: V): Unit = {
+pool.invalidateObject(createKey(connector), connector)
+  }
+
+  /** Invalidates all idle values for the key */
+  def invalidateKey(key: K): Unit = {
+pool.clear(key)
+  }
+
+  /**
+   * Closes the keyed object pool. Once the pool is closed,
+   * borrowObject will fail with [[IllegalStateException]], but returnObject 
and invalidateObject
+   * will continue to work, with returned objects destroyed on return.
+   *
+   * Also destroys idle instances in the pool.
+   */
+  def close(): Unit = {
+pool.close()
+  }
+
+  def reset(): Unit = {
+// this is the best-effort of clearing up. otherwise we should close the 
pool and create again
+// but we don't want to make it "var" only because of tests.
+pool.clear()
+  }
+
+  def numIdle: Int = pool.getNumIdle
+
+  def numIdle(key: K): Int = pool.getNumIdle(key)
+
+  def numActive: Int = 
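
   To make the borrow/return contract in the Scaladoc above concrete, here is a rough usage 
sketch (not code from the PR; the helper name and placeholder key/params are made up): every 
borrowed connector must end up in either returnObject or invalidateObject.
   ```
   // Hypothetical helper illustrating the contract of InternalKafkaConnectorPool.
   def withBorrowed[K, V](pool: InternalKafkaConnectorPool[K, V])(
       key: K, kafkaParams: java.util.Map[String, Object])(use: V => Unit): Unit = {
     val connector = pool.borrowObject(key, kafkaParams)
     try {
       use(connector)
       pool.returnObject(connector)        // healthy: return it to the pool for reuse
     } catch {
       case e: Throwable =>
         pool.invalidateObject(connector)  // broken: destroy it instead of returning
         throw e
     }
   }
   ```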

[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326844932
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/InternalKafkaConnectorPool.scala
 ##
 @@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.kafka010
+
+import java.{util => ju}
+import java.util.concurrent.ConcurrentHashMap
+
+import org.apache.commons.pool2.{BaseKeyedPooledObjectFactory, PooledObject, 
SwallowedExceptionListener}
+import org.apache.commons.pool2.impl.{DefaultEvictionPolicy, 
DefaultPooledObject, GenericKeyedObjectPool, GenericKeyedObjectPoolConfig}
+
+import org.apache.spark.internal.Logging
+
+/**
+ * Provides object pool for objects which is grouped by a key.
+ *
+ * This class leverages [[GenericKeyedObjectPool]] internally, hence providing 
methods based on
+ * the class, and same contract applies: after using the borrowed object, you 
must either call
+ * returnObject() if the object is healthy to return to pool, or 
invalidateObject() if the object
+ * should be destroyed.
+ *
+ * The soft capacity of pool is determined by "poolConfig.capacity" config 
value,
+ * and the pool will have reasonable default value if the value is not 
provided.
+ * (The instance will do its best effort to respect soft capacity but it can 
exceed when there's
+ * a borrowing request and there's neither free space nor idle object to 
clear.)
+ *
+ * This class guarantees that no caller will get pooled object once the object 
is borrowed and
+ * not yet returned, hence provide thread-safety usage of non-thread-safe 
objects unless caller
+ * shares the object to multiple threads.
+ */
+private[kafka010] abstract class InternalKafkaConnectorPool[K, V](
+objectFactory: ObjectFactory[K, V],
+poolConfig: PoolConfig[V],
+swallowedExceptionListener: SwallowedExceptionListener) extends Logging {
+
+  // the class is intended to have only soft capacity
+  assert(poolConfig.getMaxTotal < 0)
+
+  private val pool = {
+val internalPool = new GenericKeyedObjectPool[K, V](objectFactory, 
poolConfig)
+internalPool.setSwallowedExceptionListener(swallowedExceptionListener)
+internalPool
+  }
+
+  /**
+   * Borrows object from the pool. If there's no idle object for the key,
+   * the pool will create the object.
+   *
+   * If the pool doesn't have idle object for the key and also exceeds the 
soft capacity,
+   * pool will try to clear some of idle objects.
+   *
+   * Borrowed object must be returned by either calling returnObject or 
invalidateObject, otherwise
+   * the object will be kept in pool as active object.
+   */
+  def borrowObject(key: K, kafkaParams: ju.Map[String, Object]): V = {
+updateKafkaParamForKey(key, kafkaParams)
+
+if (size >= poolConfig.softMaxSize) {
+  logWarning("Pool exceeds its soft max size, cleaning up idle objects...")
+  pool.clearOldest()
+}
+
+pool.borrowObject(key)
+  }
+
+  /** Returns borrowed object to the pool. */
+  def returnObject(connector: V): Unit = {
+pool.returnObject(createKey(connector), connector)
+  }
+
+  /** Invalidates (destroy) borrowed object to the pool. */
+  def invalidateObject(connector: V): Unit = {
+pool.invalidateObject(createKey(connector), connector)
+  }
+
+  /** Invalidates all idle values for the key */
+  def invalidateKey(key: K): Unit = {
+pool.clear(key)
+  }
+
+  /**
+   * Closes the keyed object pool. Once the pool is closed,
+   * borrowObject will fail with [[IllegalStateException]], but returnObject 
and invalidateObject
+   * will continue to work, with returned objects destroyed on return.
+   *
+   * Also destroys idle instances in the pool.
+   */
+  def close(): Unit = {
+pool.close()
+  }
+
+  def reset(): Unit = {
+// this is the best-effort of clearing up. otherwise we should close the 
pool and create again
+// but we don't want to make it "var" only because of tests.
+pool.clear()
+  }
+
+  def numIdle: Int = pool.getNumIdle
+
+  def numIdle(key: K): Int = pool.getNumIdle(key)
+
+  def numActive: Int = 

[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326845315
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/InternalKafkaConnectorPoolSuite.scala
 ##
 @@ -29,7 +29,14 @@ import org.apache.spark.SparkConf
 import org.apache.spark.sql.kafka010.KafkaDataConsumer.CacheKey
 import org.apache.spark.sql.test.SharedSparkSession
 
-class InternalKafkaConsumerPoolSuite extends SharedSparkSession {
+
 
 Review comment:
   nit: remove this line





[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326844878
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/InternalKafkaConnectorPool.scala
 ##
 @@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.kafka010
+
+import java.{util => ju}
+import java.util.concurrent.ConcurrentHashMap
+
+import org.apache.commons.pool2.{BaseKeyedPooledObjectFactory, PooledObject, SwallowedExceptionListener}
+import org.apache.commons.pool2.impl.{DefaultEvictionPolicy, DefaultPooledObject, GenericKeyedObjectPool, GenericKeyedObjectPoolConfig}
+
+import org.apache.spark.internal.Logging
+
+/**
+ * Provides object pool for objects which is grouped by a key.
+ *
+ * This class leverages [[GenericKeyedObjectPool]] internally, hence providing methods based on
+ * the class, and same contract applies: after using the borrowed object, you must either call
+ * returnObject() if the object is healthy to return to pool, or invalidateObject() if the object
+ * should be destroyed.
+ *
+ * The soft capacity of pool is determined by "poolConfig.capacity" config value,
+ * and the pool will have reasonable default value if the value is not provided.
+ * (The instance will do its best effort to respect soft capacity but it can exceed when there's
+ * a borrowing request and there's neither free space nor idle object to clear.)
+ *
+ * This class guarantees that no caller will get pooled object once the object is borrowed and
+ * not yet returned, hence provide thread-safety usage of non-thread-safe objects unless caller
+ * shares the object to multiple threads.
+ */
+private[kafka010] abstract class InternalKafkaConnectorPool[K, V](
+    objectFactory: ObjectFactory[K, V],
+    poolConfig: PoolConfig[V],
+    swallowedExceptionListener: SwallowedExceptionListener) extends Logging {
+
+  // the class is intended to have only soft capacity
+  assert(poolConfig.getMaxTotal < 0)
+
+  private val pool = {
+    val internalPool = new GenericKeyedObjectPool[K, V](objectFactory, poolConfig)
+    internalPool.setSwallowedExceptionListener(swallowedExceptionListener)
+    internalPool
+  }
+
+  /**
+   * Borrows object from the pool. If there's no idle object for the key,
+   * the pool will create the object.
+   *
+   * If the pool doesn't have idle object for the key and also exceeds the soft capacity,
+   * pool will try to clear some of idle objects.
+   *
+   * Borrowed object must be returned by either calling returnObject or invalidateObject, otherwise
+   * the object will be kept in pool as active object.
+   */
+  def borrowObject(key: K, kafkaParams: ju.Map[String, Object]): V = {
+    updateKafkaParamForKey(key, kafkaParams)
+
+    if (size >= poolConfig.softMaxSize) {
+      logWarning("Pool exceeds its soft max size, cleaning up idle objects...")
+      pool.clearOldest()
+    }
+
+    pool.borrowObject(key)
+  }
+
+  /** Returns borrowed object to the pool. */
+  def returnObject(connector: V): Unit = {
+    pool.returnObject(createKey(connector), connector)
+  }
+
+  /** Invalidates (destroy) borrowed object to the pool. */
+  def invalidateObject(connector: V): Unit = {
+    pool.invalidateObject(createKey(connector), connector)
+  }
+
+  /** Invalidates all idle values for the key */
+  def invalidateKey(key: K): Unit = {
+    pool.clear(key)
+  }
+
+  /**
+   * Closes the keyed object pool. Once the pool is closed,
+   * borrowObject will fail with [[IllegalStateException]], but returnObject and invalidateObject
+   * will continue to work, with returned objects destroyed on return.
+   *
+   * Also destroys idle instances in the pool.
+   */
+  def close(): Unit = {
+    pool.close()
+  }
+
+  def reset(): Unit = {
+    // this is the best-effort of clearing up. otherwise we should close the pool and create again
+    // but we don't want to make it "var" only because of tests.
+    pool.clear()
+  }
+
+  def numIdle: Int = pool.getNumIdle
+
+  def numIdle(key: K): Int = pool.getNumIdle(key)
+
+  def numActive: Int = 

[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326844478
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/InternalKafkaConnectorPool.scala
 ##
 @@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.kafka010
+
+import java.{util => ju}
+import java.util.concurrent.ConcurrentHashMap
+
+import org.apache.commons.pool2.{BaseKeyedPooledObjectFactory, PooledObject, SwallowedExceptionListener}
+import org.apache.commons.pool2.impl.{DefaultEvictionPolicy, DefaultPooledObject, GenericKeyedObjectPool, GenericKeyedObjectPoolConfig}
+
+import org.apache.spark.internal.Logging
+
+/**
+ * Provides object pool for objects which is grouped by a key.
+ *
+ * This class leverages [[GenericKeyedObjectPool]] internally, hence providing methods based on
+ * the class, and same contract applies: after using the borrowed object, you must either call
+ * returnObject() if the object is healthy to return to pool, or invalidateObject() if the object
+ * should be destroyed.
+ *
+ * The soft capacity of pool is determined by "poolConfig.capacity" config value,
+ * and the pool will have reasonable default value if the value is not provided.
+ * (The instance will do its best effort to respect soft capacity but it can exceed when there's
+ * a borrowing request and there's neither free space nor idle object to clear.)
+ *
+ * This class guarantees that no caller will get pooled object once the object is borrowed and
+ * not yet returned, hence provide thread-safety usage of non-thread-safe objects unless caller
+ * shares the object to multiple threads.
+ */
+private[kafka010] abstract class InternalKafkaConnectorPool[K, V](
+    objectFactory: ObjectFactory[K, V],
+    poolConfig: PoolConfig[V],
+    swallowedExceptionListener: SwallowedExceptionListener) extends Logging {
+
+  // the class is intended to have only soft capacity
+  assert(poolConfig.getMaxTotal < 0)
+
+  private val pool = {
+    val internalPool = new GenericKeyedObjectPool[K, V](objectFactory, poolConfig)
+    internalPool.setSwallowedExceptionListener(swallowedExceptionListener)
+    internalPool
+  }
+
+  /**
+   * Borrows object from the pool. If there's no idle object for the key,
+   * the pool will create the object.
+   *
+   * If the pool doesn't have idle object for the key and also exceeds the soft capacity,
+   * pool will try to clear some of idle objects.
+   *
+   * Borrowed object must be returned by either calling returnObject or invalidateObject, otherwise
+   * the object will be kept in pool as active object.
+   */
+  def borrowObject(key: K, kafkaParams: ju.Map[String, Object]): V = {
+    updateKafkaParamForKey(key, kafkaParams)
+
+    if (size >= poolConfig.softMaxSize) {
+      logWarning("Pool exceeds its soft max size, cleaning up idle objects...")
+      pool.clearOldest()
+    }
+
+    pool.borrowObject(key)
+  }
+
+  /** Returns borrowed object to the pool. */
+  def returnObject(connector: V): Unit = {
+    pool.returnObject(createKey(connector), connector)
+  }
+
+  /** Invalidates (destroy) borrowed object to the pool. */
+  def invalidateObject(connector: V): Unit = {
+    pool.invalidateObject(createKey(connector), connector)
+  }
+
+  /** Invalidates all idle values for the key */
+  def invalidateKey(key: K): Unit = {
+    pool.clear(key)
+  }
+
+  /**
+   * Closes the keyed object pool. Once the pool is closed,
+   * borrowObject will fail with [[IllegalStateException]], but returnObject and invalidateObject
+   * will continue to work, with returned objects destroyed on return.
+   *
+   * Also destroys idle instances in the pool.
+   */
+  def close(): Unit = {
+    pool.close()
+  }
+
+  def reset(): Unit = {
+    // this is the best-effort of clearing up. otherwise we should close the pool and create again
+    // but we don't want to make it "var" only because of tests.
+    pool.clear()
+  }
+
+  def numIdle: Int = pool.getNumIdle
+
+  def numIdle(key: K): Int = pool.getNumIdle(key)
+
+  def numActive: Int = 

[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326845097
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/InternalKafkaProducerPool.scala
 ##
 @@ -0,0 +1,72 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.kafka010
+
+import java.{util => ju}
+
+import scala.collection.JavaConverters._
+
+import org.apache.commons.pool2.PooledObject
+
+import org.apache.spark.SparkConf
+import org.apache.spark.sql.kafka010.InternalKafkaProducerPool.CacheKey
+
+private[kafka010] class InternalKafkaProducerPool(
+    objectFactory: ProducerObjectFactory,
+    poolConfig: ProducerPoolConfig)
+  extends InternalKafkaConnectorPool[CacheKey, CachedKafkaProducer](
+    objectFactory,
+    poolConfig,
+    new CustomSwallowedExceptionListener("producer")) {
+
+  def this(conf: SparkConf) = {
+    this(new ProducerObjectFactory, new ProducerPoolConfig(conf))
+  }
+
+  protected def createKey(producer: CachedKafkaProducer): CacheKey = {
+    producer.kafkaParams.asScala.toSeq.sortBy(x => x._1)
 
 Review comment:
   Let's use `InternalKafkaProducerPool.toCacheKey` to unify the way to create 
CacheKey.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
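
Editor's note: a sketch of the change suggested in the comment above, assuming
`InternalKafkaProducerPool.toCacheKey` takes the producer's kafkaParams and builds the
same sorted key as the inline code it replaces (its exact signature is not shown here).

    protected def createKey(producer: CachedKafkaProducer): CacheKey = {
      // delegate to the shared helper so every CacheKey is built the same way
      InternalKafkaProducerPool.toCacheKey(producer.kafkaParams)
    }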



[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326845141
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaDataConsumer.scala
 ##
 @@ -619,6 +619,10 @@ private[kafka010] object KafkaDataConsumer extends Logging {
     new KafkaDataConsumer(topicPartition, kafkaParams, consumerPool, fetchedDataPool)
   }
 
+  private[kafka010] def clear(): Unit = {
+    consumerPool.reset()
 
 Review comment:
   Let's call `fetchedDataPool.reset()` as well.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
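
Editor's note: a sketch of the suggestion above. The field names follow the quoted diff,
and `fetchedDataPool` is assumed to expose the same reset() as the consumer pool.

    private[kafka010] def clear(): Unit = {
      // best-effort cleanup of both pools, as requested in the review
      consumerPool.reset()
      fetchedDataPool.reset()
    }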



[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326844638
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/InternalKafkaConnectorPool.scala
 ##
 @@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.kafka010
+
+import java.{util => ju}
+import java.util.concurrent.ConcurrentHashMap
+
+import org.apache.commons.pool2.{BaseKeyedPooledObjectFactory, PooledObject, SwallowedExceptionListener}
+import org.apache.commons.pool2.impl.{DefaultEvictionPolicy, DefaultPooledObject, GenericKeyedObjectPool, GenericKeyedObjectPoolConfig}
+
+import org.apache.spark.internal.Logging
+
+/**
+ * Provides object pool for objects which is grouped by a key.
+ *
+ * This class leverages [[GenericKeyedObjectPool]] internally, hence providing methods based on
+ * the class, and same contract applies: after using the borrowed object, you must either call
+ * returnObject() if the object is healthy to return to pool, or invalidateObject() if the object
+ * should be destroyed.
+ *
+ * The soft capacity of pool is determined by "poolConfig.capacity" config value,
+ * and the pool will have reasonable default value if the value is not provided.
+ * (The instance will do its best effort to respect soft capacity but it can exceed when there's
+ * a borrowing request and there's neither free space nor idle object to clear.)
+ *
+ * This class guarantees that no caller will get pooled object once the object is borrowed and
+ * not yet returned, hence provide thread-safety usage of non-thread-safe objects unless caller
+ * shares the object to multiple threads.
+ */
+private[kafka010] abstract class InternalKafkaConnectorPool[K, V](
 
 Review comment:
   For other reviewers: this is pretty much the same as the previous 
   InternalKafkaConsumerPool, except:
   
   1. It introduces abstract methods to deal with extracting the key from an object 
   (sketched just after this message).
   2. It replaces "consumer"-related words in the javadoc/code with common ones so the 
   class applies to both the consumer and the producer.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
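
Editor's note: a sketch of the abstract hooks referred to in the comment above. The
signatures are inferred from how the quoted pool code calls them and may differ from the
actual PR.

    // Key extraction and per-key param bookkeeping are left to the consumer/producer subclasses.
    protected def createKey(value: V): K
    protected def updateKafkaParamForKey(key: K, kafkaParams: ju.Map[String, Object]): Unit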



[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326845411
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaDataConsumerSuite.scala
 ##
 @@ -148,7 +148,7 @@ class KafkaDataConsumerSuite extends SharedSparkSession with PrivateMethodTester
 
     @volatile var error: Throwable = null
 
-    def consume(i: Int): Unit = {
+    def consume(): Unit = {
 
 Review comment:
   For other reviewers: this removes unused parameter.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer

2019-09-20 Thread GitBox
HeartSaVioR commented on a change in pull request #25853: [SPARK-21869][SS] 
Apply Apache Commons Pool to Kafka producer
URL: https://github.com/apache/spark/pull/25853#discussion_r326844184
 
 

 ##
 File path: 
external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaProducer.scala
 ##
 @@ -18,111 +18,97 @@
 package org.apache.spark.sql.kafka010
 
 import java.{util => ju}
-import java.util.concurrent.{ConcurrentMap, ExecutionException, TimeUnit}
+import java.io.Closeable
+import java.util.concurrent.ExecutionException
 
-import com.google.common.cache._
-import com.google.common.util.concurrent.{ExecutionError, UncheckedExecutionException}
-import org.apache.kafka.clients.producer.KafkaProducer
 import scala.collection.JavaConverters._
 import scala.util.control.NonFatal
 
+import com.google.common.util.concurrent.{ExecutionError, UncheckedExecutionException}
+import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerRecord}
+
 import org.apache.spark.SparkEnv
 import org.apache.spark.internal.Logging
 import org.apache.spark.kafka010.{KafkaConfigUpdater, KafkaRedactionUtil}
+import org.apache.spark.sql.kafka010.InternalKafkaProducerPool._
+import org.apache.spark.util.ShutdownHookManager
 
-private[kafka010] object CachedKafkaProducer extends Logging {
+private[kafka010] class CachedKafkaProducer(val kafkaParams: ju.Map[String, Object])
+  extends Closeable with Logging {
 
   private type Producer = KafkaProducer[Array[Byte], Array[Byte]]
 
-  private val defaultCacheExpireTimeout = TimeUnit.MINUTES.toMillis(10)
-
-  private lazy val cacheExpireTimeout: Long = Option(SparkEnv.get)
-    .map(_.conf.get(PRODUCER_CACHE_TIMEOUT))
-    .getOrElse(defaultCacheExpireTimeout)
+  private val producer = createProducer()
 
-  private val cacheLoader = new CacheLoader[Seq[(String, Object)], Producer] {
-    override def load(config: Seq[(String, Object)]): Producer = {
-      createKafkaProducer(config)
+  private def createProducer(): Producer = {
+    val producer: Producer = new Producer(kafkaParams)
+    if (log.isDebugEnabled()) {
+      val redactedParamsSeq = KafkaRedactionUtil.redactParams(toCacheKey(kafkaParams))
+      logDebug(s"Created a new instance of kafka producer for $redactedParamsSeq.")
     }
+    producer
   }
 
-  private val removalListener = new RemovalListener[Seq[(String, Object)], Producer]() {
-    override def onRemoval(
-        notification: RemovalNotification[Seq[(String, Object)], Producer]): Unit = {
-      val paramsSeq: Seq[(String, Object)] = notification.getKey
-      val producer: Producer = notification.getValue
-      if (log.isDebugEnabled()) {
-        val redactedParamsSeq = KafkaRedactionUtil.redactParams(paramsSeq)
-        logDebug(s"Evicting kafka producer $producer params: $redactedParamsSeq, " +
-          s"due to ${notification.getCause}")
+  override def close(): Unit = {
+    try {
+      if (log.isInfoEnabled()) {
+        val redactedParamsSeq = KafkaRedactionUtil.redactParams(toCacheKey(kafkaParams))
+        logInfo(s"Closing the KafkaProducer with params: ${redactedParamsSeq.mkString("\n")}.")
       }
-      close(paramsSeq, producer)
+      producer.close()
+    } catch {
+      case NonFatal(e) => logWarning("Error while closing kafka producer.", e)
     }
   }
 
-  private lazy val guavaCache: LoadingCache[Seq[(String, Object)], Producer] =
-    CacheBuilder.newBuilder().expireAfterAccess(cacheExpireTimeout, TimeUnit.MILLISECONDS)
-      .removalListener(removalListener)
-      .build[Seq[(String, Object)], Producer](cacheLoader)
+  def send(record: ProducerRecord[Array[Byte], Array[Byte]], callback: Callback): Unit = {
+    producer.send(record, callback)
+  }
 
-  private def createKafkaProducer(paramsSeq: Seq[(String, Object)]): Producer = {
-    val kafkaProducer: Producer = new Producer(paramsSeq.toMap.asJava)
-    if (log.isDebugEnabled()) {
-      val redactedParamsSeq = KafkaRedactionUtil.redactParams(paramsSeq)
-      logDebug(s"Created a new instance of KafkaProducer for $redactedParamsSeq.")
+  def flush(): Unit = {
+    producer.flush()
+  }
+}
+
+private[kafka010] object CachedKafkaProducer extends Logging {
+
+  private val sparkConf = SparkEnv.get.conf
+  private val producerPool = new InternalKafkaProducerPool(sparkConf)
+
+  ShutdownHookManager.addShutdownHook { () =>
+    try {
+      producerPool.close()
+    } catch {
+      case e: Throwable =>
+        logWarning("Ignoring exception while shutting down pool from shutdown hook", e)
     }
-    kafkaProducer
   }
 
   /**
    * Get a cached KafkaProducer for a given configuration. If matching KafkaProducer doesn't
    * exist, a new KafkaProducer will be created. KafkaProducer is thread safe, it is best to keep
    * one instance per specified kafkaParams.
    */
-  private[kafka010] def getOrCreate(kafkaParams: ju.Map[String, Object]): Producer = {
-    val updatedKafkaProducerConfiguration =
+  def 
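
Editor's note: the object-level entry point is truncated in the quote above, so this
sketch drives the pool directly. The topic, payload, and callback values are
illustrative, and `producerPool` is private in the real code; only the borrow/return,
toCacheKey, send, and flush calls are taken from the quoted diff.

    val key = InternalKafkaProducerPool.toCacheKey(kafkaParams)
    val producer = producerPool.borrowObject(key, kafkaParams)
    try {
      // send() and flush() delegate to the wrapped KafkaProducer[Array[Byte], Array[Byte]]
      producer.send(new ProducerRecord[Array[Byte], Array[Byte]]("topic", payload), callback)
      producer.flush()
    } finally {
      producerPool.returnObject(producer)
    }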

[GitHub] [spark] AmplabJenkins removed a comment on issue #25830: [SPARK-29140][SQL] Handle parameters having "array" of javaType properly in splitAggregateExpressions

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25830: [SPARK-29140][SQL] Handle 
parameters having "array" of javaType properly in splitAggregateExpressions
URL: https://github.com/apache/spark/pull/25830#issuecomment-533763719
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #25830: [SPARK-29140][SQL] Handle parameters having "array" of javaType properly in splitAggregateExpressions

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25830: [SPARK-29140][SQL] Handle 
parameters having "array" of javaType properly in splitAggregateExpressions
URL: https://github.com/apache/spark/pull/25830#issuecomment-533763721
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/111096/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun closed pull request #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
dongjoon-hyun closed pull request #25874: [SPARK-28772][BUILD][MLLIB] Update 
breeze to 1.0
URL: https://github.com/apache/spark/pull/25874
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25830: [SPARK-29140][SQL] Handle parameters having "array" of javaType properly in splitAggregateExpressions

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25830: [SPARK-29140][SQL] Handle parameters 
having "array" of javaType properly in splitAggregateExpressions
URL: https://github.com/apache/spark/pull/25830#issuecomment-533763719
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25830: [SPARK-29140][SQL] Handle parameters having "array" of javaType properly in splitAggregateExpressions

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25830: [SPARK-29140][SQL] Handle parameters 
having "array" of javaType properly in splitAggregateExpressions
URL: https://github.com/apache/spark/pull/25830#issuecomment-533763721
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/111096/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update 
breeze to 1.0
URL: https://github.com/apache/spark/pull/25874#issuecomment-533763726
 
 
   Merged to master. Thank you, @srowen .


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun closed pull request #25865: [SPARK-29187][SQL] Return null from `date_part()` for the null `field`

2019-09-20 Thread GitBox
dongjoon-hyun closed pull request #25865: [SPARK-29187][SQL] Return null from 
`date_part()` for the null `field`
URL: https://github.com/apache/spark/pull/25865
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] imback82 commented on a change in pull request #25771: [SPARK-28970][SQL] Implement USE CATALOG/NAMESPACE for Data Source V2

2019-09-20 Thread GitBox
imback82 commented on a change in pull request #25771: [SPARK-28970][SQL] 
Implement USE CATALOG/NAMESPACE for Data Source V2
URL: https://github.com/apache/spark/pull/25771#discussion_r326845256
 
 

 ##
 File path: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceResolution.scala
 ##
 @@ -210,6 +197,17 @@ case class DataSourceResolution(
   }
   ShowTablesCommand(Some(namespace.quoted), pattern)
   }
+
+    case UseCatalogAndNamespaceStatement(catalogName, namespace) =>
+      catalogName match {
+        case Some(c) =>
+          validateCatalog(c)
+          UseCatalogAndNamespace(catalogManager, Some(c), namespace)
+        case None =>
+          require(namespace.nonEmpty)
+          val CurrentCatalogAndNamespace(catalog, ns) = namespace.get
+          UseCatalogAndNamespace(catalogManager, Some(catalog.name()), Some(ns))
 
 Review comment:
   Actually, this may not be correct if the catalog is `V2SessionCatalog`. The 
   V1 behavior is that it calls `SessionCatalog.setCurrentDatabase(databaseName)`. 
   So if we don't fall back to V1, any code that accesses 
   `SessionCatalog.getCurrentDatabase` can be broken.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
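
Editor's note: a sketch of the fallback described in the comment above, under stated
assumptions. Only `SessionCatalog.setCurrentDatabase`/`getCurrentDatabase` are named in
the comment; the `isSessionCatalog` check, the `sessionCatalog` reference, and the
single-part namespace assumption are placeholders, and the real change may differ.

    case None =>
      require(namespace.nonEmpty)
      val CurrentCatalogAndNamespace(catalog, ns) = namespace.get
      if (isSessionCatalog(catalog)) {
        // keep V1 state in sync so SessionCatalog.getCurrentDatabase keeps working
        sessionCatalog.setCurrentDatabase(ns.head)
      }
      UseCatalogAndNamespace(catalogManager, Some(catalog.name()), Some(ns))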



[GitHub] [spark] SparkQA removed a comment on issue #25830: [SPARK-29140][SQL] Handle parameters having "array" of javaType properly in splitAggregateExpressions

2019-09-20 Thread GitBox
SparkQA removed a comment on issue #25830: [SPARK-29140][SQL] Handle parameters 
having "array" of javaType properly in splitAggregateExpressions
URL: https://github.com/apache/spark/pull/25830#issuecomment-533743421
 
 
   **[Test build #111096 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/111096/testReport)**
 for PR 25830 at commit 
[`4c00a2b`](https://github.com/apache/spark/commit/4c00a2b5386f318fbbded9e7f7e1daee7782758b).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on issue #25865: [SPARK-29187][SQL] Return null from `date_part()` for the null `field`

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25865: [SPARK-29187][SQL] Return null from 
`date_part()` for the null `field`
URL: https://github.com/apache/spark/pull/25865#issuecomment-533763676
 
 
   Also, thank you for review, @adrian-wang .


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA commented on issue #25830: [SPARK-29140][SQL] Handle parameters having "array" of javaType properly in splitAggregateExpressions

2019-09-20 Thread GitBox
SparkQA commented on issue #25830: [SPARK-29140][SQL] Handle parameters having 
"array" of javaType properly in splitAggregateExpressions
URL: https://github.com/apache/spark/pull/25830#issuecomment-533763611
 
 
   **[Test build #111096 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/111096/testReport)**
 for PR 25830 at commit 
[`4c00a2b`](https://github.com/apache/spark/commit/4c00a2b5386f318fbbded9e7f7e1daee7782758b).
* This patch passes all tests.
* This patch merges cleanly.
* This patch adds no public classes.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error 
messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533763283
 
 
   BTW, thank you for making a PR, @jeff303 .


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error 
messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533763244
 
 
   It would be good to ping @holdenk since she is the reporter of this issue.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
dongjoon-hyun commented on a change in pull request #25807: [SPARK-25153][SQL] 
Improve error messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#discussion_r326844592
 
 

 ##
 File path: sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
 ##
 @@ -256,8 +256,13 @@ class Dataset[T] private[sql](
   private[sql] def resolve(colName: String): NamedExpression = {
     queryExecution.analyzed.resolveQuoted(colName, sparkSession.sessionState.analyzer.resolver)
       .getOrElse {
-        throw new AnalysisException(
-          s"""Cannot resolve column name "$colName" among (${schema.fieldNames.mkString(", ")})""")
+        val fields = schema.fieldNames
+        val extraMsg = if (fields.exists(sparkSession.sessionState.analyzer.resolver(_, colName))) {
 
 Review comment:
   Please declare `val resolver = sparkSession.sessionState.analyzer.resolver` 
before line 257 because this is used twice.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
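
Editor's note: a sketch of the hoisted resolver suggested above, so it is computed once
and reused by both resolveQuoted and the fields.exists check. The exception-raising tail
is not shown in the quoted diff, so the last lines here are an assumption based on the
removed message text.

    private[sql] def resolve(colName: String): NamedExpression = {
      val resolver = sparkSession.sessionState.analyzer.resolver
      queryExecution.analyzed.resolveQuoted(colName, resolver)
        .getOrElse {
          val fields = schema.fieldNames
          val extraMsg = if (fields.exists(resolver(_, colName))) {
            s"; did you mean to quote the `$colName` column?"
          } else {
            ""
          }
          // assumed tail: append extraMsg to the existing error message and throw
          throw new AnalysisException(
            s"""Cannot resolve column name "$colName" among (${fields.mkString(", ")})$extraMsg""")
        }
    }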



[GitHub] [spark] AmplabJenkins removed a comment on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25807: [SPARK-25153][SQL] Improve 
error messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533762438
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16187/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25807: [SPARK-25153][SQL] Improve error 
messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533762437
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25807: [SPARK-25153][SQL] Improve error 
messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533762438
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16187/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25807: [SPARK-25153][SQL] Improve 
error messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533762437
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
dongjoon-hyun commented on a change in pull request #25807: [SPARK-25153][SQL] 
Improve error messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#discussion_r326844541
 
 

 ##
 File path: sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala
 ##
 @@ -1841,6 +1842,24 @@ class DatasetSuite extends QueryTest with 
SharedSparkSession {
 val instant = java.time.Instant.parse("2019-03-30T09:54:00Z")
 assert(spark.range(1).map { _ => instant }.head === instant)
   }
+
+  val dotColumnTestModes = Table(
+    ("caseSensitive", "colName"),
+    ("true", "field.1"),
+    ("false", "Field.1")
+  )
+
+  test("SPARK-25153: Improve error messages for columns with dots/periods") {
+    forAll(dotColumnTestModes) { (caseSensitive, colName) =>
+      val ds = Seq(SpecialCharClass("1", "2")).toDS
+      withSQLConf(SQLConf.CASE_SENSITIVE.key -> caseSensitive) {
+        val errorMsg = intercept[AnalysisException] {
+          ds(colName)
+        }
+        assert(errorMsg.getMessage.contains(s"did you mean to quote the `${colName}` column?"))
 
 Review comment:
   nit. `${colName}` -> `$colName`


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA commented on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
SparkQA commented on issue #25807: [SPARK-25153][SQL] Improve error messages 
for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533762372
 
 
   **[Test build #05 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/05/testReport)**
 for PR 25807 at commit 
[`ed4d2a0`](https://github.com/apache/spark/commit/ed4d2a0680d9caadbc2bf7050f59e2a3b3f1b4b1).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
dongjoon-hyun commented on a change in pull request #25807: [SPARK-25153][SQL] 
Improve error messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#discussion_r326844486
 
 

 ##
 File path: sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
 ##
 @@ -256,8 +256,13 @@ class Dataset[T] private[sql](
   private[sql] def resolve(colName: String): NamedExpression = {
     queryExecution.analyzed.resolveQuoted(colName, sparkSession.sessionState.analyzer.resolver)
       .getOrElse {
-        throw new AnalysisException(
-          s"""Cannot resolve column name "$colName" among (${schema.fieldNames.mkString(", ")})""")
+        val fields = schema.fieldNames
+        val extraMsg = if (fields.exists(sparkSession.sessionState.analyzer.resolver(_, colName))) {
+          s"; did you mean to quote the `${colName}` column?"
 
 Review comment:
   nit. `${colName}` -> `$colName`


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error 
messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533762035
 
 
   Oh, sorry for missing the ping here, @HyukjinKwon . I'll review today~


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update 
breeze to 1.0
URL: https://github.com/apache/spark/pull/25874#issuecomment-533762051
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/00/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update 
breeze to 1.0
URL: https://github.com/apache/spark/pull/25874#issuecomment-533762048
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error messages for columns with dots/periods

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25807: [SPARK-25153][SQL] Improve error 
messages for columns with dots/periods
URL: https://github.com/apache/spark/pull/25807#issuecomment-533762043
 
 
   Retest this please.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25874: [SPARK-28772][BUILD][MLLIB] 
Update breeze to 1.0
URL: https://github.com/apache/spark/pull/25874#issuecomment-533762051
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/00/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25874: [SPARK-28772][BUILD][MLLIB] 
Update breeze to 1.0
URL: https://github.com/apache/spark/pull/25874#issuecomment-533762048
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA removed a comment on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
SparkQA removed a comment on issue #25874: [SPARK-28772][BUILD][MLLIB] Update 
breeze to 1.0
URL: https://github.com/apache/spark/pull/25874#issuecomment-533748253
 
 
   **[Test build #00 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/00/testReport)**
 for PR 25874 at commit 
[`1851e0d`](https://github.com/apache/spark/commit/1851e0d3b88f1bec165be7a39d70d2f3660f8b29).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun edited a comment on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly for reading/writing event log file

2019-09-20 Thread GitBox
dongjoon-hyun edited a comment on issue #25845: [SPARK-29160][CORE] Use UTF-8 
explicitly for reading/writing event log file
URL: https://github.com/apache/spark/pull/25845#issuecomment-533761894
 
 
   For me, the only remaining thing looks like a note at the migration guide to 
give prior warning to users, isn't it? It would be great if this PR includes 
that.
   - https://github.com/apache/spark/pull/25845#issuecomment-532971687
   - https://github.com/apache/spark/pull/25845#issuecomment-532987837
   - https://github.com/apache/spark/pull/25845#issuecomment-532994719


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 1.0

2019-09-20 Thread GitBox
SparkQA commented on issue #25874: [SPARK-28772][BUILD][MLLIB] Update breeze to 
1.0
URL: https://github.com/apache/spark/pull/25874#issuecomment-533761934
 
 
   **[Test build #00 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/00/testReport)** for PR 25874 at commit [`1851e0d`](https://github.com/apache/spark/commit/1851e0d3b88f1bec165be7a39d70d2f3660f8b29).
* This patch passes all tests.
* This patch merges cleanly.
* This patch adds no public classes.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 explicitly for reading/writing event log file

2019-09-20 Thread GitBox
dongjoon-hyun commented on issue #25845: [SPARK-29160][CORE] Use UTF-8 
explicitly for reading/writing event log file
URL: https://github.com/apache/spark/pull/25845#issuecomment-533761894
 
 
   For me, the only remaining thing is a note in the migration guide to give 
users prior warning. It would be great if this PR included that.
   - https://github.com/apache/spark/pull/25845#issuecomment-532971687
   - https://github.com/apache/spark/pull/25845#issuecomment-532987837
   - https://github.com/apache/spark/pull/25845#issuecomment-532994719
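
   For readers unfamiliar with the charset issue being discussed, the following is a minimal, purely illustrative Scala sketch of what "use UTF-8 explicitly for reading/writing" can look like; it is not the code from PR #25845, and the object name, file path, and sample event line are assumptions made for the example.

   ```scala
   import java.io.{BufferedReader, InputStreamReader, OutputStreamWriter, PrintWriter}
   import java.nio.charset.StandardCharsets
   import java.nio.file.{Files, Paths}

   object EventLogCharsetSketch {
     def main(args: Array[String]): Unit = {
       // Hypothetical location used only for this sketch.
       val path = Paths.get("/tmp/sample-event-log")

       // Write with an explicit UTF-8 charset instead of the platform default.
       val writer = new PrintWriter(
         new OutputStreamWriter(Files.newOutputStream(path), StandardCharsets.UTF_8))
       try writer.println("""{"Event":"SparkListenerApplicationStart"}""")
       finally writer.close()

       // Read back with the same explicit charset, so the result does not
       // depend on the JVM's default file.encoding.
       val reader = new BufferedReader(
         new InputStreamReader(Files.newInputStream(path), StandardCharsets.UTF_8))
       try println(reader.readLine())
       finally reader.close()
     }
   }
   ```

   The point of such a change is that both the writer and the reader name the charset, so event logs written on one machine decode identically on another regardless of JVM defaults; a migration-guide note would warn users who relied on the old platform-default behavior.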


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun commented on a change in pull request #25816: [SPARK-29107][SQL][TESTS] Port window.sql (Part 1)

2019-09-20 Thread GitBox
dongjoon-hyun commented on a change in pull request #25816: 
[SPARK-29107][SQL][TESTS] Port window.sql (Part 1)
URL: https://github.com/apache/spark/pull/25816#discussion_r326844126
 
 

 ##
 File path: sql/core/src/test/resources/sql-tests/inputs/pgSQL/window_part1.sql
 ##
 @@ -0,0 +1,343 @@
+-- Portions Copyright (c) 1996-2019, PostgreSQL Global Development Group
+--
+-- Window Functions Testing
+-- https://github.com/postgres/postgres/blob/REL_12_BETA3/src/test/regress/sql/window.sql
 
 Review comment:
   Unfortunately, `REL_12_BETA4` has been released.
   > https://github.com/postgres/postgres/blob/REL_12_BETA4/src/test/regress/sql/window.sql


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method bytecode size in BenchmarkQueryTest

2019-09-20 Thread GitBox
SparkQA commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method 
bytecode size in BenchmarkQueryTest
URL: https://github.com/apache/spark/pull/25788#issuecomment-533761615
 
 
   **[Test build #04 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/04/testReport)** for PR 25788 at commit [`9f32999`](https://github.com/apache/spark/commit/9f329992a7baec1ae04033b2e75b47f7210fb94f).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #25788: [SPARK-29084][SQL][TESTS] Check method bytecode size in BenchmarkQueryTest

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25788: [SPARK-29084][SQL][TESTS] 
Check method bytecode size in BenchmarkQueryTest
URL: https://github.com/apache/spark/pull/25788#issuecomment-533761321
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method bytecode size in BenchmarkQueryTest

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method 
bytecode size in BenchmarkQueryTest
URL: https://github.com/apache/spark/pull/25788#issuecomment-533761321
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #25788: [SPARK-29084][SQL][TESTS] Check method bytecode size in BenchmarkQueryTest

2019-09-20 Thread GitBox
AmplabJenkins removed a comment on issue #25788: [SPARK-29084][SQL][TESTS] 
Check method bytecode size in BenchmarkQueryTest
URL: https://github.com/apache/spark/pull/25788#issuecomment-533761324
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16186/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method bytecode size in BenchmarkQueryTest

2019-09-20 Thread GitBox
AmplabJenkins commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method 
bytecode size in BenchmarkQueryTest
URL: https://github.com/apache/spark/pull/25788#issuecomment-533761324
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/16186/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] maropu commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method bytecode size in BenchmarkQueryTest

2019-09-20 Thread GitBox
maropu commented on issue #25788: [SPARK-29084][SQL][TESTS] Check method 
bytecode size in BenchmarkQueryTest
URL: https://github.com/apache/spark/pull/25788#issuecomment-533761190
 
 
   retest this please


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] dongjoon-hyun closed pull request #25873: [SPARK-29192][TESTS] Extend BenchmarkBase to write JDK9+ results separately

2019-09-20 Thread GitBox
dongjoon-hyun closed pull request #25873: [SPARK-29192][TESTS] Extend 
BenchmarkBase to write JDK9+ results separately
URL: https://github.com/apache/spark/pull/25873
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] xuanyuanking commented on issue #25768: [SPARK-29063][SQL] Modify fillValue approach to support joined dataframe

2019-09-20 Thread GitBox
xuanyuanking commented on issue #25768: [SPARK-29063][SQL] Modify fillValue 
approach to support joined dataframe
URL: https://github.com/apache/spark/pull/25768#issuecomment-533761030
 
 
   Thanks for reviewing.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org


