[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-27 Thread QiangCai
Github user QiangCai commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-167393002
  
@srowen I have opened another pull request, so this one will be closed.
The new pull request is https://github.com/apache/spark/pull/10487





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-27 Thread QiangCai
Github user QiangCai closed the pull request at:

https://github.com/apache/spark/pull/10310





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-24 Thread QiangCai
Github user QiangCai commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-167117473
  
@srowen I may have made a mistake in Git Bash. Can I open another pull
request to resolve this problem?





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-24 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-167118628
  
I think you just need to rebase from master and force-push the result, but 
do what you need to.





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-23 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-166918642
  
@QiangCai thanks, that looks good, but this needs a rebase now.





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-23 Thread QiangCai
Github user QiangCai commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-166912120
  
@srowen I have modified all the code and tried to keep it consistent.

First, the vars `numPartsToTry` and `partsScanned` are now `Long`.
Second, because `p` should be a `Seq[Int]`, I have added `.toInt` in the
`until` statement.
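
A minimal sketch of the loop shape described above, assuming the counters are kept as Long; this is illustrative code, not the actual SparkPlan.executeTake body, and the values of totalParts, partsScanned, and numPartsToTry are made up:

    object TakeLoopSketch {
      def main(args: Array[String]): Unit = {
        // Hypothetical values chosen only to show that the Long arithmetic
        // stays in range even near Int.MaxValue.
        val totalParts: Int = Int.MaxValue       // partition count near the Int limit
        val partsScanned: Long = 2000000000L     // partitions scanned so far
        val numPartsToTry: Long = 4000000000L    // deliberately huge step

        // The sum is computed in Long, clamped to totalParts, and only then
        // narrowed back to Int, so the range bound cannot wrap around.
        val upper = math.min(partsScanned + numPartsToTry, totalParts.toLong).toInt
        val p: Seq[Int] = partsScanned.toInt until upper

        println(p.length)   // number of partitions that would be scanned this round
      }
    }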





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-20 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-166096808
  
@QiangCai I think that technically `var partsScanned = 0` in `take` is a
problem, since it is incremented by `numPartsToTry` and could overflow, causing
`partsScanned < totalParts` to always be true. This is a very rare case, where
`numPartsToTry` is large, `totalParts` is close to the max integer, and `n`
is very large.
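
A tiny, self-contained demonstration of the wrap-around described above, using made-up numbers; this is not the real executeTake loop, it only shows the Int increment overflowing and the loop guard staying true:

    object OverflowSketch {
      def main(args: Array[String]): Unit = {
        val totalParts = Int.MaxValue      // partition count close to the Int limit
        var partsScanned = 2147000000      // already close to Int.MaxValue
        val numPartsToTry = 1000000        // large step, as with a very large n

        partsScanned += numPartsToTry      // Int arithmetic wraps to a negative value
        println(partsScanned)              // -2146967296
        println(partsScanned < totalParts) // true, so the while loop would never exit
      }
    }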





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-19 Thread QiangCai
Github user QiangCai commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-165964135
  
Yes, I will check everything first.





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-18 Thread QiangCai
Github user QiangCai commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-165946596
  
Yes, I will change them all.





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-16 Thread rxin
Github user rxin commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-165376341
  
Can you also change take in RDD?






[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-16 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-165045403
  
**[Test build #2220 has finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2220/consoleFull)** for PR 10310 at commit [`56f151a`](https://github.com/apache/spark/commit/56f151a6193a8eee6bafa6fb224648271254b34f).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-16 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-165026607
  
**[Test build #2220 has started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2220/consoleFull)** for PR 10310 at commit [`56f151a`](https://github.com/apache/spark/commit/56f151a6193a8eee6bafa6fb224648271254b34f).





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread 3ourroom
Github user 3ourroom commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-164752595
  

NAVER - http://www.naver.com/

The mail you sent to 3ourr...@naver.com could not be delivered for the following reason:

The recipient has blocked your mail.









[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-164752297
  
Can one of the admins verify this patch?





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/10310#discussion_r47633144
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkPlan.scala ---
@@ -206,7 +206,7 @@ abstract class SparkPlan extends QueryPlan[SparkPlan] with Logging with Serializ
   numPartsToTry = math.max(0, numPartsToTry)  // guard against negative num of partitions
 
   val left = n - buf.size
-  val p = partsScanned until math.min(partsScanned + numPartsToTry, totalParts)
+  val p = partsScanned until math.min(partsScanned.toLong + numPartsToTry, totalParts).toInt
--- End diff --

Sounds good to me.
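
To make the one-line change above concrete, here is a standalone comparison of the old and new expressions with made-up values (not taken from a real plan); only the computation of the range's upper bound is shown:

    object MinClampSketch {
      def main(args: Array[String]): Unit = {
        val partsScanned  = 2000000000   // Int, for illustration
        val numPartsToTry = 1000000000   // Int, for illustration
        val totalParts    = Int.MaxValue

        // Old expression: the Int sum wraps before math.min ever sees it.
        val oldUpper = math.min(partsScanned + numPartsToTry, totalParts)
        println(oldUpper)                // -1294967296, a nonsense negative bound

        // New expression: the sum is computed in Long, clamped to totalParts,
        // and only then narrowed back to Int, so it stays in range.
        val newUpper = math.min(partsScanned.toLong + numPartsToTry, totalParts.toLong).toInt
        println(newUpper)                // 2147483647
      }
    }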





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-164758610
  
**[Test build #2217 has started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2217/consoleFull)** for PR 10310 at commit [`37acd54`](https://github.com/apache/spark/commit/37acd541e0447a46a3b7718a688c4312e86d13e6).





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread QiangCai
GitHub user QiangCai opened a pull request:

https://github.com/apache/spark/pull/10310

[SPARK-12340][SQL] Fix overstep the bounds of Int in SparkPlan.executeTake

Modifies partsScanned to partsScanned.toLong and changes the result of math.min
to Int.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/QiangCai/spark QiangCai

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/10310.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #10310


commit 37acd541e0447a46a3b7718a688c4312e86d13e6
Author: QiangCai 
Date:   2015-12-15T12:24:38Z

 Fix overstep the bounds of Int in SparkPlan.executeTake

Modifies partsScanned to partsScanned.toLong and changes the result of math.min
to Int.







[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread 3ourroom
Github user 3ourroom commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-164752088
  

NAVER - http://www.naver.com/

The mail you sent to 3ourr...@naver.com <[spark] [SPARK-12340][SQL] Fix overstep the bounds of Int in SparkPlan.executeTake (#10310)> could not be delivered for the following reason:

The recipient has blocked your mail.









[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-164785707
  
**[Test build #2217 has finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2217/consoleFull)** for PR 10310 at commit [`37acd54`](https://github.com/apache/spark/commit/37acd541e0447a46a3b7718a688c4312e86d13e6).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds the following public classes _(experimental)_:
   * `case class WrapOption(child: Expression) extends UnaryExpression`





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread QiangCai
Github user QiangCai commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-164967364
  
The number of partitions that have been scanned is the size of the seq `p`, not
`numPartsToTry`.
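
A short, hypothetical loop fragment illustrating this bookkeeping point (not the actual executeTake code, and the partition counts are invented): advancing by p.size accounts for the clamping at totalParts, whereas advancing by numPartsToTry would over-count on the last round:

    object ScanCountSketch {
      def main(args: Array[String]): Unit = {
        val totalParts    = 8    // tiny made-up partition count
        val numPartsToTry = 5L
        var partsScanned  = 0L

        while (partsScanned < totalParts) {
          val p = partsScanned.toInt until
            math.min(partsScanned + numPartsToTry, totalParts.toLong).toInt
          // ... the job would run on the partitions in p here ...
          partsScanned += p.size   // advance by what was actually scanned
          println(s"scanned ${p.size} partitions, total so far: $partsScanned")
        }
      }
    }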





[GitHub] spark pull request: [SPARK-12340][SQL] Fix overstep the bounds of ...

2015-12-15 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/10310#issuecomment-165026356
  
Yes, I think that also makes sense in the case where we hit the limit of
`totalParts`.

