[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-02-01 Thread huaxingao
Github user huaxingao closed the pull request at:

https://github.com/apache/spark/pull/10750


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-02-01 Thread huaxingao
Github user huaxingao commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-178119526
  
@viirya @HyukjinKwon @rxin 
Thank you all very much for your comments.  I will change JDBCRelation to 
implement CatalystScan, and then  directly access Catalyst expressions in 
JDBCRDD. I will close this PR and submit a new one. 





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-26 Thread viirya
Github user viirya commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-175400704
  
I think most expressions commonly used in filters (such as >, >=, <, <=, ==, string ops, and arithmetic 
ops) are relatively stable now. Maybe we can 
let the JDBC datasource implement `CatalystScan` and process these expressions.
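
A rough sketch of the idea, with hypothetical simplified types standing in for Catalyst expressions and for Spark's actual `CatalystScan` trait (not the real API):

```scala
// Hypothetical mini versions of Catalyst expressions and the CatalystScan
// idea: the relation receives raw expressions and compiles what it can.
sealed trait Expr
case class Attr(name: String) extends Expr
case class IntLit(v: Int) extends Expr
case class Add(l: Expr, r: Expr) extends Expr
case class GreaterThan(l: Expr, r: Expr) extends Expr

trait CatalystScanLike {
  // Receives expressions directly instead of pre-translated sources.Filter.
  def compileFilters(filters: Seq[Expr]): Seq[String]
}

object JdbcLikeRelation extends CatalystScanLike {
  private def toSql(e: Expr): Option[String] = e match {
    case Attr(n)   => Some(n)
    case IntLit(v) => Some(v.toString)
    case Add(l, r) =>
      for (ls <- toSql(l); rs <- toSql(r)) yield s"($ls + $rs)"
    case GreaterThan(l, r) =>
      for (ls <- toSql(l); rs <- toSql(r)) yield s"$ls > $rs"
  }
  // Filters that cannot be compiled are dropped here and would have to be
  // re-evaluated on the Spark side.
  def compileFilters(filters: Seq[Expr]): Seq[String] = filters.flatMap(toSql)
}
```

For example, `compileFilters(Seq(GreaterThan(Add(Attr("c1"), Attr("c2")), IntLit(10))))` would produce the pushable fragment `(c1 + c2) > 10`.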





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171497899
  
Can one of the admins verify this patch?





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/10750#discussion_r49681620
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
 ---
@@ -480,10 +480,120 @@ private[sql] object DataSourceStrategy extends 
Strategy with Logging {
   case expressions.Contains(a: Attribute, Literal(v: UTF8String, 
StringType)) =>
 Some(sources.StringContains(a.name, v.toString))
 
+  case expressions.BinaryComparison(BinaryArithmetic(left, right), 
Literal(v, t)) =>
+translateArithemiticOPFilter (predicate)
+  case expressions.BinaryComparison(Literal(v, t), 
BinaryArithmetic(left, right)) =>
+translateArithemiticOPFilter (predicate)
+
   case _ => None
 }
   }
 
+  private def translateArithemiticOPFilter(predicate: Expression): 
Option[Filter] = {
+predicate match {
+  case expressions.EqualTo(Add(left, right), Literal(v, t)) =>
+Some(sources.ArithmeticOPEqualTo(Add(left, right), 
convertToScala(v, t)))
--- End diff --

Hm.. I see, but it looks to me like this might have to hide `expression._`. As 
written, `expression._` becomes accessible at the datasource level. I believe the 
reason `sources._` was introduced is to hide `expression._`, which has been 
changing rapidly from version to version.





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread rxin
Github user rxin commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171519191
  
Yes I think using the internal expression API makes more sense. We don't 
want to add too many expressions to the external data source API.






[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread viirya
Github user viirya commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171527022
  
Indeed, continuing to add more filters will be a problem. If we can 
directly pass Catalyst expressions to the JDBC datasource, that would be better.





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/10750#discussion_r49681409
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/sources/filters.scala ---
@@ -142,3 +144,43 @@ case class StringEndsWith(attribute: String, value: 
String) extends Filter
  * @since 1.3.1
  */
 case class StringContains(attribute: String, value: String) extends Filter
+
+/**
+ * A filter that evaluates to `true` iff the Arithmetic operation 
evaluates to a value
+ * equal to `value`.
+ *
+ * @since 2.0
--- End diff --

Should be 2.0.0.





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/10750#discussion_r49681384
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCSuite.scala 
---
@@ -203,6 +203,7 @@ class JDBCSuite extends SparkFunSuite
 assert(stripSparkFilter(sql("SELECT * FROM foobar WHERE NAME LIKE 
'%re%'")).collect().size == 1)
 assert(stripSparkFilter(sql("SELECT * FROM nulltypes WHERE A IS 
NULL")).collect().size == 1)
 assert(stripSparkFilter(sql("SELECT * FROM nulltypes WHERE A IS NOT 
NULL")).collect().size == 0)
+assert(stripSparkFilter(sql("SELECT * FROM inttypes WHERE (A+C)*D-A = 
15")).collect().size == 1)
--- End diff --

Can you add more tests?





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/10750#discussion_r49681352
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
 ---
@@ -223,11 +223,57 @@ private[sql] object JDBCRDD extends Logging {
 } else {
   null
 }
+  case ArithmeticOPEqualTo(operation, value) =>
+getArithmeticString(operation).get + s" =  ${compileValue(value)}"
--- End diff --

When `getArithmeticString` returns `None`, you will get 
`java.util.NoSuchElementException` here.
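
The fix the comment points at can be sketched like this, with simplified stand-ins for `getArithmeticString` and `compileValue` (not the actual Spark code): mapping over the `Option` propagates `None` instead of throwing.

```scala
// Simplified stand-in: returns None for operations it cannot compile.
def getArithmeticString(op: String): Option[String] =
  if (op.nonEmpty) Some(op) else None

def compileValue(v: Any): String = v.toString

// Calling .get on a None throws java.util.NoSuchElementException;
// .map threads the Option through and yields None for the whole filter.
def compileEqualTo(operation: String, value: Any): Option[String] =
  getArithmeticString(operation).map(op => s"$op = ${compileValue(value)}")
```

With this shape, `compileEqualTo("A + B", 15)` gives `Some("A + B = 15")`, while `compileEqualTo("", 15)` gives `None` rather than throwing.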





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread HyukjinKwon
Github user HyukjinKwon commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171513363
  
If we keep solving this in `DataSourceStrategy` in this way, I think we 
should resolve the operators for the other datasources as well. For that, dealing with 
`Cast` ([SPARK-9182](https://issues.apache.org/jira/browse/SPARK-9182)) might 
have to be done first. 





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread huaxingao
Github user huaxingao commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171498490
  
@viirya 
I changed the code based on your suggestion.  Could you please review 
again? 
Thanks a lot for your help!!





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/10750#discussion_r49679753
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
 ---
@@ -480,10 +480,120 @@ private[sql] object DataSourceStrategy extends 
Strategy with Logging {
   case expressions.Contains(a: Attribute, Literal(v: UTF8String, 
StringType)) =>
 Some(sources.StringContains(a.name, v.toString))
 
+  case expressions.BinaryComparison(BinaryArithmetic(left, right), 
Literal(v, t)) =>
+translateArithemiticOPFilter (predicate)
+  case expressions.BinaryComparison(Literal(v, t), 
BinaryArithmetic(left, right)) =>
+translateArithemiticOPFilter (predicate)
+
   case _ => None
 }
   }
 
+  private def translateArithemiticOPFilter(predicate: Expression): 
Option[Filter] = {
+predicate match {
+  case expressions.EqualTo(Add(left, right), Literal(v, t)) =>
+Some(sources.ArithmeticOPEqualTo(Add(left, right), 
convertToScala(v, t)))
--- End diff --

As described in 
[SPARK-10195](https://issues.apache.org/jira/browse/SPARK-10195), it looks like the 
data sources API now exposes Catalyst's internal types through its Filter 
interfaces. I think this might have to be hidden.





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread HyukjinKwon
Github user HyukjinKwon commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171513129
  
Since `sources.Filter` is shared with Parquet, ORC, etc., I think the 
arithmetic operators might have to be resolved in `DataSourceStrategy` itself. 

AFAIK, Parquet and ORC do not support arithmetic operators, so they would 
have to convert on the Spark side anyway if we support this in this 
way. So, for this case, I think the operators might have to be resolved here.

I believe we might be better off resolving this issue by modifying `CatalystScan` as 
suggested by @liancheng in 
[SPARK-9182](https://issues.apache.org/jira/browse/SPARK-9182); I have filed 
[SPARK-12126](https://issues.apache.org/jira/browse/SPARK-12126) for this.





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread HyukjinKwon
Github user HyukjinKwon commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171517766
  
@huaxingao please change the title so that it does not end with "…"





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread huaxingao
GitHub user huaxingao opened a pull request:

https://github.com/apache/spark/pull/10750

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC …

…layer

For an arithmetic operator in a WHERE clause such as
select * from table where c1 + c2 > 10
the predicate c1 + c2 > 10 is currently evaluated at the Spark layer.
This change pushes it down to the JDBC layer so it is evaluated in the database.
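
The intended effect can be illustrated with a small hypothetical helper (`buildJdbcQuery` is illustrative, not Spark code): with pushdown, the predicate text is appended to the query sent over JDBC, so the database evaluates it instead of Spark.

```scala
// With a pushed predicate, the WHERE fragment rides along in the JDBC query;
// without one, Spark fetches all rows and filters them itself afterwards.
def buildJdbcQuery(table: String, pushedWhere: Option[String]): String =
  pushedWhere match {
    case Some(w) => s"SELECT * FROM $table WHERE $w"
    case None    => s"SELECT * FROM $table" // Spark-side filtering
  }
```

For example, `buildJdbcQuery("table", Some("c1 + c2 > 10"))` yields `SELECT * FROM table WHERE c1 + c2 > 10`.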

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/huaxingao/spark spark__12506

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/10750.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #10750


commit 944bee6fe204e1cb12f3c5c57d6e09ad580905bd
Author: Huaxin Gao 
Date:   2016-01-13T14:04:20Z

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC layer







[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/10750#discussion_r49680909
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
 ---
@@ -480,10 +480,120 @@ private[sql] object DataSourceStrategy extends 
Strategy with Logging {
   case expressions.Contains(a: Attribute, Literal(v: UTF8String, 
StringType)) =>
 Some(sources.StringContains(a.name, v.toString))
 
+  case expressions.BinaryComparison(BinaryArithmetic(left, right), 
Literal(v, t)) =>
+translateArithemiticOPFilter (predicate)
+  case expressions.BinaryComparison(Literal(v, t), 
BinaryArithmetic(left, right)) =>
+translateArithemiticOPFilter (predicate)
+
   case _ => None
 }
   }
 
+  private def translateArithemiticOPFilter(predicate: Expression): 
Option[Filter] = {
+predicate match {
+  case expressions.EqualTo(Add(left, right), Literal(v, t)) =>
+Some(sources.ArithmeticOPEqualTo(Add(left, right), 
convertToScala(v, t)))
--- End diff --

I took a look at SPARK-10195. It looks like it deals with the issue of 
exposing internal data types, and it uses `convertToScala` to convert those 
internal types to their Scala versions. Since `convertToScala` is used here to 
convert the values, I think this is not the same problem.





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread HyukjinKwon
Github user HyukjinKwon commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171527726
  
@viirya Yes, I think so. But the reason I did not try that is that 
`expression._` changes rapidly, which would force a datasource implemented via 
`CatalystScan` to update its code version by version. I believe 
this is also why the Parquet datasource moved away from its implementation of 
`CatalystScan`.

So, maybe we should try to find a better solution for this.







[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread huaxingao
Github user huaxingao closed the pull request at:

https://github.com/apache/spark/pull/10505





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread huaxingao
Github user huaxingao commented on a diff in the pull request:

https://github.com/apache/spark/pull/10505#discussion_r49678184
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
 ---
@@ -485,6 +486,74 @@ private[sql] object DataSourceStrategy extends 
Strategy with Logging {
   }
 
   /**
+   * Convert add predicate such as C1 + C2 + C3 to string
+   */
+  private[this] def getAddString (predicate: Expression): String = {
+predicate match {
+  case expressions.Add(left, right) =>
+  {
+val leftString = left match {
+  case a: Attribute => a.name
+  case add: Add => getAddString (add)
+  case _ => None
+}
+val rightString = right match {
+  case a: Attribute => a.name
+  case add: Add => getAddString (add)
+  case _ => None
+}
+leftString + " + " + rightString
+  }
+}
+  }
+
+  /**
+   * Tries to translate a Catalyst [[Expression]] into data source 
[[Filter]].
--- End diff --

@viirya 
Thank you very much for your comments.  I changed the code based on your 
suggestion.  Since the new code is quite different from the original, I will 
close this PR and submit a new one. 





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-13 Thread HyukjinKwon
Github user HyukjinKwon commented on the pull request:

https://github.com/apache/spark/pull/10750#issuecomment-171515046
  
Actually, I suggested a similar approach in 
[SPARK-9182](https://issues.apache.org/jira/browse/SPARK-9182). If we keep 
adding filters in this way, we could end up converting every expressions.Filter 
to a sources.Filter. That would effectively mean writing a new expression library, 
which might not be worth the effort.





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-08 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/10505#discussion_r49172369
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
 ---
@@ -485,6 +486,74 @@ private[sql] object DataSourceStrategy extends 
Strategy with Logging {
   }
 
   /**
+   * Convert add predicate such as C1 + C2 + C3 to string
+   */
+  private[this] def getAddString (predicate: Expression): String = {
+predicate match {
+  case expressions.Add(left, right) =>
+  {
+val leftString = left match {
+  case a: Attribute => a.name
+  case add: Add => getAddString (add)
+  case _ => None
+}
+val rightString = right match {
+  case a: Attribute => a.name
+  case add: Add => getAddString (add)
+  case _ => None
+}
+leftString + " + " + rightString
+  }
+}
+  }
+
+  /**
+   * Tries to translate a Catalyst [[Expression]] into data source 
[[Filter]].
--- End diff --

This is heavily bound to JDBCRelation, as is the getAddString 
method. The first parameter to `sources.EqualTo` (and to the other filters) is an 
attribute name; it is weird to have C1 + C2 + C3 passed in.
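
A safer shape for the `getAddString` idea in the diff above (hypothetical mini-types standing in for Catalyst's): returning `Option[String]` makes an untranslatable sub-expression abort the whole translation, instead of the `case _ => None` branches in the diff concatenating the literal string "None" into the SQL text.

```scala
sealed trait Expr
case class Attribute(name: String) extends Expr
case class Add(left: Expr, right: Expr) extends Expr
case class Unsupported() extends Expr // anything we cannot push down

def getAddString(e: Expr): Option[String] = e match {
  case Attribute(n) => Some(n)
  case Add(l, r) =>
    // Both sides must translate; otherwise the whole predicate is None.
    for (ls <- getAddString(l); rs <- getAddString(r)) yield s"$ls + $rs"
  case Unsupported() => None
}
```

Here `getAddString(Add(Attribute("C1"), Add(Attribute("C2"), Attribute("C3"))))` yields `Some("C1 + C2 + C3")`, and any `Unsupported()` leaf makes the result `None`.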





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-08 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/10505#discussion_r49171837
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
 ---
@@ -500,11 +569,16 @@ private[sql] object DataSourceStrategy extends 
Strategy with Logging {
 // For conciseness, all Catalyst filter expressions of type 
`expressions.Expression` below are
 // called `predicate`s, while all data source filters of type 
`sources.Filter` are simply called
 // `filter`s.
+var jdbcRelation = false
--- End diff --

You can do it like:

val jdbcRelation = relation match {
  case _: JDBCRelation => true
  case _ => false
}





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2016-01-04 Thread huaxingao
Github user huaxingao commented on the pull request:

https://github.com/apache/spark/pull/10505#issuecomment-168901512
  
@rxin 
I am not sure if my approach is OK.  Could you please take a quick look 
when you have time and let me know what you think?  Thank you very much for 
your help!!!





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-29 Thread huaxingao
GitHub user huaxingao opened a pull request:

https://github.com/apache/spark/pull/10505

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC …

…layer

For an arithmetic operator in a WHERE clause such as
select * from table where c1 + c2 > 10
the predicate c1 + c2 > 10 is currently evaluated at the Spark layer.
This change pushes it down to the JDBC layer so it is evaluated in the database.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/huaxingao/spark spark12506

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/10505.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #10505


commit b10310898953e8830898c0ac95c861c8f3c88fa5
Author: Huaxin Gao 
Date:   2015-12-27T20:45:52Z

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC layer







[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-29 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/10505#issuecomment-167746565
  
Can one of the admins verify this patch?





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-29 Thread huaxingao
Github user huaxingao commented on the pull request:

https://github.com/apache/spark/pull/10505#issuecomment-167746195
  
I only added the + operator for now.  If the change is accepted, I will also 
add -, *, and /. 
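
Extending to the remaining operators would mostly mean growing the translation table; a sketch with hypothetical mini-types (not the actual patch):

```scala
sealed trait BinOp
case object Plus  extends BinOp
case object Minus extends BinOp
case object Times extends BinOp
case object Div   extends BinOp

// Maps each arithmetic node to its SQL symbol; adding an operator is one row.
def sqlSymbol(op: BinOp): String = op match {
  case Plus  => "+"
  case Minus => "-"
  case Times => "*"
  case Div   => "/"
}
```

Each new operator then only needs one case here plus the corresponding match arm in the predicate translation.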





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-29 Thread wilson888888888
Github user wilson8 closed the pull request at:

https://github.com/apache/spark/pull/10503





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-29 Thread wilson888888888
Github user wilson8 commented on the pull request:

https://github.com/apache/spark/pull/10503#issuecomment-167742432
  
Used the wrong ID.  Will close for now and open another one.  Sorry for the 
confusion 





[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-28 Thread wilson888888888
GitHub user wilson8 opened a pull request:

https://github.com/apache/spark/pull/10503

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC …

…layer
For an arithmetic operator in a WHERE clause such as
select * from table where c1 + c2 > 10
the predicate c1 + c2 > 10 is currently evaluated at the Spark layer.
This change pushes it down to the JDBC layer so it is evaluated in the database.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/huaxingao/spark spark-12506

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/10503.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #10503


commit 09e2135d6e5abef21328941d73031109e6d4d4b6
Author: Huaxin Gao 
Date:   2015-12-27T19:58:12Z

[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC layer







[GitHub] spark pull request: [SPARK-12506][SQL]push down WHERE clause arith...

2015-12-28 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/10503#issuecomment-167741641
  
Can one of the admins verify this patch?

