Github user huaxingao closed the pull request at:
https://github.com/apache/spark/pull/10750
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
Github user huaxingao commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-178119526
@viirya @HyukjinKwon @rxin
Thank you all very much for your comments. I will change JDBCRelation to
implement CatalystScan, and then directly access Catalyst
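The `CatalystScan` route mentioned above can be sketched without Spark on the classpath. Everything below is a simplified stand-in, not Spark's real API: the actual trait is `org.apache.spark.sql.sources.CatalystScan`, whose `buildScan` receives `Seq[Attribute]` and `Seq[Expression]`. The toy `Expr` and `Row` types here only model the shape of the change, namely that the relation sees raw expressions instead of pre-translated `sources.Filter`s:

```scala
// Simplified stand-ins, NOT Spark's real classes; they only model the shape
// of a CatalystScan-style relation that receives raw expressions.
object CatalystScanSketch {
  sealed trait Expr
  case class Col(name: String)     extends Expr
  case class Lit(v: Int)           extends Expr
  case class Add(l: Expr, r: Expr) extends Expr
  case class Gt(l: Expr, r: Expr)  extends Expr

  type Row = Map[String, Int]

  trait CatalystScanLike {
    // The relation sees raw expressions, so arithmetic like c1 + c2 > 10
    // survives instead of being dropped when it has no sources.Filter form.
    def buildScan(requiredColumns: Seq[String], filters: Seq[Expr]): Seq[Row]
  }

  class ToyRelation(rows: Seq[Row]) extends CatalystScanLike {
    private def eval(e: Expr, r: Row): Int = e match {
      case Col(n)    => r(n)
      case Lit(v)    => v
      case Add(l, x) => eval(l, r) + eval(x, r)
      case _         => sys.error("not a value expression")
    }
    private def holds(e: Expr, r: Row): Boolean = e match {
      case Gt(l, x) => eval(l, r) > eval(x, r)
      case _        => true // unknown predicates are conservatively kept
    }
    def buildScan(requiredColumns: Seq[String], filters: Seq[Expr]): Seq[Row] =
      rows.filter(r => filters.forall(f => holds(f, r)))
          .map(_.filter { case (k, _) => requiredColumns.contains(k) })
  }
}
```

A real JDBC implementation would not evaluate the expressions itself as this toy does; it would translate them into a SQL fragment for the remote database.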
Github user viirya commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-175400704
I think most expressions (such as >, >=, <, <=, ==, string ops, arithmetic
ops) that are commonly used in filters are relatively stable now. Maybe we can
let JDBC
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171497899
Can one of the admins verify this patch?
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/10750#discussion_r49681620
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
---
@@ -480,10 +480,120 @@ private[sql] object
Github user rxin commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171519191
Yes I think using the internal expression API makes more sense. We don't
want to add too many expressions to the external data source API.
Github user viirya commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171527022
Indeed, continuing to add more filters will be a problem. If we can
pass Catalyst expressions directly to the JDBC datasource, that would be better.
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/10750#discussion_r49681409
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/sources/filters.scala ---
@@ -142,3 +144,43 @@ case class StringEndsWith(attribute: String, value:
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/10750#discussion_r49681384
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCSuite.scala
---
@@ -203,6 +203,7 @@ class JDBCSuite extends SparkFunSuite
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/10750#discussion_r49681352
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCRDD.scala
---
@@ -223,11 +223,57 @@ private[sql] object JDBCRDD
Github user HyukjinKwon commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171513363
If we keep solving this in `DataSourceStrategy` in this way, I think we
should resolve the operators for other datasources as well. For this, dealing with
`Cast`
Github user huaxingao commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171498490
@viirya
I changed the code based on your suggestion. Could you please review
again?
Thanks a lot for your help!!
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/10750#discussion_r49679753
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
---
@@ -480,10 +480,120 @@ private[sql] object
Github user HyukjinKwon commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171513129
Since `source.Filter` is shared with Parquet, ORC, etc., I think this
might have to resolve the arithmetic operators in `DataSourceStrategy` itself.
Github user HyukjinKwon commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171517766
@huaxingao please change the title not to end with `…`
GitHub user huaxingao opened a pull request:
https://github.com/apache/spark/pull/10750
[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC layer
For an arithmetic operator in a WHERE clause, such as
select * from table where c1 + c2 > 10
Currently
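The idea of the PR, turning a predicate like `c1 + c2 > 10` into a SQL fragment that the database evaluates, can be illustrated with a toy compiler. All names below are hypothetical, not the PR's actual code:

```scala
// Illustrative only: a tiny compiler from a toy expression tree to the SQL
// WHERE fragment that a JDBC pushdown would ship to the database.
object ArithPushdown {
  sealed trait Expr
  case class Column(name: String)   extends Expr
  case class IntLit(v: Int)         extends Expr
  case class Plus(l: Expr, r: Expr) extends Expr
  case class Gt(l: Expr, r: Expr)   extends Expr

  def compile(e: Expr): String = e match {
    case Column(n)  => n
    case IntLit(v)  => v.toString
    case Plus(l, r) => s"(${compile(l)} + ${compile(r)})" // parenthesize for precedence
    case Gt(l, r)   => s"${compile(l)} > ${compile(r)}"
  }
}
```

Here `ArithPushdown.compile(Gt(Plus(Column("c1"), Column("c2")), IntLit(10)))` yields `(c1 + c2) > 10`, so the predicate runs on the database side instead of being evaluated by Spark after fetching every row.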
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/10750#discussion_r49680909
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
---
@@ -480,10 +480,120 @@ private[sql] object
Github user HyukjinKwon commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171527726
@viirya Yes, I think so. But the reason I did not give this a try
is that `expression._` is changing rapidly, which could require updating the code
at the
Github user huaxingao closed the pull request at:
https://github.com/apache/spark/pull/10505
Github user huaxingao commented on a diff in the pull request:
https://github.com/apache/spark/pull/10505#discussion_r49678184
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
---
@@ -485,6 +486,74 @@ private[sql] object
Github user HyukjinKwon commented on the pull request:
https://github.com/apache/spark/pull/10750#issuecomment-171515046
Actually, I also suggested an approach similar to this in
[SPARK-9182](https://issues.apache.org/jira/browse/SPARK-9182). If we keep
adding filters in this way, this
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/10505#discussion_r49172369
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
---
@@ -485,6 +486,74 @@ private[sql] object
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/10505#discussion_r49171837
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
---
@@ -500,11 +569,16 @@ private[sql] object
Github user huaxingao commented on the pull request:
https://github.com/apache/spark/pull/10505#issuecomment-168901512
@rxin
I am not sure if my approach is OK. Could you please take a quick look
when you have time and let me know what you think? Thank you very much for
your
GitHub user huaxingao opened a pull request:
https://github.com/apache/spark/pull/10505
[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC layer
For an arithmetic operator in a WHERE clause, such as
select * from table where c1 + c2 > 10
Currently
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10505#issuecomment-167746565
Can one of the admins verify this patch?
Github user huaxingao commented on the pull request:
https://github.com/apache/spark/pull/10505#issuecomment-167746195
I only added the + operator for now. If the change is accepted, I will also
add -, *, and /.
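Going from `+` alone to `-`, `*`, and `/` need not mean one case per operator; a single binary node carrying the operator symbol covers all four. A hypothetical sketch (names are illustrative, not the PR's code):

```scala
// Illustrative sketch: one node covers +, -, * and /, so adding an operator
// means adding a symbol to the set, not a new case class.
object ArithOps {
  sealed trait Expr
  case class Ref(name: String)                     extends Expr
  case class Num(v: Int)                           extends Expr
  case class BinOp(op: String, l: Expr, r: Expr)   extends Expr

  val supportedOps = Set("+", "-", "*", "/")

  def toSql(e: Expr): String = e match {
    case Ref(n) => n
    case Num(v) => v.toString
    case BinOp(op, l, r) =>
      require(supportedOps(op), s"unsupported operator: $op")
      // Parenthesize so operator precedence survives in the generated SQL.
      s"(${toSql(l)} $op ${toSql(r)})"
  }
}
```

Note that `/` may need extra care in a real implementation: integer-division semantics can differ between Spark and the remote database, so pushing it down blindly could change results.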
Github user wilson8 closed the pull request at:
https://github.com/apache/spark/pull/10503
Github user wilson8 commented on the pull request:
https://github.com/apache/spark/pull/10503#issuecomment-167742432
Used the wrong ID. Will close for now and open another one. Sorry for the
confusion.
GitHub user wilson8 opened a pull request:
https://github.com/apache/spark/pull/10503
[SPARK-12506][SQL]push down WHERE clause arithmetic operator to JDBC layer
For an arithmetic operator in a WHERE clause, such as
select * from table where c1 + c2 > 10
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10503#issuecomment-167741641
Can one of the admins verify this patch?