[GitHub] spark pull request #22535: [SPARK-17636][SQL][WIP] Parquet predicate pushdown in nested fields

2018-09-24 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/22535#discussion_r219948634
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala ---
@@ -437,53 +436,63 @@ object DataSourceStrategy {
    * @return a `Some[Filter]` if the input [[Expression]] is convertible, otherwise a `None`.
    */
   protected[sql] def translateFilter(predicate: Expression): Option[Filter] = {
+    // Recursively try to find an attribute name in top level or struct that can be pushed down.
+    def attrName(e: Expression): Option[String] = e match {
+      // In Spark and many data sources such as parquet, dots are used as a column path delimiter;
+      // thus, we don't push down such filters.
+      case a: Attribute if !a.name.contains(".") =>
+        Some(a.name)
+      case s: GetStructField if !s.childSchema(s.ordinal).name.contains(".") =>
+        attrName(s.child).map(_ + s".${s.childSchema(s.ordinal).name}")
+      case _ =>
+        None
+    }
+
     predicate match {
-      case expressions.EqualTo(a: Attribute, Literal(v, t)) =>
-        Some(sources.EqualTo(a.name, convertToScala(v, t)))
-      case expressions.EqualTo(Literal(v, t), a: Attribute) =>
-        Some(sources.EqualTo(a.name, convertToScala(v, t)))
-
-      case expressions.EqualNullSafe(a: Attribute, Literal(v, t)) =>
-        Some(sources.EqualNullSafe(a.name, convertToScala(v, t)))
-      case expressions.EqualNullSafe(Literal(v, t), a: Attribute) =>
-        Some(sources.EqualNullSafe(a.name, convertToScala(v, t)))
-
-      case expressions.GreaterThan(a: Attribute, Literal(v, t)) =>
-        Some(sources.GreaterThan(a.name, convertToScala(v, t)))
-      case expressions.GreaterThan(Literal(v, t), a: Attribute) =>
-        Some(sources.LessThan(a.name, convertToScala(v, t)))
-
-      case expressions.LessThan(a: Attribute, Literal(v, t)) =>
-        Some(sources.LessThan(a.name, convertToScala(v, t)))
-      case expressions.LessThan(Literal(v, t), a: Attribute) =>
-        Some(sources.GreaterThan(a.name, convertToScala(v, t)))
-
-      case expressions.GreaterThanOrEqual(a: Attribute, Literal(v, t)) =>
-        Some(sources.GreaterThanOrEqual(a.name, convertToScala(v, t)))
-      case expressions.GreaterThanOrEqual(Literal(v, t), a: Attribute) =>
-        Some(sources.LessThanOrEqual(a.name, convertToScala(v, t)))
-
-      case expressions.LessThanOrEqual(a: Attribute, Literal(v, t)) =>
-        Some(sources.LessThanOrEqual(a.name, convertToScala(v, t)))
-      case expressions.LessThanOrEqual(Literal(v, t), a: Attribute) =>
-        Some(sources.GreaterThanOrEqual(a.name, convertToScala(v, t)))
-
-      case expressions.InSet(a: Attribute, set) =>
-        val toScala = CatalystTypeConverters.createToScalaConverter(a.dataType)
-        Some(sources.In(a.name, set.toArray.map(toScala)))
+      case expressions.EqualTo(e: Expression, Literal(v, t)) =>
--- End diff --

This PR will be a good performance improvement for Spark 2.5.0.
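
For context, here is a minimal, self-contained sketch of the name-resolution idea in the attrName helper above, using simplified stand-in types (Attr, Field) rather than Spark's actual Attribute and GetStructField classes; the stand-in names are illustrative only:

// Simplified stand-ins for Catalyst expressions (not Spark's real classes).
sealed trait Expr
case class Attr(name: String) extends Expr                     // top-level column reference
case class Field(child: Expr, fieldName: String) extends Expr  // struct field access

// Same shape as attrName() in the diff: walk down the expression and build a
// dot-separated column path, bailing out if any segment itself contains a dot,
// since '.' is the column-path delimiter in Spark and many data sources.
def attrName(e: Expr): Option[String] = e match {
  case Attr(n) if !n.contains(".")         => Some(n)
  case Field(child, n) if !n.contains(".") => attrName(child).map(_ + "." + n)
  case _                                   => None
}

assert(attrName(Field(Attr("name"), "first")) == Some("name.first"))
assert(attrName(Attr("a.b")).isEmpty)  // dotted top-level name is not pushed down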


---




[GitHub] spark pull request #22535: [SPARK-17636][SQL][WIP] Parquet predicate pushdown in nested fields

2018-09-24 Thread dbtsai
GitHub user dbtsai opened a pull request:

https://github.com/apache/spark/pull/22535

[SPARK-17636][SQL][WIP] Parquet predicate pushdown in nested fields

## What changes were proposed in this pull request?

Support Parquet predicate pushdown in nested fields
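
For illustration, a minimal sketch of the kind of query this enables, assuming a local SparkSession built against a Spark version with this change; the case classes, output path, and column names below are hypothetical:

import org.apache.spark.sql.SparkSession

// Hypothetical nested schema: name is a struct<first: string, last: string>.
case class FullName(first: String, last: String)
case class Contact(id: Long, name: FullName)

val spark = SparkSession.builder().master("local[*]").appName("nested-pushdown-example").getOrCreate()
import spark.implicits._

val path = "/tmp/contacts_parquet"  // hypothetical location
Seq(Contact(1L, FullName("Ann", "Lee")), Contact(2L, FullName("Bob", "Kim")))
  .toDS()
  .write.mode("overwrite").parquet(path)

// Without nested pushdown, a predicate on name.first is evaluated only after the
// rows are read; with this change it can be translated into a data source Filter
// on the column path "name.first" and handed to the Parquet reader.
spark.read.parquet(path)
  .filter($"name.first" === "Ann")
  .explain(true)  // inspect PushedFilters in the physical plan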

## How was this patch tested?

Covered by existing tests; new tests are added.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/dbtsai/spark parquetNesting

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/22535.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #22535


commit c95706f60e4d576caca78a32000d4a7bbb12c141
Author: DB Tsai 
Date:   2018-09-06T00:22:09Z

Nested parquet pushdown




---
