Github user yucai commented on a diff in the pull request:
https://github.com/apache/spark/pull/20851#discussion_r175293026
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFilterSuite.scala ---
@@ -313,6 +314,36 @@ class ParquetFilterSuite extends QueryTest with ParquetTest with SharedSQLContex
}
}
+  test("filter pushdown - date") {
+    implicit class IntToDate(int: Int) {
+      def d: Date = new Date(Date.valueOf("2018-03-01").getTime + 24 * 60 * 60 * 1000 * (int - 1))
+    }
+
+    withParquetDataFrame((1 to 4).map(i => Tuple1(i.d))) { implicit df =>
--- End diff ---
Could you kindly give me some examples of what kind of boundary tests you have in mind? I checked the Parquet integer push down and ORC date type push down tests, and it seems like I have covered all of their cases.
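
For example, do you mean a boundary-style check at the edges of the generated date range? Just my guess, a rough sketch (not from this PR), reusing the suite's existing `withParquetDataFrame` / `checkFilterPredicate` helpers and the `IntToDate` implicit above:

    // Rough sketch only: exercise the edges of the 4-row date range,
    // assuming the suite's checkFilterPredicate helper and the IntToDate implicit.
    withParquetDataFrame((1 to 4).map(i => Tuple1(i.d))) { implicit df =>
      // A strict bound at the minimum date should select no rows.
      checkFilterPredicate('_1 < 1.d, classOf[Lt[_]], Seq.empty[Row])
      // A non-strict bound at the maximum date should select every row.
      checkFilterPredicate('_1 <= 4.d, classOf[LtEq[_]], (1 to 4).map(i => Row(i.d)))
    }

If that is what you mean, I can add similar cases; otherwise please point me to an example.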
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]