Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/21603#discussion_r202418387
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala ---
@@ -376,7 +374,8 @@ class ParquetFileFormat
         // Collects all converted Parquet filter predicates. Notice that not all predicates can be
         // converted (`ParquetFilters.createFilter` returns an `Option`). That's why a `flatMap`
         // is used here.
-        .flatMap(new ParquetFilters(pushDownDate, pushDownStringStartWith)
+        .flatMap(new ParquetFilters(pushDownDate, pushDownStringStartWith,
--- End diff ---
let us create `val parquetFilters = new ParquetFilters(pushDownDate, pushDownStringStartWith, pushDownInFilterThreshold)` and reuse it in the `flatMap` call
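
Roughly, the suggested refactor could look like the sketch below (a sketch only: the surrounding `pushed` expression, the `createFilter(parquetSchema, _)` call, and the `FilterApi.and` reduction are assumed from the diff context and may not match the file exactly):

```scala
// Build the filter converter once instead of constructing it inside the flatMap.
val parquetFilters = new ParquetFilters(
  pushDownDate, pushDownStringStartWith, pushDownInFilterThreshold)

// Not all predicates can be converted (`createFilter` returns an `Option`),
// hence the `flatMap`.
val pushed = filters
  .flatMap(parquetFilters.createFilter(parquetSchema, _))
  .reduceOption(FilterApi.and)
```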
---