Github user wangyum commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21556#discussion_r200813391
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFilters.scala ---
    @@ -82,6 +120,30 @@ private[parquet] class ParquetFilters(pushDownDate: Boolean, pushDownStartWith:
           (n: String, v: Any) => FilterApi.eq(
             intColumn(n),
             Option(v).map(date => dateToDays(date.asInstanceOf[Date]).asInstanceOf[Integer]).orNull)
    +
    +    case ParquetSchemaType(DECIMAL, INT32, decimal) if pushDownDecimal =>
    --- End diff ---
    
    Add a check method to `canMakeFilterOn` and add a test case:
    ```scala
        val decimal = new JBigDecimal(10).setScale(scale)
        assert(decimal.scale() === scale)
        assertResult(Some(lt(intColumn("cdecimal1"), 1000: Integer))) {
          parquetFilters.createFilter(parquetSchema, sources.LessThan("cdecimal1", decimal))
        }

        val decimal1 = new JBigDecimal(10).setScale(scale + 1)
        assert(decimal1.scale() === scale + 1)

        assertResult(None) {
          parquetFilters.createFilter(parquetSchema, sources.LessThan("cdecimal1", decimal1))
        }
    ```
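
    For context, the scale check being suggested can be sketched standalone. This is a minimal, self-contained illustration of the idea, assuming the behavior shown in the test above; the helper name `decimalScaleMatches` and the object wrapper are hypothetical, not Spark's actual code:

    ```scala
    import java.math.{BigDecimal => JBigDecimal}

    object DecimalScaleCheckSketch {
      // Hypothetical helper (not Spark's actual API): a decimal filter value
      // should only be pushed down when its scale matches the column's
      // declared scale, since the pushed-down comparison operates on the
      // unscaled integer representation.
      def decimalScaleMatches(value: Any, columnScale: Int): Boolean = value match {
        case d: JBigDecimal => d.scale() == columnScale
        case _              => false
      }

      def main(args: Array[String]): Unit = {
        val scale = 2
        // Mirrors the test case above: a matching scale passes,
        // scale + 1 does not, so createFilter would return None.
        assert(decimalScaleMatches(new JBigDecimal(10).setScale(scale), scale))
        assert(!decimalScaleMatches(new JBigDecimal(10).setScale(scale + 1), scale))
      }
    }
    ```

    A check like this, wired into `canMakeFilterOn`, is what makes the second `assertResult(None)` in the test pass.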

