Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19702#discussion_r150381847
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala ---
    @@ -281,4 +281,32 @@ class SQLConfSuite extends QueryTest with SharedSQLContext {
         assert(null == spark.conf.get("spark.sql.nonexistent", null))
         assert("<undefined>" == spark.conf.get("spark.sql.nonexistent", "<undefined>"))
       }
    +
    +  test("SPARK-10365: PARQUET_OUTPUT_TIMESTAMP_TYPE") {
    +    spark.sessionState.conf.clear()
    +
    +    // check default value
    +    assert(spark.sessionState.conf.parquetOutputTimestampType ==
    +      SQLConf.ParquetOutputTimestampType.INT96)
    +
    +    // PARQUET_INT64_AS_TIMESTAMP_MILLIS should be respected.
    +    spark.sessionState.conf.setConf(SQLConf.PARQUET_INT64_AS_TIMESTAMP_MILLIS, true)
    +    assert(spark.sessionState.conf.parquetOutputTimestampType ==
    +      SQLConf.ParquetOutputTimestampType.TIMESTAMP_MILLIS)
    +
    +    // PARQUET_OUTPUT_TIMESTAMP_TYPE has higher priority than PARQUET_INT64_AS_TIMESTAMP_MILLIS
    +    spark.sessionState.conf.setConf(SQLConf.PARQUET_OUTPUT_TIMESTAMP_TYPE, "timestamp_micros")
    +    assert(spark.sessionState.conf.parquetOutputTimestampType ==
    +      SQLConf.ParquetOutputTimestampType.TIMESTAMP_MICROS)
    +    spark.sessionState.conf.setConf(SQLConf.PARQUET_OUTPUT_TIMESTAMP_TYPE, "int96")
    +    assert(spark.sessionState.conf.parquetOutputTimestampType ==
    +      SQLConf.ParquetOutputTimestampType.INT96)
    +
    +    // test invalid conf value
    +    intercept[IllegalArgumentException] {
    +      spark.conf.set(SQLConf.PARQUET_OUTPUT_TIMESTAMP_TYPE.key, "invalid")
    +    }
    +
    +    spark.sessionState.conf.clear()
    --- End diff ---
    
    This is the existing code style in this file. I think we can move the `clear()` calls into `beforeEach` and `afterEach`; I'll investigate in a follow-up.
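    For illustration only, the suggested refactor could look roughly like the sketch below. `MiniConf` and `SQLConfSuiteSketch` are hypothetical stand-ins (the real suite would mix in ScalaTest's `BeforeAndAfterEach` and call `spark.sessionState.conf.clear()`); this only demonstrates the fixture pattern, not Spark's actual API.

    ```scala
    import scala.collection.mutable

    // Hypothetical stand-in for SQLConf, just to illustrate the fixture pattern.
    class MiniConf {
      private val settings = mutable.Map.empty[String, String]
      def set(key: String, value: String): Unit = settings(key) = value
      def get(key: String, default: String): String = settings.getOrElse(key, default)
      def clear(): Unit = settings.clear()
    }

    // Sketch of the beforeEach/afterEach idea: reset shared conf state around
    // every test instead of repeating clear() inside each test body.
    trait ConfFixture {
      val conf = new MiniConf
      protected def beforeEach(): Unit = conf.clear()
      protected def afterEach(): Unit = conf.clear()
      def test(name: String)(body: => Unit): Unit = {
        beforeEach()
        try body finally afterEach()
      }
    }

    object SQLConfSuiteSketch extends ConfFixture {
      def main(args: Array[String]): Unit = {
        test("output timestamp type") {
          conf.set("spark.sql.parquet.outputTimestampType", "TIMESTAMP_MICROS")
          assert(conf.get("spark.sql.parquet.outputTimestampType", "INT96") == "TIMESTAMP_MICROS")
        }
        // afterEach has already reset the conf, so the default is visible again.
        assert(conf.get("spark.sql.parquet.outputTimestampType", "INT96") == "INT96")
      }
    }
    ```

    The win is that a test which throws midway (e.g. the `intercept[IllegalArgumentException]` case above) still leaves a clean conf for the next test, whereas a trailing `clear()` inside the test body can be skipped.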

