Github user gengliangwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21667#discussion_r199079744
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/FileFormat.scala ---
    @@ -152,6 +152,16 @@ trait FileFormat {
         }
       }
     
    +  /**
    +   * Returns whether this format supports the given [[DataType]] in the read/write path.
    +   *
    +   * By default all data types are supported except [[CalendarIntervalType]] in the write path.
    +   */
    +  def supportDataType(dataType: DataType, isReadPath: Boolean): Boolean = dataType match {
    --- End diff --
    
    A blacklist is easier.
    With a whitelist, we would have to validate
    ```
    BooleanType | ByteType | ShortType | IntegerType | LongType | FloatType | DoubleType |
              StringType | BinaryType | DateType | TimestampType | DecimalType
    ```
    Of course we can provide a default implementation that handles these. But if we add a new data source that doesn't support all of them, the implementation will be verbose.
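    To illustrate the trade-off, here is a minimal, self-contained sketch using a toy stand-in for Spark's `DataType` hierarchy (the type names mirror Spark's but are redefined here for illustration; this is not the actual `FileFormat` code):

    ```scala
    // Toy stand-in for Spark's DataType hierarchy (hypothetical, for illustration only).
    sealed trait DataType
    case object BooleanType extends DataType
    case object IntegerType extends DataType
    case object StringType extends DataType
    case object CalendarIntervalType extends DataType
    case class ArrayType(elementType: DataType) extends DataType

    object Support {
      // Blacklist style: everything is supported except the known-bad write-path case.
      // New types are supported automatically; only exceptions need listing.
      def byBlacklist(dt: DataType, isReadPath: Boolean): Boolean = dt match {
        case CalendarIntervalType => isReadPath // rejected only on the write path
        case ArrayType(et)        => byBlacklist(et, isReadPath)
        case _                    => true
      }

      // Whitelist style: every supported type must be enumerated explicitly,
      // which gets verbose for sources that support most but not all types.
      def byWhitelist(dt: DataType, isReadPath: Boolean): Boolean = dt match {
        case BooleanType | IntegerType | StringType => true
        case ArrayType(et)                          => byWhitelist(et, isReadPath)
        case _                                      => false
      }
    }
    ```

    With the blacklist, a new data source only overrides the cases it rejects; with the whitelist, it must restate the full list of supported types.
    
    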

