Peng-Lei opened a new pull request #34446:
URL: https://github.com/apache/spark/pull/34446


   ### What changes were proposed in this pull request?
   Add a `RowToColumnConverter` for `AnsiIntervalType`.
   
   ### Why are the changes needed?
   Currently, we have a `RowToColumnConverter` for all data types except the ANSI interval types (`YearMonthIntervalType` and `DayTimeIntervalType`):
   ```scala
     private def getConverterForType(dataType: DataType, nullable: Boolean): TypeConverter = {
       val core = dataType match {
         case BinaryType => BinaryConverter
         case BooleanType => BooleanConverter
         case ByteType => ByteConverter
         case ShortType => ShortConverter
         case IntegerType | DateType => IntConverter
         case FloatType => FloatConverter
         case LongType | TimestampType => LongConverter
         case DoubleType => DoubleConverter
         case StringType => StringConverter
         case CalendarIntervalType => CalendarConverter
         case at: ArrayType => ArrayConverter(getConverterForType(at.elementType, at.containsNull))
         case st: StructType => new StructConverter(st.fields.map(
           (f) => getConverterForType(f.dataType, f.nullable)))
         case dt: DecimalType => new DecimalConverter(dt)
         case mt: MapType => MapConverter(getConverterForType(mt.keyType, nullable = false),
           getConverterForType(mt.valueType, mt.valueContainsNull))
         case unknown => throw QueryExecutionErrors.unsupportedDataTypeError(unknown.toString)
       }
       // ...
   ```
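   
   A minimal sketch of the change (my reading, not necessarily the exact patch): since a year-month interval is physically stored as an `Int` number of months and a day-time interval as a `Long` number of microseconds, the ANSI interval types can reuse the existing primitive converters in the match above:
   ```scala
     // Sketch: route the ANSI interval types to the existing primitive
     // converters, matching their physical representations
     // (Int months, Long microseconds).
     case IntegerType | DateType | _: YearMonthIntervalType => IntConverter
     case LongType | TimestampType | _: DayTimeIntervalType => LongConverter
   ```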
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   ### How was this patch tested?
   Added a unit test.
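   
   A round-trip check along these lines (a sketch only, assuming the shape of `RowToColumnConverter` and `OnHeapColumnVector`, not the actual test in this PR) would convert an `InternalRow` holding interval values into column vectors and read the physical values back:
   ```scala
     import org.apache.spark.sql.catalyst.InternalRow
     import org.apache.spark.sql.execution.RowToColumnConverter
     import org.apache.spark.sql.execution.vectorized.{OnHeapColumnVector, WritableColumnVector}
     import org.apache.spark.sql.types._

     // Sketch of a unit test: both fields should survive a row-to-column
     // conversion once the converter supports the ANSI interval types.
     val schema = StructType(Seq(
       StructField("ym", YearMonthIntervalType(), nullable = false),
       StructField("dt", DayTimeIntervalType(), nullable = false)))
     val converter = new RowToColumnConverter(schema)
     val vectors: Array[WritableColumnVector] =
       schema.fields.map(f => new OnHeapColumnVector(1, f.dataType): WritableColumnVector)
     // 13 months and 123456789 microseconds: the physical encodings of
     // the two interval types.
     converter.convert(InternalRow(13, 123456789L), vectors)
     assert(vectors(0).getInt(0) == 13)
     assert(vectors(1).getLong(0) == 123456789L)
   ```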

