huaxingao opened a new pull request #33733:
URL: https://github.com/apache/spark/pull/33733


   
   
   ### What changes were proposed in this pull request?
   Add a `RowToColumnConverter` for `BinaryType`.
   
   
   ### Why are the changes needed?
   Currently, `RowToColumnConverter` has converters for all supported data types except `BinaryType`:
   ```scala
  private def getConverterForType(dataType: DataType, nullable: Boolean): TypeConverter = {
    val core = dataType match {
      case BooleanType => BooleanConverter
      case ByteType => ByteConverter
      case ShortType => ShortConverter
      case IntegerType | DateType => IntConverter
      case FloatType => FloatConverter
      case LongType | TimestampType => LongConverter
      case DoubleType => DoubleConverter
      case StringType => StringConverter
      case CalendarIntervalType => CalendarConverter
      case at: ArrayType => ArrayConverter(getConverterForType(at.elementType, at.containsNull))
      case st: StructType => new StructConverter(st.fields.map(
        (f) => getConverterForType(f.dataType, f.nullable)))
      case dt: DecimalType => new DecimalConverter(dt)
      case mt: MapType => MapConverter(getConverterForType(mt.keyType, nullable = false),
        getConverterForType(mt.valueType, mt.valueContainsNull))
      case unknown => throw QueryExecutionErrors.unsupportedDataTypeError(unknown.toString)
    }
   ```
   This PR adds a converter for `BinaryType` as well; a sketch of the idea follows.
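
   For illustration, a minimal sketch of what such a converter could look like, mirroring the existing `StringConverter` in `Columnar.scala` (the name `BinaryConverter` and the exact code here are a sketch, not necessarily the final implementation):
   ```scala
// Sketch only: assumes the surrounding TypeConverter hierarchy in
// org.apache.spark.sql.execution.Columnar. getBinary returns the raw bytes,
// which are appended to the vector the same way StringConverter appends the
// bytes of a UTF8String.
private object BinaryConverter extends TypeConverter {
  override def append(row: SpecializedGetters, column: Int, cv: WritableColumnVector): Unit = {
    val bytes = row.getBinary(column)
    cv.appendByteArray(bytes, 0, bytes.length)
  }
}
   ```
   With a converter like that in place, the dispatch above would gain a `case BinaryType => BinaryConverter` branch.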
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   Modified an existing test to cover the new converter.
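
   For reference, the kind of round-trip check such a test can perform looks roughly like this (a sketch assuming the test sits in `org.apache.spark.sql.execution`, where the `private[execution]` `RowToColumnConverter` is visible; names and assertions are illustrative):
   ```scala
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.execution.vectorized.{OnHeapColumnVector, WritableColumnVector}
import org.apache.spark.sql.types.{BinaryType, StructType}

// Convert a single row carrying a binary value into column vectors,
// then read the bytes back out of the vector.
val schema = new StructType().add("b", BinaryType)
val converter = new RowToColumnConverter(schema)
val vectors: Seq[WritableColumnVector] = OnHeapColumnVector.allocateColumns(1, schema)
converter.convert(InternalRow(Array[Byte](1, 2, 3)), vectors.toArray)
assert(vectors.head.getBinary(0).sameElements(Array[Byte](1, 2, 3)))
   ```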
   

