[ https://issues.apache.org/jira/browse/SPARK-37161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Max Gekk resolved SPARK-37161.
------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 34446
[https://github.com/apache/spark/pull/34446]

> RowToColumnConverter support AnsiIntervalType
> ----------------------------------------------
>
>                 Key: SPARK-37161
>                 URL: https://issues.apache.org/jira/browse/SPARK-37161
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: PengLei
>            Assignee: PengLei
>            Priority: Major
>             Fix For: 3.3.0
>
>
> Currently, we have a RowToColumnConverter for every data type except AnsiIntervalType:
> {code:java}
> val core = dataType match {
>   case BinaryType => BinaryConverter
>   case BooleanType => BooleanConverter
>   case ByteType => ByteConverter
>   case ShortType => ShortConverter
>   case IntegerType | DateType => IntConverter
>   case FloatType => FloatConverter
>   case LongType | TimestampType => LongConverter
>   case DoubleType => DoubleConverter
>   case StringType => StringConverter
>   case CalendarIntervalType => CalendarConverter
>   case at: ArrayType =>
>     ArrayConverter(getConverterForType(at.elementType, at.containsNull))
>   case st: StructType => new StructConverter(st.fields.map(
>     (f) => getConverterForType(f.dataType, f.nullable)))
>   case dt: DecimalType => new DecimalConverter(dt)
>   case mt: MapType =>
>     MapConverter(getConverterForType(mt.keyType, nullable = false),
>       getConverterForType(mt.valueType, mt.valueContainsNull))
>   case unknown =>
>     throw QueryExecutionErrors.unsupportedDataTypeError(unknown.toString)
> }
> if (nullable) {
>   dataType match {
>     case CalendarIntervalType => new StructNullableTypeConverter(core)
>     case st: StructType => new StructNullableTypeConverter(core)
>     case _ => new BasicNullableTypeConverter(core)
>   }
> } else {
>   core
> }
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
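A plausible sketch of the fix (the authoritative diff is in PR 34446, linked above): Spark stores ANSI year-month intervals as an Int (months) and day-time intervals as a Long (microseconds), so the existing primitive converters can be reused by extending the relevant match arms. The cases below are an assumption about the shape of the change, not the merged code:

{code:java}
// Hypothetical extension of the match in getConverterForType:
// route the ANSI interval types to the converters for their
// physical representations (Int and Long respectively).
case IntegerType | DateType | _: YearMonthIntervalType => IntConverter
case LongType | TimestampType | _: DayTimeIntervalType => LongConverter
{code}

Because both interval types map onto existing primitive converters, no new converter class should be needed; the nullable wrapping at the end of the method already falls through to BasicNullableTypeConverter for non-struct types.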