cashmand commented on code in PR #49234:
URL: https://github.com/apache/spark/pull/49234#discussion_r1899610920
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetUtils.scala:
##########
@@ -420,6 +420,22 @@ object ParquetUtils extends Logging {
statistics.getNumNulls;
}
+  // Replaces each VariantType in the schema with the corresponding type in the shredding schema.
+  // Used for testing, where we force a single shredding schema for all Variant fields.
+  // Does not touch Variant fields nested in arrays, maps, or UDTs.
+  private def replaceVariantTypes(schema: StructType, shreddingSchema: StructType): StructType = {
+    val newFields = schema.fields.zip(shreddingSchema.fields).map {
+      case (field, shreddingField) =>
+        field.dataType match {
+          case s: StructType =>
+            field.copy(dataType = replaceVariantTypes(s, shreddingSchema))
Review Comment:
Hi @Zouxxyy, I tend to agree with @cloud-fan's comment. My preference would
be to stick with the current approach for now. Since this logic should be
entirely contained within the Parquet writer code, it shouldn't be hard to
change this part of the code to a different approach if we decide to extend
VariantType later.
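
For anyone skimming this thread, here is a rough sketch of the inputs a
helper like this would see. The metadata/value/typed_value layout follows
the Parquet variant shredding spec; the concrete field names and types below
are illustrative assumptions, not the PR's actual test schemas.

```scala
import org.apache.spark.sql.types._

// Hypothetical data schema with one top-level Variant column.
val dataSchema = StructType(Seq(
  StructField("id", LongType),
  StructField("v", VariantType)))

// Shredded layout per the Parquet variant shredding spec: binary
// metadata and value, plus a typed_value branch (a long here, as an
// illustrative assumption).
val shreddedVariant = StructType(Seq(
  StructField("metadata", BinaryType),
  StructField("value", BinaryType),
  StructField("typed_value", LongType)))

// The shredding schema that gets zipped field-by-field against
// dataSchema: non-Variant fields carry over unchanged, and each
// Variant field maps to its shredded struct.
val shreddingSchema = StructType(Seq(
  StructField("id", LongType),
  StructField("v", shreddedVariant)))

// For this flat example, replaceVariantTypes(dataSchema, shreddingSchema)
// would simply yield shreddingSchema.
```

Since the helper zips `schema.fields` against `shreddingSchema.fields` and
recurses into nested structs, the shredding schema has to mirror the data
schema's shape field-for-field.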
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]