cashmand commented on code in PR #45703:
URL: https://github.com/apache/spark/pull/45703#discussion_r1537802607
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala:
##########
@@ -179,9 +179,21 @@ class ParquetToSparkSchemaConverter(
field match {
case primitiveColumn: PrimitiveColumnIO =>
convertPrimitiveField(primitiveColumn, targetType)
case groupColumn: GroupColumnIO if targetType.contains(VariantType) =>
+ if (groupColumn.getChildrenCount != 2) {
+   // We may allow more than two children in the future, so consider
+   // this unsupported.
+ throw QueryCompilationErrors.
+ parquetTypeUnsupportedYetError("variant with more than two fields")
+ }
+ val valueIdx = (0 until groupColumn.getChildrenCount)
+   .find(i => groupColumn.getChild(i).getName == "value")
+ val metadataIdx = (0 until groupColumn.getChildrenCount)
+   .find(i => groupColumn.getChild(i).getName == "metadata")
Review Comment:
I don't have a strong opinion. In the tests, I specifically added a case that
accepts either order. I can't think of a strong reason to either allow or
forbid arbitrary ordering.
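
For context on why either order works: the lookup above resolves the children
by name rather than by position. A minimal, self-contained sketch of that idea
(the `resolveVariantChildren` helper and the plain `Seq[String]` stand-in for
the group's children are illustrative only, not the PR's code):

```scala
// Sketch: name-based resolution of a variant group's "value" and "metadata"
// children. Looking the names up with indexWhere gives the same result
// whichever child the writer emitted first.
def resolveVariantChildren(childNames: Seq[String]): Option[(Int, Int)] = {
  val valueIdx = childNames.indexWhere(_ == "value")
  val metadataIdx = childNames.indexWhere(_ == "metadata")
  // Require exactly two children, and both must be found by name.
  if (childNames.length == 2 && valueIdx >= 0 && metadataIdx >= 0) {
    Some((valueIdx, metadataIdx))
  } else {
    None
  }
}

// Both orderings resolve to the same fields, just at swapped indices:
assert(resolveVariantChildren(Seq("value", "metadata")) == Some((0, 1)))
assert(resolveVariantChildren(Seq("metadata", "value")) == Some((1, 0)))
```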