attilapiros commented on a change in pull request #31133:
URL: https://github.com/apache/spark/pull/31133#discussion_r570565602



##########
File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
##########
@@ -248,11 +249,16 @@ class HadoopTableReader(
         // SPARK-13709: For SerDes like AvroSerDe, some essential information (e.g. Avro schema
         // information) may be defined in table properties. Here we should merge table properties
         // and partition properties before initializing the deserializer. Note that partition
-        // properties take a higher priority here. For example, a partition may have a different
-        // SerDe as the one defined in table properties.
+        // properties take a higher priority here except for the Avro table properties
+        // to support schema evolution: in that case the properties given at table level will
+        // be used (for details please check SPARK-26836).
+        // For example, a partition may have a different SerDe as the one defined in table
+        // properties.
         val props = new Properties(tableProperties)
-        partProps.asScala.foreach {
-          case (key, value) => props.setProperty(key, value)
+        partProps.asScala.filterNot { case (k, _) =>
+          k == AvroTableProperties.SCHEMA_LITERAL.getPropName() && tableProperties.containsKey(k)
+        }.foreach { case (key, value) =>
+          props.setProperty(key, value)

Review comment:
       I have chosen consistency over keeping the old style. Or should I modify the `filterNot` body too?
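For illustration, the merge semantics in the diff above can be sketched in plain Java: `new Properties(defaults)` makes lookups fall back to the table-level properties, so skipping the Avro schema literal from the partition properties lets the table-level schema win. This is a minimal sketch, not the PR's code; the key name `avro.schema.literal` is assumed to be what `AvroTableProperties.SCHEMA_LITERAL.getPropName()` resolves to.

```java
import java.util.Properties;

public class PropsMergeDemo {
    // Assumed stand-in for AvroTableProperties.SCHEMA_LITERAL.getPropName().
    static final String SCHEMA_LITERAL = "avro.schema.literal";

    public static void main(String[] args) {
        Properties tableProps = new Properties();
        tableProps.setProperty(SCHEMA_LITERAL, "table-level-schema");
        tableProps.setProperty("serde", "table-serde");

        Properties partProps = new Properties();
        partProps.setProperty(SCHEMA_LITERAL, "stale-partition-schema");
        partProps.setProperty("serde", "partition-serde");

        // Lookups on `props` fall back to `tableProps` for keys
        // not set directly on `props`.
        Properties props = new Properties(tableProps);
        for (String key : partProps.stringPropertyNames()) {
            // Mirror the filterNot in the diff: drop the Avro schema
            // literal when the table also defines it.
            if (key.equals(SCHEMA_LITERAL) && tableProps.containsKey(key)) {
                continue;
            }
            props.setProperty(key, partProps.getProperty(key));
        }

        // Table-level Avro schema survives; other partition props still win.
        System.out.println(props.getProperty(SCHEMA_LITERAL)); // table-level-schema
        System.out.println(props.getProperty("serde"));        // partition-serde
    }
}
```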




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


