nsivabalan commented on code in PR #5427:
URL: https://github.com/apache/hudi/pull/5427#discussion_r858134674


##########
hudi-client/hudi-spark-client/src/main/scala/org/apache/hudi/HoodieSparkUtils.scala:
##########
@@ -324,7 +326,14 @@ object HoodieSparkUtils extends SparkAdapterSupport {
      val name2Fields = tableAvroSchema.getFields.asScala.map(f => f.name() -> f).toMap
       // Here have to create a new Schema.Field object
       // to prevent throwing exceptions like "org.apache.avro.AvroRuntimeException: Field already used".
-      val requiredFields = requiredColumns.map(c => name2Fields(c))
+      val requiredFields = requiredColumns.filter(c => {

Review Comment:
   alexey: I'm not sure how extensive the changes required for your proposal would be, but let's keep the changes minimal here so we can make progress without requiring more testing. In any case, we will revisit the preCombine field setting altogether for 0.12 and put in some fixes.
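
   For illustration, the minimal change being discussed can be sketched as follows. This is a hedged Java sketch, not the actual Scala from `HoodieSparkUtils`: the `Field` class below is a hypothetical stand-in for `org.apache.avro.Schema.Field`, and the point is only the pattern the diff introduces, i.e. filtering `requiredColumns` down to columns that actually exist in `name2Fields` before looking them up, and re-creating each field object rather than reusing it.

   ```java
   import java.util.*;
   import java.util.stream.*;

   public class RequiredFieldsSketch {
     // Hypothetical stand-in for org.apache.avro.Schema.Field; the real Avro
     // constructor takes additional arguments (schema, doc, default value).
     static final class Field {
       final String name;
       Field(String name) { this.name = name; }
     }

     // Keep only the required columns that are actually present in the table
     // schema, so a column missing from the schema (e.g. a preCombine field
     // that was never written) no longer fails the map lookup.
     static List<Field> requiredFields(Map<String, Field> name2Fields,
                                       List<String> requiredColumns) {
       return requiredColumns.stream()
           .filter(name2Fields::containsKey)
           // Create a new Field object per column; reusing the original
           // instance is what triggers the "Field already used" error in Avro.
           .map(c -> new Field(name2Fields.get(c).name))
           .collect(Collectors.toList());
     }

     public static void main(String[] args) {
       Map<String, Field> schema = new LinkedHashMap<>();
       schema.put("id", new Field("id"));
       schema.put("ts", new Field("ts"));
       // "missing_col" is silently dropped instead of throwing.
       List<Field> out = requiredFields(schema, List.of("id", "missing_col", "ts"));
       out.forEach(f -> System.out.println(f.name));
     }
   }
   ```

   Whether silently dropping a missing column is the right behavior (versus failing fast) is exactly the kind of preCombine-handling question deferred to 0.12 above.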



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

Reply via email to