alexeykudinkin commented on code in PR #7915:
URL: https://github.com/apache/hudi/pull/7915#discussion_r1105007223


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/spark/sql/avro/SchemaConverters.scala:
##########
@@ -202,6 +202,13 @@ private[sql] object SchemaConverters {
           st.foreach { f =>
             val fieldAvroType =
               toAvroType(f.dataType, f.nullable, f.name, childNameSpace)
+            val fieldBuilder = fieldsAssembler.name(f.name).`type`(fieldAvroType)

Review Comment:
   I don't think I understand why you believe this is an appropriate fix for the issue you're observing:
   
    - Spark's schemas don't have defaults at all
    - An Avro field being nullable doesn't entail that it should have `null` as its default value
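   For context, per the Avro specification a field's nullability and its default value are independent properties: making the type a union with `null` only makes the field nullable, while a `null` default must be declared explicitly via the `default` attribute (and for unions it must match the first branch of the union). A minimal illustrative schema:
   
   ```json
   {
     "type": "record",
     "name": "Example",
     "fields": [
       {"name": "a", "type": ["null", "string"]},
       {"name": "b", "type": ["null", "string"], "default": null}
     ]
   }
   ```
   
   Here `a` is nullable but has no default, so a reader resolving against a writer schema that lacks `a` will fail; `b` is both nullable and defaulted. Conflating the two is exactly the concern with the proposed change.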



##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestUpdateTable.scala:
##########
@@ -204,4 +204,48 @@ class TestUpdateTable extends HoodieSparkSqlTestBase {
       }
     })
   }
+
+  test("Test Add Column and Update Table") {
+    withTempDir { tmp =>

Review Comment:
   I see you pasted the stacktrace from the failure you hit when querying your data via the server. Can you please paste the stacktrace of this particular test failing?
   
   I want to better understand which operation is failing in this test.


