xiarixiaoyao commented on code in PR #8026:
URL: https://github.com/apache/hudi/pull/8026#discussion_r1115539986


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieBaseRelation.scala:
##########
@@ -613,8 +617,11 @@ object HoodieBaseRelation extends SparkAdapterSupport {
     def apply(file: PartitionedFile): Iterator[InternalRow] = read.apply(file)
   }
 
-  def convertToAvroSchema(structSchema: StructType): Schema =
-    sparkAdapter.getAvroSchemaConverters.toAvroType(structSchema, nullable = false, "Record")
+  def convertToAvroSchema(structSchema: StructType, tableName: String): Schema = {
+    val (recordName, namespace) = AvroConversionUtils.getAvroRecordNameAndNamespace(tableName)
+    val avroSchema = sparkAdapter.getAvroSchemaConverters.toAvroType(structSchema, nullable = false, recordName, namespace)

Review Comment:
   The Avro schema is sensitive to the namespace, so we must keep the record name and namespace when converting the StructType to an Avro type.
   E.g. if we now create a table with col1: String, col2: String, ff decimal(38, 10), the ff field's type will be:
   {"name":"ff","type":[{"type":"fixed","name":"fixed","namespace":"hoodie.h0.h0_record.ff","size":16,"logicalType":"decimal","precision":38,"scale":10}
   
   With the original conversion logic, the ff field's type is:
   "name":"ff","type":[{"type":"fixed","name":"fixed","namespace":"Record.ff","size":16,"logicalType":"decimal","precision":38,"scale":10},"null"]}
   
   These two schemas are not compatible, which causes Spark to be unable to read the log files.
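   To make the incompatibility concrete, here is a minimal sketch using Spark's `org.apache.spark.sql.avro.SchemaConverters` directly (the converter that `sparkAdapter.getAvroSchemaConverters` wraps; the object name `NamespaceDemo` and the table name `h0` are illustrative assumptions, not Hudi code):
   
   ```scala
   // Sketch: the record name/namespace passed to toAvroType leaks into the
   // full name of the nested "fixed" type backing a decimal column, so two
   // conversions with different names produce incompatible named types.
   import org.apache.spark.sql.avro.SchemaConverters
   import org.apache.spark.sql.types._
   
   object NamespaceDemo {
     private val struct = StructType(Seq(
       StructField("col1", StringType),
       StructField("col2", StringType),
       StructField("ff", DecimalType(38, 10), nullable = false)))
   
     // decimal(38, 10) maps to an Avro "fixed" schema whose full name embeds
     // the enclosing record's name and namespace.
     def fixedFullName(recordName: String, namespace: String): String =
       SchemaConverters.toAvroType(struct, nullable = false, recordName, namespace)
         .getField("ff").schema().getFullName
   
     def main(args: Array[String]): Unit = {
       println(fixedFullName("Record", ""))             // Record.ff.fixed
       println(fixedFullName("h0_record", "hoodie.h0")) // hoodie.h0.h0_record.ff.fixed
     }
   }
   ```
   
   Since Avro treats named types with different full names as distinct, a reader schema built with one name cannot resolve log blocks written with the other.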
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]