xushiyan commented on code in PR #6279:
URL: https://github.com/apache/hudi/pull/6279#discussion_r937001645
##########
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/io/HoodieAppendHandle.java:
##########
@@ -471,10 +471,12 @@ private Writer createLogWriter(Option<FileSlice> fileSlice, String baseCommitTim
return HoodieLogFormat.newWriterBuilder()
        .onParentPath(FSUtils.getPartitionPath(hoodieTable.getMetaClient().getBasePath(), partitionPath))
- .withFileId(fileId).overBaseCommit(baseCommitTime)
+ .withFileId(fileId)
+ .overBaseCommit(baseCommitTime)
.withLogVersion(latestLogFile.map(HoodieLogFile::getLogVersion).orElse(HoodieLogFile.LOGFILE_BASE_VERSION))
.withFileSize(latestLogFile.map(HoodieLogFile::getFileSize).orElse(0L))
- .withSizeThreshold(config.getLogFileMaxSize()).withFs(fs)
+ .withSizeThreshold(config.getLogFileMaxSize())
+ .withFs(fs)
Review Comment:
should not include unrelated style changes
##########
hudi-spark-datasource/hudi-spark3.3.x/src/main/scala/org/apache/spark/sql/avro/AvroSerializer.scala:
##########
@@ -292,6 +309,60 @@ private[sql] class AvroSerializer(
result
}
+
+  ////////////////////////////////////////////////////////////////////////////////////////////
+  // Following section is amended to the original (Spark's) implementation
+  // >>> BEGINS
+  ////////////////////////////////////////////////////////////////////////////////////////////
+
+  private def newUnionConverter(catalystStruct: StructType,
Review Comment:
Are you referring to `AvroSerializer` being exercised throughout the Spark tests? I don't see a specific UT for this union converter logic.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]