yihua commented on code in PR #13726:
URL: https://github.com/apache/hudi/pull/13726#discussion_r2294927502


##########
hudi-spark-datasource/hudi-spark/src/test/java/org/apache/hudi/client/functional/TestDataValidationCheckForLogCompactionActions.java:
##########
@@ -321,16 +321,16 @@ public TestTableContents(String basePath, String tableName, HoodieTableMetaClien
     }
 
     private void updatePreviousGeneration(List<HoodieRecord> generatedRecords, String commitTimeOnMainTable, int previousActionType) {
-      Schema schema = new Schema.Parser().parse(this.config.getSchema());
-      this.generatedRecords = generatedRecords.stream().map(rec -> deepCopyAndModifyRecordKey(rec)).collect(Collectors.toList());
+      this.generatedRecords = generatedRecords.stream().map(this::deepCopyAndModifyRecordKey).collect(Collectors.toList());
       this.commitTimeOnMainTable = commitTimeOnMainTable;
       this.previousActionType = previousActionType;
     }
 
     private HoodieRecord deepCopyAndModifyRecordKey(HoodieRecord record) {
       HoodieKey key = deepCopyAndModifyRecordKey(record.getKey());
-      RawTripTestPayload payload = ((RawTripTestPayload)record.getData()).clone();
-      return new HoodieAvroRecord(key, payload);
+      GenericRecord data = (GenericRecord) record.getData();
+      GenericRecord copiedData = GenericData.get().deepCopy(data.getSchema(), data);
+      return new HoodieAvroIndexedRecord(key, copiedData);

Review Comment:
   Now that we have migrated these tests from `HoodieAvroRecord` (payload class-based) to `HoodieAvroIndexedRecord`, I assume there are still functional tests on the Spark Datasource and Hudi Streamer with custom payload and merger implementations, so that test coverage is maintained?



##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/hudi/common/model/TestHoodieRecordSerialization.scala:
##########
@@ -81,8 +81,8 @@ class TestHoodieRecordSerialization extends SparkClientFunctionalTestHarness {
     val hoodieInternalRow = new HoodieInternalRow(new Array[UTF8String](5), unsafeRow, false)
 
     Seq(
-      (unsafeRow, rowSchema, 91),
-      (hoodieInternalRow, addMetaFields(rowSchema), 131)
+      (unsafeRow, rowSchema, 92),

Review Comment:
   Is the size increase (91 -> 92) because of the `metaData` field during serde?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
