danny0405 commented on code in PR #6740:
URL: https://github.com/apache/hudi/pull/6740#discussion_r980626021


##########
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/io/HoodieMergeHandle.java:
##########
@@ -210,18 +203,6 @@ private void init(String fileId, String partitionPath, HoodieBaseFile baseFileTo
       // Create the writer for writing the new version file
      fileWriter = createNewFileWriter(instantTime, newFilePath, hoodieTable, config,
          writeSchemaWithMetaFields, taskContextSupplier);
-
-      // init the cdc logger
-      this.cdcEnabled = config.getBooleanOrDefault(HoodieTableConfig.CDC_ENABLED);

Review Comment:
   Folding everything into one class makes it hard to extend in a project with a huge code base. For the same reason, I don't like your refactoring of SparkRDDRelation and HoodieWriteClient. I have fixed so many bugs/regressions on the Flink side after your refactoring that it makes me feel bad and makes it hard to keep going with this project.
   
   Do you think adding per-record logic switching to the non-CDC write path is reasonable? Sorry, I don't think so.
   
   So ignored.
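   To make the concern concrete, here is a minimal sketch (class and method names are hypothetical, not Hudi's actual API) of the alternative the comment argues for: keeping CDC bookkeeping in a subclass so the base handle's per-record write loop carries no CDC flag check at all.
   
   ```java
   import java.util.ArrayList;
   import java.util.List;
   
   // Hypothetical base handle: the hot write loop has no cdcEnabled branch.
   class MergeHandle {
       final List<String> baseWrites = new ArrayList<>();
   
       void write(List<String> records) {
           for (String r : records) {
               writeRecord(r); // no per-record CDC check here
           }
       }
   
       void writeRecord(String r) {
           baseWrites.add(r); // stands in for writing to the base file
       }
   }
   
   // Hypothetical CDC variant: all CDC logic lives in the subclass,
   // so the non-CDC path above stays untouched.
   class CdcMergeHandle extends MergeHandle {
       final List<String> cdcLog = new ArrayList<>();
   
       @Override
       void writeRecord(String r) {
           super.writeRecord(r);
           cdcLog.add("op=" + r); // stands in for appending to the CDC log
       }
   }
   ```
   
   The write path would then pick the handle class once, at construction time, instead of branching on every record.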



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
