wangxianghu commented on a change in pull request #1827:
URL: https://github.com/apache/hudi/pull/1827#discussion_r485026581



##########
File path: hudi-client/hudi-spark-client/src/main/java/org/apache/hudi/io/HoodieSparkMergeHandle.java
##########
@@ -54,9 +60,9 @@
 import java.util.Set;
 
 @SuppressWarnings("Duplicates")
-public class HoodieMergeHandle<T extends HoodieRecordPayload> extends HoodieWriteHandle<T> {
+public class HoodieSparkMergeHandle<T extends HoodieRecordPayload> extends HoodieWriteHandle<T, JavaRDD<HoodieRecord<T>>, JavaRDD<HoodieKey>, JavaRDD<WriteStatus>, JavaPairRDD<HoodieKey, Option<Pair<String, String>>>> {

Review comment:
       > At the MergeHandle level, we need not introduce any notion of RDDs. The `io` package should be free of Spark already. All we need to do is pass in the taskContextSupplier correctly? This is a large outstanding issue we need to resolve.
   
   Actually, not yet. https://github.com/apache/hudi/pull/1756 added support for rollbacks using marker files, and `MarkerFiles` is Spark-related.
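   For illustration, here is a minimal sketch of the abstraction being discussed: parameterizing a write handle over engine-specific collection types so the base class carries no Spark imports. All names here (`EngineAgnosticWriteHandle`, `ListWriteHandle`) are hypothetical and are not Hudi's actual API; a real Spark module would bind the type parameters to `JavaRDD` types in a subclass outside the `io` package.

```java
import java.util.List;
import java.util.function.Supplier;

// Hypothetical sketch, not Hudi's actual API: an engine-agnostic write
// handle parameterized over input (I) and output (O) collection types.
// Engine specifics arrive via a task-context supplier rather than
// hard-coded RDD types, keeping this base class free of Spark.
abstract class EngineAgnosticWriteHandle<T, I, O> {
  // Supplies engine context such as a partition or task id (assumption).
  protected final Supplier<Integer> taskContextSupplier;

  protected EngineAgnosticWriteHandle(Supplier<Integer> taskContextSupplier) {
    this.taskContextSupplier = taskContextSupplier;
  }

  abstract O write(I input);
}

// A concrete handle bound to plain Java lists; a Spark-specific module
// would instead bind I and O to JavaRDD types in its own subclass.
class ListWriteHandle<T> extends EngineAgnosticWriteHandle<T, List<T>, Integer> {
  ListWriteHandle(Supplier<Integer> taskContextSupplier) {
    super(taskContextSupplier);
  }

  @Override
  Integer write(List<T> input) {
    // For this sketch, "writing" just reports how many records were handled.
    return input.size();
  }
}
```

   The design point is that only the subclass names an engine's types, so modules like `hudi-spark-client` can depend on Spark while the shared `io` package does not.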




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

