cshuo commented on code in PR #13213:
URL: https://github.com/apache/hudi/pull/13213#discussion_r2057919083


##########
hudi-common/src/main/java/org/apache/hudi/common/engine/HoodieReaderContext.java:
##########
@@ -285,6 +277,14 @@ public abstract HoodieRecord<T> constructHoodieRecord(Option<T> recordOption,
    */
   public abstract T seal(T record);
 
+  /**
+   * Convert an engine-specific row into binary format.
+   *
+   * @param record The engine row
+   * @return the row in binary format
+   */
+  public abstract T toBinaryRow(T record);

Review Comment:
   Besides the Spark unsafe row conversion, Flink also needs to convert `GenericRowData` into `BinaryRowData`. And the binary conversion is needed not only when reading logs: it is also necessary after merging in the file group reader, since the merged row may not be in binary format either. So it currently seems necessary to make `HoodieReaderContext` capable of performing the binary conversion.
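   To illustrate the pattern this method is meant to capture, here is a self-contained, hypothetical sketch (the `Row`, `GenericRow`, `BinaryRow`, and `ToyReaderContext` names are illustrative stand-ins, not Hudi's, Spark's, or Flink's actual classes): the context re-encodes a row into a binary-backed representation only when it is not already binary, which is the situation after merging in the file group reader.

   ```java
   import java.nio.ByteBuffer;
   import java.nio.charset.StandardCharsets;

   // Minimal row abstraction; field access by position.
   interface Row {
       Object getField(int i);
   }

   // Analogous to Flink's GenericRowData: fields held as plain Objects.
   final class GenericRow implements Row {
       private final Object[] fields;
       GenericRow(Object... fields) { this.fields = fields; }
       public Object getField(int i) { return fields[i]; }
   }

   // Analogous to BinaryRowData or Spark's UnsafeRow: fields encoded in a
   // byte buffer. For brevity this toy version assumes a fixed (int, String)
   // schema rather than a real binary row layout.
   final class BinaryRow implements Row {
       private final byte[] bytes;
       BinaryRow(byte[] bytes) { this.bytes = bytes; }
       public Object getField(int i) {
           ByteBuffer buf = ByteBuffer.wrap(bytes);
           int intField = buf.getInt();
           if (i == 0) return intField;
           byte[] str = new byte[buf.remaining()];
           buf.get(str);
           return new String(str, StandardCharsets.UTF_8);
       }
   }

   final class ToyReaderContext {
       // Mirrors the proposed HoodieReaderContext#toBinaryRow contract:
       // a no-op when the row is already binary, otherwise re-encode it.
       Row toBinaryRow(Row row) {
           if (row instanceof BinaryRow) {
               return row; // already binary, avoid an unnecessary copy
           }
           GenericRow generic = (GenericRow) row;
           byte[] str = ((String) generic.getField(1)).getBytes(StandardCharsets.UTF_8);
           ByteBuffer buf = ByteBuffer.allocate(4 + str.length);
           buf.putInt((Integer) generic.getField(0));
           buf.put(str);
           return new BinaryRow(buf.array());
       }
   }
   ```

   The instanceof short-circuit is the relevant design point: log reading and post-merge paths can both call the same method, and rows that are already binary pass through untouched.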
   
   
   


