usberkeley commented on code in PR #11924:
URL: https://github.com/apache/hudi/pull/11924#discussion_r1882202947


##########
hudi-common/src/main/java/org/apache/hudi/common/table/log/block/HoodieAvroDataBlock.java:
##########
@@ -149,6 +151,29 @@ protected <T> ClosableIterator<HoodieRecord<T>> deserializeRecords(byte[] conten
     return new CloseableMappingIterator<>(iterator, data -> (HoodieRecord<T>) new HoodieAvroIndexedRecord(data));
   }
 
+  /**
+   * Streaming deserialization of records.
+   *
+   * @param inputStream The input stream from which to read the records.
+   * @param contentLocation The location within the input stream where the content starts.
+   * @param bufferSize The size of the buffer to use for reading the records.
+   * @return A ClosableIterator over HoodieRecord<T>.
+   * @throws IOException If there is an error reading or deserializing the records.
+   */
+  @Override
+  protected <T> ClosableIterator<HoodieRecord<T>> deserializeRecords(
+          SeekableDataInputStream inputStream,
+          HoodieLogBlockContentLocation contentLocation,
+          HoodieRecordType type,
+          int bufferSize
+  ) throws IOException {
+    checkState(this.readerSchema != null, "Reader's schema has to be non-null");
+    checkArgument(type != HoodieRecordType.SPARK, "Not support read avro to spark record");
+    // TODO AvroSparkReader need

Review Comment:
   > Remove these checks and clean the TODOs.
   
   Got it, Thanks~
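
   For context, the streaming pattern this method follows (seek to the block's content location, wrap a buffered stream, and hand back a lazy iterator) can be sketched with plain JDK streams. This is a minimal, hypothetical sketch: the length-prefixed record layout, the `deserializeRecords` helper below, and the simplified `ClosableIterator` stand-in are illustrative assumptions, not Hudi's actual log block format or interfaces.
   
   ```java
   import java.io.*;
   import java.nio.charset.StandardCharsets;
   import java.util.*;
   
   public class StreamingDeserializeSketch {
   
     // Simplified stand-in for Hudi's ClosableIterator (hypothetical).
     interface ClosableIterator<T> extends Iterator<T>, AutoCloseable {
       @Override void close();
     }
   
     // Lazily reads UTF-8 records (each prefixed by a 4-byte length) starting
     // at contentOffset, reading through a buffer of the given size. Records
     // are deserialized one at a time as the iterator advances, so the whole
     // block content is never materialized in memory at once.
     static ClosableIterator<String> deserializeRecords(
         byte[] backing, long contentOffset, int bufferSize) throws IOException {
       InputStream raw = new ByteArrayInputStream(backing);
       long skipped = raw.skip(contentOffset);            // seek to content start
       if (skipped != contentOffset) {
         throw new EOFException("seek past end of stream");
       }
       DataInputStream in =
           new DataInputStream(new BufferedInputStream(raw, bufferSize));
       return new ClosableIterator<String>() {
         String lookahead;                                 // next record, if read
         @Override public boolean hasNext() {
           if (lookahead != null) return true;
           try {
             int len = in.readInt();                       // length prefix
             byte[] buf = new byte[len];
             in.readFully(buf);                            // one record's bytes
             lookahead = new String(buf, StandardCharsets.UTF_8);
             return true;
           } catch (EOFException eof) {
             return false;                                 // end of content
           } catch (IOException e) {
             throw new UncheckedIOException(e);
           }
         }
         @Override public String next() {
           if (!hasNext()) throw new NoSuchElementException();
           String r = lookahead;
           lookahead = null;
           return r;
         }
         @Override public void close() {
           try { in.close(); } catch (IOException ignored) { }
         }
       };
     }
   
     public static void main(String[] args) throws IOException {
       // Build a byte[] with a 3-byte fake header followed by two records.
       ByteArrayOutputStream bos = new ByteArrayOutputStream();
       bos.write(new byte[]{0, 0, 0}, 0, 3);              // pretend block header
       DataOutputStream dos = new DataOutputStream(bos);
       for (String s : new String[]{"r1", "record2"}) {
         byte[] b = s.getBytes(StandardCharsets.UTF_8);
         dos.writeInt(b.length);
         dos.write(b);
       }
       try (ClosableIterator<String> it =
                deserializeRecords(bos.toByteArray(), 3, 1024)) {
         while (it.hasNext()) System.out.println(it.next());
       }
     }
   }
   ```
   
   The look-ahead field keeps `hasNext()` idempotent, and `close()` releases the underlying stream, which is the main reason the block-reading APIs hand back a closable iterator rather than a plain one.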


