yihua commented on a change in pull request #5004:
URL: https://github.com/apache/hudi/pull/5004#discussion_r832514445



##########
File path: hudi-client/hudi-spark-client/src/test/java/org/apache/hudi/testutils/HoodieClientTestUtils.java
##########
@@ -241,9 +243,9 @@ public static long countRecordsOptionallySince(JavaSparkContext jsc, String base
     Schema schema = null;
     for (String path : paths) {
       try {
-        HFile.Reader reader = HFile.createReader(fs, new Path(path), cacheConfig, fs.getConf());
+        HFile.Reader reader = HFile.createReader(fs, new Path(path), cacheConfig, true, fs.getConf());
         if (schema == null) {
-          schema = new Schema.Parser().parse(new String(reader.loadFileInfo().get("schema".getBytes())));
+          schema = new Schema.Parser().parse(new String(reader.getHFileInfo().get(KEY_SCHEMA.getBytes())));
         }
         HFileScanner scanner = reader.getScanner(false, false);
Review comment:
       This class is for tests only. As for the production code, HoodieHFileReader properly closes the HFile reader:
   ```java
   @Override
   public synchronized void close() {
     try {
       reader.close();
       reader = null;
       if (fsDataInputStream != null) {
         fsDataInputStream.close();
       }
       keyScanner = null;
     } catch (IOException e) {
       throw new HoodieIOException("Error closing the hfile reader", e);
     }
   }
   ```
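   The test utility could get the same guarantee by wrapping the reader in try-with-resources. A minimal generic sketch (using a hypothetical `SketchReader` stand-in, not Hudi's `HoodieHFileReader`, so it stays self-contained) showing that `close()` runs even if work inside the block throws:
   ```java
   // Hypothetical stand-in for an HFile-style reader; not part of Hudi.
   class SketchReader implements AutoCloseable {
       boolean closed = false;

       String readSchema() {
           return "{\"type\":\"record\"}"; // placeholder payload
       }

       @Override
       public synchronized void close() {
           // In real code this is where the underlying reader and
           // input stream would be closed, as in close() above.
           closed = true;
       }
   }

   public class CloseDemo {
       public static void main(String[] args) {
           SketchReader reader = new SketchReader();
           // try-with-resources invokes close() on exit, normal or exceptional,
           // so the test code would not leak the reader.
           try (SketchReader res = reader) {
               System.out.println(res.readSchema());
           }
           System.out.println("closed=" + reader.closed); // prints "closed=true"
       }
   }
   ```
   The same pattern applied to the test snippet above would scope the `HFile.Reader` to a try-with-resources block instead of leaving it unclosed.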



