Copilot commented on code in PR #5724:
URL: https://github.com/apache/hive/pull/5724#discussion_r2272599223


##########
ql/src/java/org/apache/hadoop/hive/ql/metadata/HiveStorageHandler.java:
##########
@@ -182,16 +182,14 @@ public interface HiveStorageHandler extends Configurable {
   void configureOutputJobProperties(TableDesc tableDesc, Map<String, String> jobProperties);
 
   /**
-   * Deprecated use configureInputJobProperties/configureOutputJobProperties
-   * methods instead.
-   *
    * Configures properties for a job based on the definition of the
    * source or target table it accesses.
    *
    * @param tableDesc descriptor for the table being accessed
+   * @param jobProperties receives properties copied or transformed from the table properties
    *
-   * @param jobProperties receives properties copied or transformed
-   * from the table properties
+   * @deprecated since 4.0.1, will be removed in 5.0.0,
+   * use {@link #configureInputJobProperties,#configureOutputJobProperties} methods instead.

Review Comment:
   Missing space after comma in @link tag. Should be '{@link #configureInputJobProperties, #configureOutputJobProperties}'.
   ```suggestion
   * use {@link #configureInputJobProperties, #configureOutputJobProperties} methods instead.
   ```
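For context, a minimal sketch of how the full deprecation Javadoc could read with one `{@link}` tag per referenced method, since each `{@link}` expects a single program element reference. This assumes the deprecated method is `configureTableJobProperties(TableDesc, Map)`, which is not visible in the hunk above:

```java
  /**
   * Configures properties for a job based on the definition of the
   * source or target table it accesses.
   *
   * @param tableDesc descriptor for the table being accessed
   * @param jobProperties receives properties copied or transformed from the table properties
   *
   * @deprecated since 4.0.1, will be removed in 5.0.0,
   * use {@link #configureInputJobProperties(TableDesc, Map)} and
   * {@link #configureOutputJobProperties(TableDesc, Map)} instead.
   */
  @Deprecated
  void configureTableJobProperties(TableDesc tableDesc, Map<String, String> jobProperties);
```

Splitting the references into separate tags also sidesteps the spacing issue entirely, and lets the javadoc tool resolve each method link on its own.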



##########
iceberg/iceberg-handler/src/main/java/org/apache/iceberg/mr/hive/IcebergTableUtil.java:
##########
@@ -590,6 +600,33 @@ public static TransformSpec getTransformSpec(Table table, String transformName,
     return spec;
   }
 
+  public static <T> List<T> readColStats(Table table, Long snapshotId, Predicate<BlobMetadata> filter) {
+    List<T> colStats = Lists.newArrayList();
+
+    Path statsPath  = IcebergTableUtil.getColStatsPath(table, snapshotId);
+    if (statsPath == null) {
+      return colStats;
+    }
+    try (PuffinReader reader = Puffin.read(table.io().newInputFile(statsPath.toString())).build()) {
+      List<BlobMetadata> blobMetadata = reader.fileMetadata().blobs();
+
+      if (filter != null) {
+        blobMetadata = blobMetadata.stream().filter(filter)
+          .toList();

Review Comment:
   The call to '.toList()' on line 616 requires Java 16+. Consider using '.collect(Collectors.toList())' for broader compatibility.
   ```suggestion
             .collect(Collectors.toList());
   ```
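As a side note, the `Collectors.toList()` form needs `java.util.stream.Collectors` imported in `IcebergTableUtil.java`, and the two forms differ slightly in the list they produce. A small self-contained sketch (standalone example, not the PR code):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class ToListCompat {
  public static void main(String[] args) {
    List<String> input = Arrays.asList("a", "bb", "ccc");

    // Java 8+ form suggested above: the spec makes no mutability guarantee,
    // but in practice this returns a plain ArrayList.
    List<String> collected = input.stream()
        .filter(s -> s.length() > 1)
        .collect(Collectors.toList());

    // Java 16+ form used in the PR; documented to return an unmodifiable list:
    // List<String> listed = input.stream().filter(s -> s.length() > 1).toList();

    System.out.println(collected); // prints [bb, ccc]
  }
}
```

Whether the pre-16 form is actually required depends on the JDK level the iceberg-handler module is built against.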



##########
iceberg/iceberg-handler/src/main/java/org/apache/iceberg/mr/hive/IcebergTableUtil.java:
##########
@@ -608,7 +645,7 @@ public static boolean hasUndergonePartitionEvolution(Table table) {
             .anyMatch(id -> id != table.spec().specId());
   }
 
-  public static <T extends ContentFile> Set<String> getPartitionNames(Table icebergTable, Iterable<T> files,
+  public static <T extends ContentFile<?>> Set<String> getPartitionNames(Table icebergTable, Iterable<T> files,

Review Comment:
   The generic type bound has been changed from 'ContentFile' to 'ContentFile<?>', which may be a breaking change for consumers who rely on the raw type.
   ```suggestion
    public static <T extends ContentFile> Set<String> getPartitionNames(Table icebergTable, Iterable<T> files,
   ```
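For illustration, a small self-contained sketch of the difference between a raw-type bound and a wildcard bound. `Item<T>` is a hypothetical stand-in for `ContentFile<F>` (the full `getPartitionNames` signature is not shown in the hunk, so this is not Iceberg code):

```java
import java.util.HashSet;
import java.util.Set;

// Stand-in for ContentFile<F>; hypothetical, for illustration only.
interface Item<T> {
  String name();
}

class BoundsSketch {
  // Raw-type bound, as in the pre-change signature: compiles,
  // but typically triggers "rawtypes" lint warnings.
  @SuppressWarnings("rawtypes")
  static <T extends Item> Set<String> namesRawBound(Iterable<T> items) {
    Set<String> out = new HashSet<>();
    for (T item : items) {
      out.add(item.name());
    }
    return out;
  }

  // Wildcard bound, as in the changed signature: warning-free and still
  // accepts any Iterable of a concrete Item implementation.
  static <T extends Item<?>> Set<String> namesWildcardBound(Iterable<T> items) {
    Set<String> out = new HashSet<>();
    for (T item : items) {
      out.add(item.name());
    }
    return out;
  }
}
```

Typed callers (for example an `Iterable<DataFile>`, where `DataFile` implements `ContentFile<DataFile>`) satisfy both bounds, so the impact, if any, falls on callers that still pass raw-typed collections, which is the concern the comment raises.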



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
