the-other-tim-brown commented on code in PR #17573:
URL: https://github.com/apache/hudi/pull/17573#discussion_r2616542014


##########
hudi-common/src/main/java/org/apache/hudi/common/schema/HoodieSchemaUtils.java:
##########
@@ -573,6 +573,42 @@ public static HoodieSchema getRecordKeySchema() {
     return RECORD_KEY_SCHEMA;
   }
 
+  /**
+   * Converts field values for specific data types with logical type handling.
+   * This is equivalent to HoodieAvroUtils.convertValueForSpecificDataTypes() but operates on HoodieSchema.
+   * <p>
+   * Handles special conversions for Avro logical types:
+   * <ul>
+   *   <li>Date type - converts epoch day integer to LocalDate</li>
+   *   <li>Timestamp types - converts epoch milliseconds/microseconds to Timestamp</li>
+   *   <li>Decimal type - converts bytes/fixed to BigDecimal</li>
+   * </ul>
+   *
+   * @param fieldSchema the field schema
+   * @param fieldValue the field value to convert
+   * @param consistentLogicalTimestampEnabled whether to use consistent logical timestamp handling
+   * @return converted value for logical types, or original value
+   * @throws IllegalStateException if fieldValue is null but schema is not nullable
+   * @since 1.2.0
+   */
+  public static Object convertValueForSpecificDataTypes(HoodieSchema fieldSchema,

Review Comment:
   Make sure there are unit tests for this, covering each of the logical-type conversions listed in the Javadoc (date, timestamp-millis/micros, and decimal).
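   As a starting point, the underlying logical-type conversions the Javadoc describes can be sketched with plain JDK types. Note this sketch deliberately omits the HoodieSchema dispatch and uses no Hudi or Avro classes; the method names here are illustrative, not the actual Hudi helpers:

   ```java
   import java.math.BigDecimal;
   import java.math.BigInteger;
   import java.sql.Timestamp;
   import java.time.LocalDate;

   // Standalone sketch of the conversions convertValueForSpecificDataTypes
   // is documented to perform, expressed with JDK types only.
   public class LogicalTypeConversions {

     // Avro's date logical type stores an int of days since the Unix epoch.
     static LocalDate epochDaysToLocalDate(int epochDays) {
       return LocalDate.ofEpochDay(epochDays);
     }

     // timestamp-millis: a long of milliseconds since the epoch.
     static Timestamp epochMillisToTimestamp(long epochMillis) {
       return new Timestamp(epochMillis);
     }

     // timestamp-micros: a long of microseconds since the epoch; the
     // sub-millisecond remainder is carried into the nanos field.
     static Timestamp epochMicrosToTimestamp(long epochMicros) {
       long millis = Math.floorDiv(epochMicros, 1000L);
       Timestamp ts = new Timestamp(millis);
       int microsRemainder = (int) Math.floorMod(epochMicros, 1000L);
       ts.setNanos(ts.getNanos() + microsRemainder * 1000);
       return ts;
     }

     // decimal: two's-complement unscaled bytes plus a scale from the schema.
     static BigDecimal bytesToDecimal(byte[] unscaledBytes, int scale) {
       return new BigDecimal(new BigInteger(unscaledBytes), scale);
     }

     public static void main(String[] args) {
       System.out.println(epochDaysToLocalDate(0));                        // 1970-01-01
       System.out.println(bytesToDecimal(new byte[]{1, (byte) 0x2C}, 2));  // 3.00
     }
   }
   ```

   Unit tests for the real method would wrap cases like these in HoodieSchema fields with the matching logical types, plus a null-value case for the documented IllegalStateException.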



##########
hudi-common/src/main/java/org/apache/hudi/common/schema/HoodieSchemaUtils.java:
##########
@@ -582,6 +618,7 @@ public static HoodieSchema getRecordKeySchema() {
    * @param fields     list of fields to project
    * @return projected schema containing only specified fields
    */
+

Review Comment:
   Remove this extra newline.



##########
hudi-client/hudi-spark-client/src/main/scala/org/apache/hudi/HoodieSchemaConversionUtils.scala:
##########
@@ -194,4 +198,19 @@ object HoodieSchemaConversionUtils {
       case _ => dataType
     }
   }
+
+  /**
+   * Creates a converter from GenericRecord to InternalRow using HoodieSchema.
+   * This is equivalent to AvroConversionUtils.createAvroToInternalRowConverter() but accepts HoodieSchema.
+   *
+   * @param requiredAvroSchema the HoodieSchema to use for deserialization

Review Comment:
   Update the naming of this? The parameter is a HoodieSchema, so calling it `requiredAvroSchema` is misleading.



##########
hudi-client/hudi-spark-client/src/main/scala/org/apache/hudi/AvroConversionUtils.scala:
##########
@@ -50,7 +51,7 @@ object AvroConversionUtils {
   * @return converter accepting Avro payload and transforming it into a Catalyst one (in the form of [[InternalRow]])
    */
   def createAvroToInternalRowConverter(rootAvroType: Schema, rootCatalystType: StructType): GenericRecord => Option[InternalRow] = {

Review Comment:
   Can we have the methods in this class take in a HoodieSchema?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]