gurmeetsaran-ant commented on PR #15345:
URL: https://github.com/apache/iceberg/pull/15345#issuecomment-3918886888

   > @gurmeetsaran-ant If the logical type in avro is timestamp-milliseconds 
then it maps to iceberg TimestampType and in that case convertLong will not be 
called rather
   > 
   > 
https://github.com/apache/iceberg/blob/68b7a2a71056aafd982b16e45e7f7ec254802e4a/kafka-connect/kafka-connect/src/main/java/org/apache/iceberg/connect/data/RecordConverter.java#L460
   > 
   > will be called which handles the java.util.Date.
   
   Thanks for the review, @kumarpritam863. You're correct that when the Kafka 
Connect schema has the Timestamp logical type, toIcebergType() maps it to 
TimestampType and convertTimestampValue() is called, which already handles 
java.util.Date.
   
     However, in my setup the sink uses value.converter.schemas.enable=false 
with the Confluent Avro converter. This means record.valueSchema() is null on 
the sink side:
   
https://github.com/apache/iceberg/blob/d06e6c22fecf00be81f2eb399f090fae320560a4/kafka-connect/kafka-connect/src/main/java/org/apache/iceberg/connect/data/IcebergWriterFactory.java#L86-L88
   
     With valueSchema() == null, the Iceberg sink takes the schemaless code 
path and autoCreateTable() uses inferIcebergType(). The toIcebergType() -> 
TimestampType -> convertTimestampValue() path is never reached in this 
configuration. Instead, type inference and value conversion depend on the 
Java runtime types in the Map, and convertLong() ends up receiving a 
java.util.Date, which it rejects.
   
     **Relevant sink config:**
   ```
     value.converter=io.confluent.connect.avro.AvroConverter
     value.converter.schemas.enable=false
     iceberg.tables.auto-create-enabled=true
     iceberg.tables.evolve-schema-enabled=true
   ```
   
     **Avro schema in Schema Registry for the field:**
   
   
`{"type":"long","connect.version":1,"connect.name":"org.apache.kafka.connect.data.Timestamp","logicalType":"timestamp-millis"}`
   
     **Stack trace:**
   
     ```
   java.lang.IllegalArgumentException: Cannot convert to long: java.util.Date
         at 
o.a.i.connect.data.RecordConverter.convertLong(RecordConverter.java:327)
         at 
o.a.i.connect.data.RecordConverter.convertValue(RecordConverter.java:247)
         at 
o.a.i.connect.data.RecordConverter.convertStructValue(RecordConverter.java:227)
         at o.a.i.connect.data.RecordConverter.convert(RecordConverter.java:111)
   ```
   
   The fix also makes convertLong() consistent with convertDateValue(), 
convertTimeValue(), convertOffsetDateTime(), and convertLocalDateTime(), all 
of which already handle java.util.Date using the same 
@SuppressWarnings("JavaUtilDate") pattern.
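
   For reference, the shape of the change looks roughly like this (a hedged 
sketch, not the actual patch; the class and method names are illustrative, 
and the behavior for Date is assumed to be epoch millis to match the 
timestamp-millis logical type):

   ```java
   import java.util.Date;

   // Illustrative sketch of a convertLong() that tolerates java.util.Date,
   // mirroring the @SuppressWarnings("JavaUtilDate") pattern already used by
   // convertDateValue() and friends in RecordConverter.
   public class ConvertLongSketch {

     @SuppressWarnings("JavaUtilDate")
     static long convertLong(Object value) {
       if (value instanceof Number) {
         return ((Number) value).longValue();
       } else if (value instanceof String) {
         return Long.parseLong((String) value);
       } else if (value instanceof Date) {
         // Date produced by the schemaless Avro path: return epoch millis,
         // matching the timestamp-millis logical type in Schema Registry
         return ((Date) value).getTime();
       }
       throw new IllegalArgumentException(
           "Cannot convert to long: " + value.getClass().getName());
     }
   }
   ```

   With this in place, the schemaless path converts the Date to its epoch-millis 
value instead of throwing, while the Number and String branches keep their 
existing behavior.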


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

