singhpk234 commented on code in PR #5160:
URL: https://github.com/apache/iceberg/pull/5160#discussion_r911616713


##########
spark/v3.3/spark/src/main/java/org/apache/iceberg/spark/source/IcebergSource.java:
##########
@@ -173,7 +174,19 @@ public Optional<String> extractTimeTravelVersion(CaseInsensitiveStringMap option
 
   @Override
   public Optional<String> extractTimeTravelTimestamp(CaseInsensitiveStringMap options) {
-    return Optional.ofNullable(PropertyUtil.propertyAsString(options, "timestampAsOf", null));
+    String timestampAsOf = PropertyUtil.propertyAsString(options, "timestampAsOf", null);
+    if (timestampAsOf == null) {
+      return Optional.empty();
+    }
+
+    try {
+      // timestamp provided should be at a seconds precision.
+      // TODO: remove once https://issues.apache.org/jira/browse/SPARK-39633 is resolved
+      long timestampAsOfAsLong = Long.parseLong(timestampAsOf);
+      return Optional.of(DateTimeUtil.formatTimestampMillisWithLocalTime(timestampAsOfAsLong * 1000));

Review Comment:
   Makes sense. Calling Spark APIs from Iceberg would also not be OK then, since it indirectly moves the translation logic (i.e., which Spark API to use) into Iceberg. Should I add a TODO in the code linking the issue, and mark the UT added in this PR as ignored? As soon as we upgrade to 3.3.1 we will have the fix (the PR for [SPARK-39633](https://issues.apache.org/jira/browse/SPARK-39633) is merged upstream).
   
   Meanwhile, we have an existing way to time travel via dataframe options: we can specify [as-of-timestamp](https://iceberg.apache.org/docs/latest/spark-queries/#time-travel) in milliseconds.
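   To make the seconds-vs-milliseconds point concrete, here is a minimal, self-contained sketch of the conversion the diff performs. It assumes a hypothetical helper `toTimestampString` standing in for Iceberg's `DateTimeUtil.formatTimestampMillisWithLocalTime`, and formats at UTC so the example output is deterministic (the real utility formats with local time):
   
   ```java
   import java.time.Instant;
   import java.time.ZoneOffset;
   import java.time.format.DateTimeFormatter;
   
   public class TimeTravelTimestampSketch {
     // Hypothetical helper mirroring the diff: the "timestampAsOf" option value
     // arrives as epoch SECONDS, but the formatter expects MILLISECONDS,
     // hence the multiplication by 1000.
     static String toTimestampString(String timestampAsOf) {
       long seconds = Long.parseLong(timestampAsOf);
       long millis = seconds * 1000L;
       // UTC is used here only so the example is reproducible.
       return DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
           .withZone(ZoneOffset.UTC)
           .format(Instant.ofEpochMilli(millis));
     }
   
     public static void main(String[] args) {
       // Epoch start, given in seconds.
       System.out.println(toTimestampString("0"));
     }
   }
   ```
   
   Note that the `as-of-timestamp` dataframe option mentioned above takes milliseconds directly, so no such scaling is needed on that path.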



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

