yihua commented on code in PR #11947:
URL: https://github.com/apache/hudi/pull/11947#discussion_r1804270406


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/IncrementalRelation.scala:
##########
@@ -90,24 +78,24 @@ class IncrementalRelation(val sqlContext: SQLContext,
    throw new HoodieException("Incremental queries are not supported when meta fields are disabled")
   }
 
+  private val queryContext: IncrementalQueryAnalyzer.QueryContext =
+    IncrementalQueryAnalyzer.builder()
+      .metaClient(metaClient)
+      .startTime(optParams(DataSourceReadOptions.BEGIN_INSTANTTIME.key))
+      .endTime(optParams.getOrElse(DataSourceReadOptions.END_INSTANTTIME.key, null))
+      .rangeType(InstantRange.RangeType.OPEN_CLOSED)

Review Comment:
   I've fixed that to be `CLOSED_CLOSED` in my later commits.  The Hudi 
incremental source or streaming source first determines the range of instants 
through the incremental query analyzer, using either the `OPEN_CLOSED` range 
type when resuming from the last checkpoint, or the `CLOSED_CLOSED` range type 
for the first batch without a checkpoint.  The begin and end completion times 
are then passed down to the relation, with both treated as inclusive.
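
   As a side note for readers of this thread, the difference between the two 
range types can be sketched with a small self-contained example (hypothetical 
helper names, not Hudi's actual `InstantRange` API): `OPEN_CLOSED` excludes 
the start instant (so a checkpointed instant is not reprocessed) while 
`CLOSED_CLOSED` includes it, and both treat the end as inclusive.

```java
// Hypothetical sketch (not Hudi's actual API): illustrates OPEN_CLOSED vs
// CLOSED_CLOSED instant-range semantics described in the comment above.
import java.util.List;
import java.util.stream.Collectors;

public class RangeSketch {
    enum RangeType { OPEN_CLOSED, CLOSED_CLOSED }

    // Instant times in Hudi are lexicographically ordered timestamp strings,
    // so String.compareTo gives the right ordering here.
    static boolean inRange(String instant, String start, String end, RangeType type) {
        boolean afterStart = (type == RangeType.OPEN_CLOSED)
                ? instant.compareTo(start) > 0    // start exclusive: resume past checkpoint
                : instant.compareTo(start) >= 0;  // start inclusive: first batch
        return afterStart && instant.compareTo(end) <= 0;  // end always inclusive
    }

    public static void main(String[] args) {
        List<String> instants = List.of("001", "002", "003", "004");
        // Resuming from checkpoint "002": OPEN_CLOSED skips the checkpointed instant.
        System.out.println(instants.stream()
                .filter(i -> inRange(i, "002", "004", RangeType.OPEN_CLOSED))
                .collect(Collectors.toList()));  // [003, 004]
        // First batch starting at "002": CLOSED_CLOSED includes it.
        System.out.println(instants.stream()
                .filter(i -> inRange(i, "002", "004", RangeType.CLOSED_CLOSED))
                .collect(Collectors.toList()));  // [002, 003, 004]
    }
}
```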



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

Reply via email to