Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16304#discussion_r93353338
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamExecution.scala ---
    @@ -387,7 +387,7 @@ class StreamExecution(
             lastExecution.executedPlan.collect {
               case e: EventTimeWatermarkExec if e.eventTimeStats.value.count > 0 =>
                 logDebug(s"Observed event time stats: ${e.eventTimeStats.value}")
    -            e.eventTimeStats.value.max - e.delay.milliseconds
    +            math.max(0, e.eventTimeStats.value.max - e.delayMs)
    --- End diff --
    
    I think a lot of things are going to break for that use case. I don't think
    our SQL functions, Java time formatting, etc. are even designed to handle
    negative millis. The way I found this issue is that when I tried to convert
    a negative watermark to a formatted string, it gave a very weird date. So I
    don't think we should add complexity for that use case.
    @marmbrus any thoughts?
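
    For reference, a minimal standalone sketch (not Spark code; the object name
    and the numbers are made up for illustration) of why a negative watermark
    turns into a weird date when formatted, and how clamping at zero avoids it:

        import java.text.SimpleDateFormat
        import java.util.{Date, TimeZone}

        object NegativeWatermarkSketch {
          def main(args: Array[String]): Unit = {
            val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS")
            format.setTimeZone(TimeZone.getTimeZone("UTC"))

            // Hypothetical stats: max observed event time is 5 s after the epoch,
            // but the configured watermark delay is 10 s, so the raw difference
            // is negative.
            val maxEventTimeMs = 5000L
            val delayMs = 10000L

            val rawWatermarkMs = maxEventTimeMs - delayMs                   // -5000
            val clampedWatermarkMs = math.max(0L, maxEventTimeMs - delayMs) // 0

            println(format.format(new Date(rawWatermarkMs)))
            // 1969-12-31 23:59:55.000  <- the "very weird date"
            println(format.format(new Date(clampedWatermarkMs)))
            // 1970-01-01 00:00:00.000
          }
        }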

