Burak Yavuz commented on SPARK-21590:

[~KevinZwx] there's nothing wrong here. It works as intended, right?

Here's how I would look at this problem. You're dealing with timezones, so the 
better solution is to adjust your timezone rather than setting your window to 
start at a -8 hour offset. Instead, use:
{code:java}
from_utc_timestamp('timestamp, "CST")
{code}
to shift your timestamps, and continue to use window without an offset.
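Spark aside, the effect of shifting timestamps into the local timezone and then bucketing into plain daily windows can be sketched in pure Python. Note that {{day_window}} is an illustrative helper for this sketch, not a Spark API:

```python
from datetime import datetime, timedelta, timezone

CST = timezone(timedelta(hours=8))  # China Standard Time, UTC+8

def day_window(ts_utc: datetime, tz=timezone.utc):
    """Bucket a UTC timestamp into the 1-day window of the given timezone."""
    local = ts_utc.astimezone(tz)  # the shift, analogous to from_utc_timestamp
    start_local = local.replace(hour=0, minute=0, second=0, microsecond=0)
    start = start_local.astimezone(timezone.utc)
    return start, start + timedelta(days=1)

event = datetime(2017, 3, 14, 9, 30, tzinfo=timezone.utc)  # 17:30 CST
start, end = day_window(event, CST)
# start is 2017-03-13 16:00 UTC, i.e. midnight 2017-03-14 in CST
```

The point of the sketch: once the timestamp is viewed in CST, an unshifted midnight-to-midnight window already lands on the boundaries the reporter wants.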

The start offset is generally meant for cases where, for example, you want 30 
minute intervals at a 15 minute offset, to get windows like 10:45-11:15 and 
11:15-11:45. It is not meant to adjust for timezones.
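That intended use is easiest to see as arithmetic. A sketch of tumbling-window bucketing in plain Python over seconds ({{window_start}} is an illustrative stand-in for what Spark computes internally, not a Spark API):

```python
def window_start(ts, duration, start_offset=0):
    """Start of the tumbling window containing ts (all values in seconds)."""
    return ts - ((ts - start_offset) % duration)

# An event at 11:05, with 30-minute windows at a 15-minute start offset,
# lands in the 10:45-11:15 bucket rather than 11:00-11:30.
ts = 11 * 3600 + 5 * 60               # 11:05 as seconds since midnight
start = window_start(ts, 30 * 60, 15 * 60)
# start == 10 * 3600 + 45 * 60, i.e. 10:45
```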

Either way, using +16 works, because:

Let's say something happened on 2017-03-14 17:30:00 CST. That corresponds to 
2017-03-14 09:30:00 UTC. If you set a +16 start offset, this time falls into:
{code}
start: 2017-03-13T16:00:00.000+0000
end:   2017-03-14T16:00:00.000+0000
{code}
which is actually 2017-03-14 00:00 CST - 2017-03-15 00:00 CST and is what you 
want, no?
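The same bucketing arithmetic confirms this in plain Python ({{window_start}} is an illustrative helper, not a Spark API): with a 1-day duration and a +16 hour start offset, the event above lands exactly on the CST day boundary.

```python
from datetime import datetime, timezone

def window_start(ts, duration, start_offset=0):
    # start of the tumbling window containing epoch-seconds ts
    return ts - ((ts - start_offset) % duration)

DAY = 86400
# 2017-03-14 17:30:00 CST == 2017-03-14 09:30:00 UTC
event = int(datetime(2017, 3, 14, 9, 30, tzinfo=timezone.utc).timestamp())
start = window_start(event, DAY, 16 * 3600)
end = start + DAY
print(datetime.fromtimestamp(start, timezone.utc))  # 2017-03-13 16:00:00+00:00
print(datetime.fromtimestamp(end, timezone.utc))    # 2017-03-14 16:00:00+00:00
```

So +16 on a UTC clock and -8 on a CST clock describe the same window boundaries; the positive offset just expresses it within the allowed [0, duration) range.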

You actually need 

> Structured Streaming window start time should support negative values to 
> adjust time zone
> -----------------------------------------------------------------------------------------
>                 Key: SPARK-21590
>                 URL: https://issues.apache.org/jira/browse/SPARK-21590
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 2.0.0, 2.0.1, 2.1.0, 2.2.0
>         Environment: spark 2.2.0
>            Reporter: Kevin Zhang
>              Labels: spark-sql, spark2.2, streaming, structured, timezone, 
> window
> I want to calculate (unique) daily access counts using structured streaming 
> (2.2.0). 
> Currently structured streaming's window with a 1-day duration starts at 
> 00:00:00 UTC and ends at 23:59:59 UTC each day, but my local timezone is CST 
> (UTC + 8 hours) and I
> want the date boundaries to be 00:00:00 CST (that is, 00:00:00 UTC - 8 hours). 
> In Flink I can set the window offset to -8 hours to achieve this, but here in 
> structured streaming if I set the start time (same as the offset in Flink) to -8 
> or any other negative value, I get the following error:
> {code:java}
> Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot 
> resolve 'timewindow(timestamp, 86400000000, 86400000000, -28800000000)' due 
> to data type mismatch: The start time (-28800000000) must be greater than or 
> equal to 0.;;
> {code}
> because the time window validates its input parameters to guarantee each value 
> is greater than or equal to 0.
> So I'm wondering whether we can remove the restriction that the start time 
> cannot be negative.

This message was sent by Atlassian JIRA
