Here is an update to my question:
=====================

Tathagata Das <t...@databricks.com>
Jun 9
to me

Event time is part of the windowed aggregation API. See my slides -
https://www.slideshare.net/mobile/databricks/a-deep-dive-into-structured-streaming

Let me know if that helps you find it. Keeping it short as I am on my
mobile.
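
For reference, a minimal sketch of that kind of windowed aggregation,
modeled on Spark's windowed word-count example (the socket source,
host/port, and column names here are placeholders, not anything from
this thread):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{explode, split, window}

    val spark = SparkSession.builder.appName("WindowedWordCount").getOrCreate()
    import spark.implicits._

    // Placeholder source: lines from a socket, with an arrival timestamp
    // attached that stands in for event time in this sketch.
    val lines = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", "9999")
      .option("includeTimestamp", true)
      .load()  // columns: value: String, timestamp: Timestamp

    // Event time is just a column: window/groupBy over it.
    val words = lines.select(explode(split($"value", " ")).as("word"), $"timestamp")
    val windowedCounts = words
      .groupBy(window($"timestamp", "10 minutes", "5 minutes"), $"word")
      .count()

    windowedCounts.writeStream
      .outputMode("complete")
      .format("console")
      .start()
      .awaitTermination()

Grouping on window($"timestamp", ...) is what keys the aggregation by
event time rather than by arrival time.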
==================

Chang Lim <chang...@gmail.com>
Jun 9
to Tathagata 
Hi TD,

Thanks for the reply.  But I was thinking of "sorting the events by logical
time" - more like the "reorder" operation the Microsoft presenter
introduced in her slides yesterday.

The "group by" is aggregation but does not help in processing events based
on event time ordering.
=================
Tathagata Das
Jun 9
to me 
Aah, that's something that is still out of scope right now.
================

Chang Lim <chang...@gmail.com>
Jun 9

to Tathagata 
Wonder if we can get Microsoft to contribute "reorder" back to Spark :)

Thanks for your excellent work on Spark.

===================
Michael Armbrust <mich...@databricks.com>
Jun 10
to user, me 
There is no special setting for event time (though we will be adding one for
setting a watermark in 2.1 to allow us to reduce the amount of state that
needs to be kept around).  Just window/groupBy on the column that is your
event time.
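
For concreteness, a minimal sketch of both pieces together; the
watermark call shown (withWatermark) is the API that eventually shipped
in Spark 2.1, and the source, host/port, and column names are
placeholders:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.window

    val spark = SparkSession.builder.appName("WatermarkSketch").getOrCreate()
    import spark.implicits._

    // Placeholder source; rename the socket's arrival timestamp so the
    // event-time column is explicit.
    val events = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", "9999")
      .option("includeTimestamp", true)
      .load()
      .withColumnRenamed("timestamp", "eventTime")

    // The watermark bounds how late data may arrive: state for a window
    // can be dropped once max event time seen minus 10 minutes passes
    // the end of the window.
    val counts = events
      .withWatermark("eventTime", "10 minutes")
      .groupBy(window($"eventTime", "5 minutes"))
      .count()

    counts.writeStream
      .outputMode("append") // each window is emitted once the watermark passes it
      .format("console")
      .start()
      .awaitTermination()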


Chang Lim <chang...@gmail.com>
Jun 10

to Michael, user 
Yes, now I realize that.  I did exchange emails with TD on this topic.
The "reorder" function from the Microsoft presentation at Spark Summit
would be a good addition to Spark.  Is this feature on the roadmap?


