Re: Time window on Processing Time

2017-08-30 Thread madhu phatak
Hi,
That's great. Thanks a lot.

On Wed, Aug 30, 2017 at 10:44 AM, Tathagata Das <tathagata.das1...@gmail.com
> wrote:

> Yes, it can be! There is a SQL function called current_timestamp(), which
> is self-explanatory. So I believe you should be able to do something like
>
> import org.apache.spark.sql.functions._
>
> ds.withColumn("processingTime", current_timestamp())
>   .groupBy(window(col("processingTime"), "1 minute"))
>   .count()
>
>
> On Mon, Aug 28, 2017 at 5:46 AM, madhu phatak <phatak@gmail.com>
> wrote:
>
>> Hi,
>> While playing with Structured Streaming, I observed that the window
>> function always requires a time column in the input data, so that means
>> it's event time.
>>
>> Is it possible to use the old Spark Streaming style window function based
>> on processing time? I don't see any documentation on this.
>>
>> --
>> Regards,
>> Madhukara Phatak
>> http://datamantra.io/
>>
>
>


-- 
Regards,
Madhukara Phatak
http://datamantra.io/


Re: Time window on Processing Time

2017-08-29 Thread Tathagata Das
Yes, it can be! There is a SQL function called current_timestamp(), which is
self-explanatory. So I believe you should be able to do something like

import org.apache.spark.sql.functions._

ds.withColumn("processingTime", current_timestamp())
  .groupBy(window(col("processingTime"), "1 minute"))
  .count()
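
For completeness, here is a minimal self-contained sketch of the same idea
wired into a full streaming query; the rate source, console sink, output mode,
and names below are illustrative assumptions rather than anything from the
thread:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ProcessingTimeWindowExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("processing-time-window-example")
      .master("local[*]")
      .getOrCreate()

    // Built-in rate source, used here only to have a stream to aggregate.
    val input = spark.readStream
      .format("rate")
      .option("rowsPerSecond", "10")
      .load()

    // Tag each row with the wall-clock time at which it is processed,
    // then window on that column instead of an event-time column.
    val counts = input
      .withColumn("processingTime", current_timestamp())
      .groupBy(window(col("processingTime"), "1 minute"))
      .count()

    counts.writeStream
      .outputMode("complete")
      .format("console")
      .start()
      .awaitTermination()
  }
}

Because the grouping key is the wall-clock timestamp assigned as each row is
processed, the resulting windows behave like the old DStream-style
processing-time windows rather than event-time windows.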


On Mon, Aug 28, 2017 at 5:46 AM, madhu phatak <phatak@gmail.com> wrote:

> Hi,
> While playing with Structured Streaming, I observed that the window function
> always requires a time column in the input data, so that means it's event
> time.
>
> Is it possible to use the old Spark Streaming style window function based on
> processing time? I don't see any documentation on this.
>
> --
> Regards,
> Madhukara Phatak
> http://datamantra.io/
>


Time window on Processing Time

2017-08-28 Thread madhu phatak
Hi,
While playing with Structured Streaming, I observed that the window function
always requires a time column in the input data, so that means it's event time.

Is it possible to use the old Spark Streaming style window function based on
processing time? I don't see any documentation on this.
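
For context, the event-time windowing referred to above looks roughly like the
following sketch, where ds is an input streaming Dataset and the "eventTime"
column name and watermark are illustrative assumptions:

import org.apache.spark.sql.functions._

// "eventTime" is assumed to be a timestamp column that already exists in the
// input data; the watermark lets Spark drop state for old windows.
val counts = ds
  .withWatermark("eventTime", "10 minutes")
  .groupBy(window(col("eventTime"), "1 minute"))
  .count()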

-- 
Regards,
Madhukara Phatak
http://datamantra.io/