Hi All,

I get an AnalysisException when I run the following query:

spark.sql("select current_timestamp() as tsp, count(*) from table group by window(tsp, '5 minutes')")

I just want to create a processing-time column and run a simple
stateful query like the one above. I understand current_timestamp() is
non-deterministic; if that is the problem, how can I add a processing-time
column and use group by to do stateful aggregation?
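To make the question concrete, here is one rewrite I experimented with: materializing the processing-time column in a subquery so the `tsp` alias is visible to the GROUP BY. This is only a sketch (the table name `table` is a placeholder, and I am not sure this rewrite avoids Spark's non-determinism check on grouping expressions); the query string is built in plain Python so it can be inspected without a live SparkSession:

```python
# Sketch only: build the rewritten query as a plain string.
# `table` is a placeholder source-table name (assumption).
query = (
    "SELECT window(tsp, '5 minutes') AS win, count(*) AS cnt "
    "FROM (SELECT current_timestamp() AS tsp FROM table) "
    "GROUP BY window(tsp, '5 minutes')"
)

# With a live SparkSession this would run as: spark.sql(query)
print(query)
```

Is this subquery form the right way to introduce a processing-time column, or is there a recommended DataFrame-API pattern instead?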
