Re: Given events with start and end times, how to count the number of simultaneous events using Spark?

2018-09-26 Thread kathleen li
You can use a Spark SQL window function, something like
df.createOrReplaceTempView("dfv")
Select count(eventid) over (partition by start_time, end_time order by
start_time) from dfv
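
A minimal sketch of what that window count computes, modeled in plain Python rather than Spark (the event tuples below are hypothetical sample data). Partitioning by (start_time, end_time) groups events that share an identical interval, so count(eventid) over that partition returns the partition size for every row in it:

```python
# Plain-Python model of count(eventid) OVER (PARTITION BY start_time, end_time):
# events sharing the exact same (start_time, end_time) pair land in one
# partition, and each row sees the size of its partition.
from collections import Counter

# Hypothetical sample rows: (eventid, start_time, end_time)
events = [
    (1, 0, 10),
    (2, 0, 10),
    (3, 5, 15),
]

# Size of each (start_time, end_time) partition
partition_sizes = Counter((s, e) for _, s, e in events)

# Per-row window count, as the SQL above would produce
counts = [partition_sizes[(s, e)] for _, s, e in events]
print(counts)  # [2, 2, 1]
```

Note that this counts events with identical start/end pairs; events that merely overlap in time fall into different partitions.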

Sent from my iPhone

> On Sep 26, 2018, at 11:32 AM, Debajyoti Roy wrote:
> 
> The problem statement and an approach to solve it using windows is described 
> here:
> 
> https://stackoverflow.com/questions/52509498/given-events-with-start-and-end-times-how-to-count-the-number-of-simultaneous-e
> 
> Looking for more elegant/performant solutions, if they exist. TIA!


Given events with start and end times, how to count the number of simultaneous events using Spark?

2018-09-26 Thread Debajyoti Roy
The problem statement and an approach to solve it using windows is
described here:

https://stackoverflow.com/questions/52509498/given-events-with-start-and-end-times-how-to-count-the-number-of-simultaneous-e

Looking for more elegant/performant solutions, if they exist. TIA!
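
One common way to count concurrent events, along the lines of the approach discussed in the linked Stack Overflow question: turn each event into a +1 delta at its start and a -1 delta at its end, sort the deltas by time, and keep a running sum. Sketched here in plain Python with hypothetical sample intervals; in Spark the same idea can be expressed as a union of start/end rows followed by a running sum over a time-ordered window:

```python
# Sweep-line count of simultaneous events: +1 at each start, -1 at each end,
# then a running sum over the time-sorted deltas gives the concurrency level
# at every boundary.

# Hypothetical sample intervals: (start_time, end_time)
events = [(0, 10), (5, 15), (12, 20)]

# Build and sort the deltas. Ties at the same timestamp sort -1 before +1,
# so an event ending exactly when another starts is not counted as overlap.
deltas = sorted(
    [(s, +1) for s, _ in events] + [(e, -1) for _, e in events]
)

running, peak = 0, 0
for _, d in deltas:
    running += d
    peak = max(peak, running)

print(peak)  # 2 -- maximum number of simultaneous events
```

The running sum also gives the concurrency at each boundary, not just the peak, which is usually what the windowed Spark version materializes per row.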