I'm sorry. I realized the formatting was lost in the previous email. To avoid
confusion, please find the ask again with the table schemas correctly
formatted.


I have two tables whose date information contains hours and minutes along with
the date. Sample schemas of the two tables are as follows -



Table 1 -

[cid:[email protected]]



Table 2 –

[cid:[email protected]]





Multiple records are written to both of these tables every minute.

My ask is to compute the average of the sales column over every 30 minutes of
data (the average from hh:01 to hh:30, and from hh:31 to (hh+1):00). I am
unable to think of a way to do this with Griffin. Is there a way we can model
this scenario?
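For reference, the half-hour boundaries described above (hh:01-hh:30, then hh:31-(hh+1):00) can be expressed as simple integer arithmetic: subtracting one minute before dividing by 30 makes hh:30 and (hh+1):00 land at the end of their buckets. The sketch below is plain Python over (hour, minute, sales) rows, just to illustrate the bucketing rule; it is not Griffin syntax, though the same arithmetic could be used inside a Spark-SQL rule.

```python
from collections import defaultdict

def half_hour_bucket(hour, minute):
    """Map hh:mm to a half-hour bucket index.

    hh:01 .. hh:30 share one bucket, and hh:31 .. (hh+1):00 share the
    next: subtracting one minute before integer division puts hh:30
    and (hh+1):00 at the *end* of their buckets, matching the ask.
    (Note: 00:00 maps to the last bucket of the previous day.)
    """
    return (hour * 60 + minute - 1) // 30

def half_hour_averages(records):
    """records: iterable of (hour, minute, sales) tuples for one day.

    Returns {bucket_index: average sales over that half hour}.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for hour, minute, sales in records:
        b = half_hour_bucket(hour, minute)
        sums[b] += sales
        counts[b] += 1
    return {b: sums[b] / counts[b] for b in sums}

# Three rows in 10:01-10:30 and two rows in 10:31-11:00:
rows = [(10, 5, 100), (10, 15, 300), (10, 30, 200),
        (10, 31, 400), (11, 0, 600)]
print(half_hour_averages(rows))  # {20: 200.0, 21: 500.0}
```

Note that 10:30 and 11:00 fall into different buckets, while 10:31 and 11:00 share one, which is exactly the grouping described above.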



Thanks a lot for your help in advance.



Regards,

Vikram









-----Original Message-----
From: Vikram Jain <[email protected]>
Sent: Friday, September 27, 2019 11:20 AM
To: [email protected]
Subject: Aggregating data every 30 minutes in a profiling job






Hello All,

I have two tables whose date information contains hours and minutes along with
the date. Sample schemas of the two tables are as follows -



Table 1 -

Day          Hour   Minute   Sales
yyyy-MM-dd   HH     MM       10001





Table 2 -

Day                  Sales
yyyy-MM-dd : hh.mm   10001





Multiple records are written to both of these tables every minute.

My ask is to compute the average of the sales column over every 30 minutes of
data (the average from hh:01 to hh:30, and from hh:31 to (hh+1):00). I am
unable to think of a way to do this with Griffin. Is there a way we can model
this scenario?



Thanks a lot for your help in advance.



Regards,

Vikram
