Hi Nirmal,

+1 for showing the last execution time against each Spark script that ran
successfully. But why do we need the average execution time stat? What
administrators need to know is whether a specific Spark script has run
successfully. By showing the last execution time, they can determine
whether it ran as per the specified cron interval.
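For context, the average stat Nirmal proposes could be kept at negligible cost by
retaining only the runs inside the window of interest. This is a minimal sketch with
hypothetical names (ScriptStats, record, average_duration) -- not actual DAS code --
assuming each script run reports its timestamp and duration:

```python
from collections import deque

class ScriptStats:
    """Hypothetical sketch: last execution time plus a windowed
    average duration (e.g. last hour) for one Spark script."""

    def __init__(self, window_seconds=3600):
        self.window = window_seconds
        self.runs = deque()          # (timestamp, duration) pairs
        self.last_execution = None   # timestamp of the most recent run

    def record(self, timestamp, duration):
        self.last_execution = timestamp
        self.runs.append((timestamp, duration))
        # Evict runs that fell out of the window, so memory and the
        # average both stay bounded to the window size.
        while self.runs and self.runs[0][0] < timestamp - self.window:
            self.runs.popleft()

    def average_duration(self):
        if not self.runs:
            return None
        return sum(d for _, d in self.runs) / len(self.runs)

stats = ScriptStats(window_seconds=3600)
stats.record(timestamp=1000.0, duration=2.5)
stats.record(timestamp=2000.0, duration=3.5)
print(stats.average_duration())  # -> 3.0
```

Per-run work is a constant-time append plus amortized eviction, which supports the
point that the overhead of the average calculation would be small.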

Regards,
Damith

On Mon, Jan 22, 2018 at 3:25 PM, Nirmal Fernando <[email protected]> wrote:

> Hi All,
>
> Currently, there's no way to see the last execution time against scheduled
> spark scripts of DAS. I think it'll be a useful feature for DAS
> administrators. Wdyt?
>
> Further, if we can calculate the average execution times of a Spark script
> during the last hour or the past day, that'll be useful as well. Since it's
> an average calculation, there won't be a big overhead.
>
> --
>
> Thanks & regards,
> Nirmal
>
> Technical Lead, WSO2 Inc.
> Mobile: +94715779733
> Blog: http://nirmalfdo.blogspot.com/
>
>
>


-- 
Senior Software Engineer
WSO2 Inc.; http://wso2.com
lean.enterprise.middleware

mobile: +94728671315
_______________________________________________
Architecture mailing list
[email protected]
https://mail.wso2.org/cgi-bin/mailman/listinfo/architecture