[ https://issues.apache.org/jira/browse/SPARK-14049?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15205287#comment-15205287 ]

Apache Spark commented on SPARK-14049:
--------------------------------------

User 'paragpc' has created a pull request for this issue:
https://github.com/apache/spark/pull/11867

> Add functionality in Spark history server API to query applications by end 
> time 
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-14049
>                 URL: https://issues.apache.org/jira/browse/SPARK-14049
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.1, 2.0.0
>            Reporter: Parag Chaudhari
>
> Currently, the Spark history server can filter applications by start time 
> range via the minDate and maxDate query parameters, but it lacks support for 
> querying applications by their end time. This JIRA proposes adding optional 
> minEndDate and maxEndDate query parameters to the history server, with 
> filtering based on them. This enables queries such as:
> 1. Applications finished in the last 'x' minutes
> 2. Applications finished before time 'y'
> 3. Applications finished between time 'x' and time 'y'
> 4. Applications started after time 'x' and finished before time 'y'
> For backward compatibility, the existing minDate and maxDate query 
> parameters remain as they are and continue to filter on the start time 
> range.
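
For reference, a minimal Scala sketch of how the proposed parameters might be
used against the history server REST API. The /api/v1/applications endpoint,
the default port 18080, and the existing minDate/maxDate parameters are part
of the current API; everything about minEndDate/maxEndDate is an assumption
based on the proposal above (names and date format are not yet shipped
behavior):

    import scala.io.Source

    // Base URL of a running history server (18080 is the default port).
    val historyServer = "http://localhost:18080"

    // Query 3 from the list above: applications that finished between
    // time 'x' and time 'y'. minEndDate/maxEndDate are the parameters
    // proposed in this JIRA; minDate/maxDate (start-time filters) already
    // exist and keep working unchanged.
    val url = s"$historyServer/api/v1/applications" +
      "?minEndDate=2016-03-20&maxEndDate=2016-03-21"

    // The endpoint returns a JSON array of matching application summaries.
    val apps = Source.fromURL(url).mkString
    println(apps)

Query 4 (started after 'x', finished before 'y') would simply combine the
existing minDate with the proposed maxEndDate in the same way.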



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
