[ https://issues.apache.org/jira/browse/SPARK-10781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hieu Tri Huynh updated SPARK-10781:
-----------------------------------
    Attachment: SPARK_10781_Proposed_Solution.pdf

> Allow certain number of failed tasks and allow job to succeed
> -------------------------------------------------------------
>
>                 Key: SPARK-10781
>                 URL: https://issues.apache.org/jira/browse/SPARK-10781
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.5.0
>            Reporter: Thomas Graves
>            Priority: Major
>         Attachments: SPARK_10781_Proposed_Solution.pdf
>
>
> MapReduce has the configs mapreduce.map.failures.maxpercent and 
> mapreduce.reduce.failures.maxpercent, which allow a certain percentage of 
> tasks to fail while the job still succeeds.
> This could also be a useful feature in Spark, for jobs that do not need 
> every task to be successful.
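
For context, a minimal sketch of how the MapReduce settings cited in the
description are applied on the Hadoop side (Scala over the Hadoop Java API);
the 5% threshold is an arbitrary illustration, not a value from this issue:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.mapreduce.Job

    val conf = new Configuration()
    // Allow the job to succeed even if up to 5% of map or reduce tasks
    // fail (5 is an illustrative value, not taken from this issue).
    conf.setInt("mapreduce.map.failures.maxpercent", 5)
    conf.setInt("mapreduce.reduce.failures.maxpercent", 5)
    val job = Job.getInstance(conf, "failure-tolerant-job")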


