Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/7253#issuecomment-121066858
  
    > Did you consider all possibilities? Speculative execution?
    
    There's no such thing as speculative execution of app attempts.
    
    > if a1 starts at the same time as a2 (if not later, by definition of attempts)
    
    That should never happen (see above, attempts are only started after a 
previous attempt has finished).
    
    > Especially as that is the case @tgravescs argues in his SPARK-8593 
    
    No, it's not. The case he described is when the first attempt finished 
executing (i.e. there's no live SparkContext anymore) but the *log file* where 
the attempt was writing its logs wasn't properly closed. There are never two 
attempts executing at the same time, so it's a safe assumption that if there 
is more than one attempt, all attempts except the most recent one (i.e. the one 
with the most recent start time) are no longer running.
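
    The invariant above can be sketched as follows. This is an illustrative
    sketch, not Spark's actual history-server code: `AttemptInfo` and
    `normalize_attempts` are hypothetical names, and the idea is simply that
    any attempt other than the most recently started one can be treated as
    finished even if its event log was never properly closed.

    ```python
    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class AttemptInfo:
        attempt_id: str
        start_time: int   # epoch millis
        completed: bool   # whether the event log was properly closed

    def normalize_attempts(attempts):
        # Only the attempt with the most recent start time can still be
        # running; mark every earlier attempt as completed, regardless of
        # whether its log file was closed cleanly.
        newest_first = sorted(attempts, key=lambda a: a.start_time, reverse=True)
        if not newest_first:
            return []
        head, *rest = newest_first
        return [head] + [replace(a, completed=True) for a in rest]
    ```

    For example, given attempt "1" (start 100, log never closed) and attempt
    "2" (start 200), the sketch leaves "2" untouched and forces "1" to
    completed.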


