tgravescs commented on PR #42570:
URL: https://github.com/apache/spark/pull/42570#issuecomment-1686328788

   So to clarify the issue here: the original PR that added this validityInterval config (https://github.com/apache/spark/pull/8857/files) just seems to call YARN's setAttemptFailuresValidityInterval(). So you are saying the config works on the YARN side, but on the Spark side we think it's the last attempt when it really isn't. That makes sense.
   
   I'm not sure I understand how your second point above (Spark thinks the application will retry but YARN thinks it was the last attempt) happens with this config, though. What is the scenario there?
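   To make the first mismatch concrete, here is a minimal, self-contained sketch of the two "last attempt" decisions. It does not call Spark or YARN code; the names and logic are illustrative assumptions. The Spark side compares the current attempt id against maxAppAttempts, while YARN with attemptFailuresValidityInterval only counts failures inside the interval, so old failures expire and the two sides can disagree:

   ```scala
   // Illustrative sketch only — not Spark's or YARN's actual implementation.
   object LastAttemptSketch {
     // Spark-side heuristic (assumed): attempt N of maxAppAttempts => last attempt.
     def sparkThinksLastAttempt(attemptId: Int, maxAppAttempts: Int): Boolean =
       attemptId >= maxAppAttempts

     // YARN-side with a validity interval (assumed): only failures within the
     // interval count toward maxAppAttempts, so older failures "expire".
     def yarnThinksLastAttempt(failureTimesMs: Seq[Long],
                               nowMs: Long,
                               validityIntervalMs: Long,
                               maxAppAttempts: Int): Boolean = {
       val recentFailures = failureTimesMs.count(t => nowMs - t <= validityIntervalMs)
       // The current attempt would be recentFailures + 1.
       recentFailures + 1 >= maxAppAttempts
     }
   }
   ```

   With maxAppAttempts = 3, an AM on its third attempt would conclude it is the last one, while YARN, having expired the two old failures, would still allow further retries.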


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

