wangshuo128 commented on issue #26674: [SPARK-30059][CORE]Stop AsyncEventQueue 
when interrupted in dispatch
URL: https://github.com/apache/spark/pull/26674#issuecomment-558961981
 
 
   Thanks for your reply, @squito.
   
   > You would be able to stop the running job if there were only one job, but 
what about with concurrent jobs? 
   
   I didn't quite get the point. What exactly would go wrong if we just stopped 
the event log queue while concurrent jobs are running? Could you explain this in 
more detail?
   
   >  I wonder if we should just have some special case handling in the 
EventLoggingListener to retry once after interrupt?
   
   I agree with this.
   AFAIK, the interruption issue only shows up in the event log queue. However, 
the current approach doesn't seem to cover every case, e.g. being interrupted 
in `queue.take()`.
   One idea is to wrap `EventLoggingListener#logEvent` in an isolated thread and 
handle `InterruptedException` in that thread, so the `AsyncEventQueue` thread 
wouldn't be affected.
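   Just to illustrate the idea, here is a rough sketch (the `IsolatedEventLogger` 
name and the `doLog` parameter are only illustrative, not code from this PR):
   
   ```scala
   import java.util.concurrent.Executors
   
   import org.apache.spark.scheduler.SparkListenerEvent
   
   // Sketch only: run the actual write on a dedicated thread, so an interrupt
   // hits this thread instead of the AsyncEventQueue dispatch thread.
   object IsolatedEventLogger {
     private val logExecutor = Executors.newSingleThreadExecutor()
   
     def submit(event: SparkListenerEvent)(doLog: SparkListenerEvent => Unit): Unit = {
       logExecutor.submit(new Runnable {
         override def run(): Unit = {
           try {
             doLog(event) // e.g. delegate to EventLoggingListener#logEvent
           } catch {
             case _: InterruptedException =>
               // Handle the interrupt here (log a warning, maybe retry once)
               // rather than letting it propagate to the dispatch thread.
               Thread.currentThread().interrupt()
           }
         }
       })
     }
   }
   ```
   
   Of course, this makes the write asynchronous relative to dispatch, so event 
ordering and flushing on stop would need some extra care.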
   
