squito commented on issue #26674: [SPARK-30059][CORE]Stop AsyncEventQueue when 
interrupted in dispatch
URL: https://github.com/apache/spark/pull/26674#issuecomment-559078316
 
 
   >> You would be able to stop the running job if there were only one job, but 
what about with concurrent jobs?
   
   > I didn't get your point. What would happen if we just stop the event log queue while concurrent jobs are running? Could you explain this in detail?
   
   sorry, please ignore that -- I misread your earlier comments; I had thought you were discussing stopping running jobs.
   
   > AFAIK, the interruption issue only appears in the event log queue; however, the current approach doesn't seem to cover all the cases, e.g. an interrupt in `queue.take()`.
   > I came up with the idea of wrapping `EventLoggingListener#logEvent` in an isolated thread and handling `InterruptedException` in that thread, so the `AsyncEventQueue` thread wouldn't be affected.
   
   yes, good point.  I'd need to walk through this very carefully, but that sounds reasonable to me.
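   To make the proposal concrete: a minimal Java sketch of the isolated-writer idea, under the stated assumptions. `IsolatedEventLogger`, `doWrite`, and the `written` list are hypothetical names for illustration; this is not Spark's actual `EventLoggingListener` implementation, only the pattern of confining an `InterruptedException` to a dedicated writer thread so the dispatch thread survives.
   
   ```java
   import java.util.List;
   import java.util.concurrent.CopyOnWriteArrayList;
   import java.util.concurrent.ExecutorService;
   import java.util.concurrent.Executors;
   import java.util.concurrent.TimeUnit;
   
   public class IsolatedEventLogger {
       // Visible for the usage example; a real listener would write JSON to a file instead.
       public final List<String> written = new CopyOnWriteArrayList<>();
   
       // Single dedicated thread that performs all event-log writes.
       private final ExecutorService writer =
           Executors.newSingleThreadExecutor(r -> {
               Thread t = new Thread(r, "event-log-writer");
               t.setDaemon(true);
               return t;
           });
   
       // Stand-in for EventLoggingListener#logEvent; assume the real write can
       // block on I/O and be interrupted (e.g. by the HDFS client).
       private void doWrite(String event) throws InterruptedException {
           if (Thread.interrupted()) {
               throw new InterruptedException("write interrupted");
           }
           written.add(event);
       }
   
       /** Called from the dispatch thread; interruption never reaches the caller. */
       public void logEvent(String event) {
           writer.submit(() -> {
               try {
                   doWrite(event);
               } catch (InterruptedException e) {
                   // Confine the interrupt to the writer thread: restore the flag
                   // and drop this event instead of killing the dispatch loop.
                   Thread.currentThread().interrupt();
               }
           });
       }
   
       public void stop() throws InterruptedException {
           writer.shutdown();
           writer.awaitTermination(5, TimeUnit.SECONDS);
       }
   }
   ```
   
   The design choice here is that the dispatch thread only ever submits to the executor and never blocks on the write itself, so an interrupt delivered during a slow write lands on the writer thread alone.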

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
