mridulm commented on pull request #33869:
URL: https://github.com/apache/spark/pull/33869#issuecomment-915428381
@tgravescs Initiating shutdown does not stop existing threads: all of them keep
running (daemon threads, non-daemon 'regular' threads, and the shutdown hooks
themselves), and the VM halts once all shutdown hooks have completed.
A simple test to validate this:
```java
public class Test {
  public static void main(String[] args) throws Exception {
    // Shutdown hook: keeps printing for ~10 seconds after shutdown is initiated.
    Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
      public void run() {
        System.out.println("From shutdown hook");
        for (int i = 0; i < 10; i++) {
          try { Thread.sleep(1000); } catch (Exception ex) {}
          System.out.println("from hook ... i = " + i);
        }
      }
    }));

    // Daemon thread: loops forever, printing every 500 ms.
    Thread th = new Thread(new Runnable() {
      public void run() {
        System.out.println("From busy thread");
        int i = 0;
        while (true) {
          i += 1;
          try { Thread.sleep(500); } catch (Exception ex) {}
          System.out.println("from busy ... i = " + i);
        }
      }
    });
    th.setDaemon(true);
    th.start();

    // Main (non-daemon) thread: loops forever, printing every second.
    int mainCount = 0;
    while (true) {
      System.out.println("Main thread ... " + mainCount);
      mainCount += 1;
      Thread.sleep(1000);
    }
  }
}
```
(Compile and run it, then press Ctrl+C to initiate shutdown: messages keep
appearing from all three threads until the shutdown hook finishes and the VM
exits.)
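The same point applies to the non-daemon 'regular' threads mentioned above. A
minimal variant sketch of the test (the class name TestNonDaemon is just an
illustrative placeholder, not from the PR) replaces the daemon thread with a
regular worker thread; after Ctrl+C the worker keeps printing while the hook
runs, and the VM still exits as soon as the hook returns, even though the
worker is non-daemon:
```java
// Variant sketch (assumption, not part of the PR): same idea as Test above,
// but the worker is a regular (non-daemon) thread. Requires Java 8+ for lambdas.
public class TestNonDaemon {
  public static void main(String[] args) throws Exception {
    // Shutdown hook: prints for ~10 seconds once shutdown is initiated.
    Runtime.getRuntime().addShutdownHook(new Thread(() -> {
      System.out.println("From shutdown hook");
      for (int i = 0; i < 10; i++) {
        try { Thread.sleep(1000); } catch (Exception ex) {}
        System.out.println("from hook ... i = " + i);
      }
    }));

    // Regular (non-daemon) worker thread: loops forever, printing every 500 ms.
    Thread worker = new Thread(() -> {
      int i = 0;
      while (true) {
        i += 1;
        try { Thread.sleep(500); } catch (Exception ex) {}
        System.out.println("from worker ... i = " + i);
      }
    });
    worker.start();

    // Block the main thread so only the worker and (later) the hook produce output.
    worker.join();
  }
}
```
As with the test above, press Ctrl+C after it starts: the worker's output stays
interleaved with the hook's output, and the process ends when the hook's loop
completes.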