Github user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/15009#discussion_r101083260
--- Diff: launcher/src/main/java/org/apache/spark/launcher/SparkLauncher.java ---
@@ -107,6 +121,30 @@ public static void setConfig(String name, String value) {
     launcherConfig.put(name, value);
   }
+  /**
+   * Specifies that the Spark application should be stopped if the current process goes away.
+   * It tries to stop/kill the Spark application if the {@link LauncherServer} goes away.
+   *
+   * @since 2.2.0
+   * @return This launcher.
+   */
+  public SparkLauncher stopIfLauncherShutdown() {
--- End diff --
I think this comes down to consistency with the API. Generally we try to
support things in all modes, even though we don't always succeed; in this
case, for instance, threads work in YARN cluster mode but not in client mode.
Why wouldn't stop on shutdown work for child processes? I think he said he
had tested that case, and as long as it works I don't see why not to leave it
in, to keep things consistent.
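
For the child-process case, a plain JVM shutdown hook on the launcher side is
enough to tear down the child when the launcher exits. Here is a minimal
sketch of that idea (the app resource, main class, and master are
placeholders, and this is not the PR's actual LauncherServer-based
implementation, just an illustration of why the child-process case can work):

import org.apache.spark.launcher.SparkLauncher;

public class StopOnShutdownSketch {
  public static void main(String[] args) throws Exception {
    // Launch the application as a child process (client-mode style).
    Process spark = new SparkLauncher()
        .setAppResource("/path/to/app.jar")   // placeholder path
        .setMainClass("com.example.MyApp")    // placeholder class
        .setMaster("yarn")
        .launch();

    // When this launcher JVM goes away, the hook fires and kills the
    // child, which is the same effect stopIfLauncherShutdown() aims to
    // provide through LauncherServer.
    Runtime.getRuntime().addShutdownHook(new Thread(() -> {
      if (spark.isAlive()) {
        spark.destroyForcibly();
      }
    }));

    spark.waitFor();
  }
}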