Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/6671#discussion_r32330676
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -460,7 +460,9 @@ private[deploy] class SparkSubmitArguments(args: Seq[String], env: Map[String, S
        |                              on one of the worker machines inside the cluster ("cluster")
        |                              (Default: client).
        |  --class CLASS_NAME          Your application's main class (for Java / Scala apps).
-       |  --name NAME                 A name of your application.
+       |  --name NAME                 A name of your application. In yarn-cluster mode the name
--- End diff --
Hi @ehnalis,
I don't really have a strong opinion about whether `setAppName` should
override the command line or not, other than that it changes the semantics of
how it currently works.
I can see how it may lead to some confusion, since cluster-mode apps would
behave slightly differently, but the user can always achieve the same thing by
calling `SparkConf.set("spark.app.name", "foo")` directly, unless you add logic
to SparkConf to not allow that key to be overridden once it's set.
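To illustrate the "not allow that key to be overridden once it's set" idea, here is a minimal sketch using a hypothetical stand-in class (`FrozenKeyConf` is not Spark's actual `SparkConf`; Spark does not currently freeze keys this way):

```scala
import scala.collection.mutable

// Hypothetical stand-in for SparkConf that ignores writes to certain
// keys after they have been set once. Purely illustrative.
class FrozenKeyConf(frozenKeys: Set[String]) {
  private val settings = mutable.Map[String, String]()

  def set(key: String, value: String): this.type = {
    // Only apply the write if the key is not frozen-and-already-set.
    if (!(frozenKeys.contains(key) && settings.contains(key))) {
      settings(key) = value
    }
    this
  }

  def get(key: String): Option[String] = settings.get(key)
}

// The first writer (e.g. the command line) wins; later calls
// (e.g. setAppName) are silently ignored for the frozen key.
val conf = new FrozenKeyConf(Set("spark.app.name"))
conf.set("spark.app.name", "fromCommandLine")
conf.set("spark.app.name", "fromSetAppName")
conf.set("spark.master", "local")
conf.set("spark.master", "yarn")
```

With this sketch, `conf.get("spark.app.name")` keeps the first value, while a non-frozen key such as `spark.master` still takes the last write.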