GitHub user vanzin commented on the pull request:
https://github.com/apache/spark/pull/7610#issuecomment-124149833
> Note that --conf spark.app.name in command-line has no effect in
> spark-shell and pyspark. Instead, --name must be used.
This is not critical, but not great either; it points at some inconsistency
in the code somewhere. From looking at SparkSubmit.scala, it should work, but I
haven't explicitly tested it.
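For reference, these are the two invocations being compared (flag names are the documented spark-submit options, which spark-shell passes through; the app name used here is just an example):

```shell
# Setting the app name via a generic config property
# (reported in the PR not to take effect for spark-shell/pyspark)
./bin/spark-shell --conf spark.app.name=MyApp

# Setting it via the dedicated flag (reported to work)
./bin/spark-shell --name MyApp
```

If both really do reach SparkSubmit.scala, they should end up setting the same `spark.app.name` property, which is why the reported difference suggests an inconsistency.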
> spark-sql which doesn't accept --name
Is that true? `SparkSQLEnv` has code to explicitly handle app names set by
the user. If that doesn't work, it sounds like a bug.