GitHub user vanzin commented on the pull request:
https://github.com/apache/spark/pull/7610#issuecomment-124240067
> As to why --conf spark.app.name doesn't work in pyspark, if you look at
> context.py, line #135
Hmm, it may not work, but I don't think that's the cause. With your
changes, that line should never be reached when starting the shell. What I
think is happening is:
- SparkSubmit sets the app name in SparkSubmit.scala, L470 (processing the
option defined in L405).
- In SparkSubmit.scala L555, the conf is read, but at that point
`spark.app.name` is already set, so it's not overwritten (see the sketch
below).
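Roughly, the ordering boils down to something like this (a simplified
sketch with made-up names, not the actual SparkSubmit code):

```scala
// Sketch of the ordering issue (hypothetical names): the app name is
// filled in before the user's --conf entries are merged, and the merge
// only sets keys that are still missing.
object OrderingSketch {
  def main(args: Array[String]): Unit = {
    val sparkConf = scala.collection.mutable.Map[String, String]()

    // First (around SparkSubmit.scala L470): the app name derived from
    // --name / the primary resource is written into the conf.
    sparkConf("spark.app.name") = "PySparkShell"

    // Later (around SparkSubmit.scala L555): user-supplied --conf pairs
    // are merged, but keys that are already set are not overwritten.
    val userConf = Map("spark.app.name" -> "MyAppName")
    userConf.foreach { case (k, v) =>
      if (!sparkConf.contains(k)) sparkConf(k) = v
    }

    // The --conf value is silently ignored because of the ordering.
    println(sparkConf("spark.app.name"))  // prints "PySparkShell"
  }
}
```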
So it seems like an ordering issue in SparkSubmit.scala. In any case, it
doesn't seem important enough to change just for this particular edge case. The
change LGTM.