Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/6360#discussion_r31774531
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -490,15 +521,32 @@ private[spark] class Client(
env("SPARK_YARN_USER_ENV") = userEnvs
}
- // if spark.submit.pyArchives is in sparkConf, append pyArchives to PYTHONPATH
- // that can be passed on to the ApplicationMaster and the executors.
- if (sparkConf.contains("spark.submit.pyArchives")) {
-   var pythonPath = sparkConf.get("spark.submit.pyArchives")
-   if (env.contains("PYTHONPATH")) {
-     pythonPath = Seq(env.get("PYTHONPATH"), pythonPath).mkString(File.pathSeparator)
+ // If pyFiles contains any .py files, we need to add LOCALIZED_PYTHON_DIR to the PYTHONPATH
+ // of the container processes too. Add all non-.py files directly to PYTHONPATH.
+ //
+ // NOTE: the code currently does not handle .py files defined with a "local:" scheme.
--- End diff --
This is a general problem with pyspark (not handling `local:` URIs), not particular to YARN...
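For reference, the PYTHONPATH merge that the removed branch performed can be sketched as a small standalone helper. This is a hypothetical illustration, not Spark's actual API: `mergePythonPath` and its signature are invented here; only the `mkString(File.pathSeparator)` joining mirrors the diff above.

```scala
import java.io.File

object PythonPathSketch {
  // Hypothetical helper: prepend any existing PYTHONPATH entry before the
  // new archives, joined with the platform's path separator (":" on Unix).
  def mergePythonPath(existing: Option[String], archives: String): String =
    (existing.toSeq :+ archives).mkString(File.pathSeparator)
}
```

With no existing PYTHONPATH the archives are returned unchanged; otherwise the existing value comes first, matching the order in the removed code.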