Github user renozhang commented on a diff in the pull request:
https://github.com/apache/spark/pull/13836#discussion_r68178303
--- Diff: python/pyspark/context.py ---
@@ -156,7 +156,7 @@ def _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize,
self.sparkHome = self._conf.get("spark.home", None)
# Let YARN know it's a pyspark app, so it distributes needed libraries.
- if self.master == "yarn-client":
+ if self.master == "yarn":
--- End diff ---
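For context on the change above: Spark 2.x deprecated the combined
"yarn-client"/"yarn-cluster" master values in favor of master "yarn" plus a
separate deploy mode, which is why the check now matches "yarn" alone. A
minimal sketch of that mapping (hypothetical helper, not the actual Spark
code):

    def normalize_master(master, deploy_mode="client"):
        # Map the legacy Spark 1.x combined YARN master strings onto the
        # 2.x (master, deploy_mode) pairs; pass other masters through.
        legacy = {"yarn-client": ("yarn", "client"),
                  "yarn-cluster": ("yarn", "cluster")}
        return legacy.get(master, (master, deploy_mode))

    assert normalize_master("yarn-client") == ("yarn", "client")
    assert normalize_master("yarn", "cluster") == ("yarn", "cluster")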
Thanks for the quick reply.
I think my problem is that I built the whole project with -Pscala-2.10. I'm
not sure whether I missed some documentation on building Spark 2.x with
Scala 2.10, or whether Spark 2.x is simply not compatible with Scala 2.10.
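For reference, the Spark 2.0 building docs (docs/building-spark.md) describe
producing a Scala 2.10 build in two steps, first rewriting the POMs and then
building with the scala-2.10 property; the commands below are taken from
those docs and are worth double-checking against the branch being built:

    ./dev/change-scala-version.sh 2.10
    ./build/mvn -Pyarn -Dscala-2.10 -DskipTests clean package

Note that the docs use the -Dscala-2.10 property rather than a -Pscala-2.10
profile, and require running the version-change script before building.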