Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/8832#issuecomment-142158322
> if the master does not contain deploy mode information, the deploy mode is "client", else get deploy mode from the master URL.
We're actually deprecating the master URLs `yarn-client` and `yarn-cluster`
(#8385). In general I find it pretty confusing to have the deploy mode embedded
in the master URL.
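For concreteness, a sketch of the two invocation styles (the app jar and main class below are placeholders, not from this PR):

```shell
# Deprecated style: deploy mode embedded in the master URL
spark-submit --master yarn-cluster --class com.example.App app.jar

# Preferred style: master and deploy mode given separately
spark-submit --master yarn --deploy-mode cluster --class com.example.App app.jar
```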
> I think this issue is not related to SparkR only. @andrewor14, any policy when handling default configurations when a spark-application is not launched via spark-submit? Should we extract some logic from spark-submit so that the logic can be called in non-spark-submit cases to keep consistency?
We do have the launcher library, but beyond that I don't think this is something we support, because it gets pretty difficult to maintain. AFAIK deploy mode is the only such config, so this should be fine.
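For reference, the launcher library drives spark-submit programmatically; a minimal sketch using `org.apache.spark.launcher.SparkLauncher` (the paths, jar, and class name are placeholder assumptions, and this requires Spark's launcher jar on the classpath):

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchExample {
    public static void main(String[] args) throws Exception {
        // Master and deploy mode are set separately here, mirroring the
        // spark-submit flags rather than a yarn-client/yarn-cluster URL.
        // All paths and names below are hypothetical placeholders.
        Process spark = new SparkLauncher()
            .setSparkHome("/path/to/spark")
            .setAppResource("/path/to/app.jar")
            .setMainClass("com.example.App")
            .setMaster("yarn")
            .setDeployMode("client")
            .launch();
        spark.waitFor();
    }
}
```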