Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/33#discussion_r10142402

    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -135,6 +135,8 @@ class SparkContext(
    
       val isLocal = (master == "local" || master.startsWith("local["))
    
    +  if (master == "yarn-client") System.setProperty("SPARK_YARN_MODE", "true")
    --- End diff --

Wondering: is there any reason not to make SPARK_YARN_MODE a SparkConf value, e.g. spark.yarnMode? It seems like we only use it in places where the SparkConf is visible. Look, for instance, at spark.driver.host and similar settings, where we detect the value at the driver and then set it in the conf so it gets passed to the executors. It would be nice to try to standardize on SparkConf as the way of doing this, unless there is some reason it won't work (which there may be...).
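For illustration, here is a minimal sketch of what the SparkConf-based approach could look like inside SparkContext. Note that spark.yarnMode is a hypothetical key suggested in this comment, not an existing Spark configuration:

    // Hypothetical sketch: record yarn-client mode in the SparkConf instead of
    // a JVM system property. "spark.yarnMode" is an illustrative key only.
    if (master == "yarn-client") {
      conf.set("spark.yarnMode", "true")
    }

    // Downstream code could then read the flag from the conf that is shipped
    // to executors, mirroring how spark.driver.host is propagated:
    val isYarnMode = conf.getBoolean("spark.yarnMode", false)

This would keep the flag scoped to a single SparkContext's configuration rather than to mutable process-wide state, which is part of the motivation for preferring SparkConf here.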