Sounds good.

Should we add another paragraph after this one in configuration.md to
explain the executor environment as well? I'd be happy to upload a simple patch.

> Note: When running Spark on YARN in cluster mode, environment variables
> need to be set using the spark.yarn.appMasterEnv.[EnvironmentVariableName]
> property in your conf/spark-defaults.conf file. Environment variables
> that are set in spark-env.sh will not be reflected in the YARN
> Application Master process in cluster mode. See the YARN-related Spark
> Properties
> <https://github.com/apache/spark/blob/master/docs/running-on-yarn.html#spark-properties>
> for more information.


Something like:

Note: When running Spark on YARN, environment variables for the executors
need to be set using the spark.yarn.executorEnv.[EnvironmentVariableName]
property in your conf/spark-defaults.conf file or on the command line.
Environment variables that are set in spark-env.sh will not be reflected in
the executor process.
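
For example (the variable name and paths below are only placeholders for
illustration, not values from the docs), both properties could be set in
conf/spark-defaults.conf:

    spark.yarn.appMasterEnv.PYSPARK_PYTHON   /opt/python3/bin/python3
    spark.yarn.executorEnv.PYSPARK_PYTHON    /opt/python3/bin/python3

or passed on the command line:

    spark-submit --conf spark.yarn.executorEnv.PYSPARK_PYTHON=/opt/python3/bin/python3 ...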



On Wed, Jan 3, 2018 at 7:53 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> Because spark-env.sh is something that makes sense only on the gateway
> machine (where the app is being submitted from).
>
> On Wed, Jan 3, 2018 at 6:46 PM, John Zhuge <john.zh...@gmail.com> wrote:
> > Thanks Jacek and Marcelo!
> >
> > Any reason it is not sourced? Any security consideration?
> >
> >
> > On Wed, Jan 3, 2018 at 9:59 AM, Marcelo Vanzin <van...@cloudera.com>
> > wrote:
> >>
> >> On Tue, Jan 2, 2018 at 10:57 PM, John Zhuge <jzh...@apache.org> wrote:
> >> > I am running Spark 2.0.0 and 2.1.1 on YARN in a Hadoop 2.7.3 cluster.
> >> > Is spark-env.sh sourced when starting the Spark AM container or the
> >> > executor container?
> >>
> >> No, it's not.
> >>
> >> --
> >> Marcelo
> >
> >
> >
> >
> > --
> > John
>
>
>
> --
> Marcelo
>



-- 
John
