Hi, friend:

   I set the environment variables in conf/spark-defaults.conf (note: 
spark-defaults.conf, not spark-default.conf) and use the bin/spark-submit 
script to submit the Spark application. That works for me.
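
  For example, a spark-defaults.conf roughly along these lines should forward 
the variables to the executors (the spark.executorEnv.* prefix adds an 
environment variable to the executor process; the values below are 
placeholders):

  # conf/spark-defaults.conf -- placeholder values, adjust to your setup
  spark.executorEnv.AWS_ACCESS_KEY_ID      <your-access-key-id>
  spark.executorEnv.AWS_SECRET_ACCESS_KEY  <your-secret-access-key>
  spark.executor.extraJavaOptions          -Dmy.custom.flag=true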

  In addition, you can pass --properties-file FILE to bin/spark-submit (the 
path to a file from which to load extra properties; if not specified, 
spark-submit looks for conf/spark-defaults.conf).
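
  A minimal invocation might look like this (the class name, master URL and 
file paths are just placeholders):

  bin/spark-submit \
    --class com.example.MyApp \
    --master spark://master:7077 \
    --properties-file /path/to/my-spark.conf \
    myapp.jar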

  @org.apache.spark.deploy.SparkSubmitArguments : loadDefaults
  // Use common defaults file, if not specified by user
  if (propertiesFile == null) {
    sys.env.get("SPARK_HOME").foreach { sparkHome =>
      val sep = File.separator
      val defaultPath = s"${sparkHome}${sep}conf${sep}spark-defaults.conf"
      val file = new File(defaultPath)
      if (file.exists()) {
        propertiesFile = file.getAbsolutePath
      }
    }
  }

  @org.apache.spark.deploy.SparkSubmit : createLaunchEnv
  // Read from default spark properties, if any 
  for ((k, v) <- args.getDefaultSparkProperties) { 
      if (!sysProps.contains(k)) sysProps(k) = v 
  }

  @org.apache.spark.deploy.SparkSubmit : launch  (You can also use this approach 
to set system properties before you initialize a SparkContext.)
  for ((key, value) <- sysProps) { 
     System.setProperty(key, value) 
  }
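
  For example, a rough sketch of doing the equivalent by hand in your own 
driver code, before the SparkContext is created (the property and app name 
below are just illustrations):

  import org.apache.spark.{SparkConf, SparkContext}

  // Set the property before SparkConf/SparkContext are created, mirroring what
  // SparkSubmit.launch does with sysProps above. SparkConf picks up spark.*
  // system properties when it is constructed.
  System.setProperty("spark.executor.extraJavaOptions", "-Dmy.custom.flag=true")

  val conf = new SparkConf().setAppName("my-app")
  val sc = new SparkContext(conf)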

Best Regards


Zhanfeng Huo
 
From: Akhil Das
Date: 2014-08-21 14:36
To: Darin McBeath
CC: Spark User Group
Subject: Re: How to pass env variables from master to executors within 
spark-shell
One approach would be to set these environment variables inside spark-env.sh 
on all workers; then you can access them using 
System.getenv("WHATEVER")
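
For instance, assuming conf/spark-env.sh on every worker contains something 
like "export AWS_ACCESS_KEY_ID=...", a quick sketch to check that the executors 
can see it:

  // The map closure runs on the executors, so System.getenv reads the
  // variable exported by each worker's spark-env.sh.
  val visible = sc.parallelize(1 to 4).map { _ =>
    Option(System.getenv("AWS_ACCESS_KEY_ID")).isDefined
  }.collect()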

Thanks
Best Regards


On Wed, Aug 20, 2014 at 9:49 PM, Darin McBeath <ddmcbe...@yahoo.com.invalid> 
wrote:
Can't seem to figure this out.  I've tried several different approaches without 
success. For example, I've tried setting spark.executor.extraJavaOptions in the 
spark-default.conf (prior to starting the spark-shell) but this seems to have 
no effect.

Outside of spark-shell (within a java application I wrote), I successfully do 
the following:

// Set environment variables for the executors
conf.setExecutorEnv("AWS_ACCESS_KEY_ID", System.getenv("AWS_ACCESS_KEY_ID"));
conf.setExecutorEnv("AWS_SECRET_ACCESS_KEY", 
System.getenv("AWS_SECRET_ACCESS_KEY"));


But, because my SparkContext already exists within spark-shell, this really 
isn't an option (unless I'm missing something).  

Thanks.

Darin.


