Ottomata has uploaded a new change for review.

  https://gerrit.wikimedia.org/r/191691

Change subject: Use spark.yarn.jar instead of SPARK_JAR env var, SPARK_JAR is deprecated
......................................................................

Use spark.yarn.jar instead of SPARK_JAR env var, SPARK_JAR is deprecated

Change-Id: Iafd7d84b0135470f0dc837d63c61ee18a887c105
---
M manifests/spark.pp
A templates/spark/spark-defaults.conf.erb
M templates/spark/spark-env.sh.erb
3 files changed, 25 insertions(+), 7 deletions(-)


  git pull ssh://gerrit.wikimedia.org:29418/operations/puppet/cdh refs/changes/91/191691/1

diff --git a/manifests/spark.pp b/manifests/spark.pp
index 0ba7814..448b323 100644
--- a/manifests/spark.pp
+++ b/manifests/spark.pp
@@ -73,6 +73,12 @@
         require => Exec['spark_assembly_jar_install'],
     }
 
+    file { "${config_directory}/spark-defaults.conf":
+        content => template('cdh/spark/spark-defaults.conf.erb'),
+        require => Exec['spark_assembly_jar_install'],
+    }
+
+
     file { "${config_directory}/log4j.properties":
         source => 'puppet:///modules/cdh/spark/log4j.properties',
     }
diff --git a/templates/spark/spark-defaults.conf.erb b/templates/spark/spark-defaults.conf.erb
new file mode 100644
index 0000000..2122ded
--- /dev/null
+++ b/templates/spark/spark-defaults.conf.erb
@@ -0,0 +1,19 @@
+# Note: This file is managed by Puppet.
+
+# Default system properties included when running spark-submit.
+# This is useful for setting default environmental settings.
+
+# Example:
+# spark.master                     spark://master:7077
+# spark.eventLog.enabled           true
+# spark.eventLog.dir               hdfs://namenode:8021/directory
+# spark.serializer                 org.apache.spark.serializer.KryoSerializer
+# spark.driver.memory              5g
+# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
+
+# Set spark.yarn.jar to the spark-assembly.jar in HDFS.  This makes it so
+# that the spark jar doesn't have to be uploaded to HDFS every time
+# a user submits a job.
+# See: http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cdh_ig_running_spark_apps.html
+# If you upgrade spark, be sure to upload the new spark-assembly.jar to this HDFS path.
+spark.yarn.jar                    <%= @spark_jar_hdfs_path %>
diff --git a/templates/spark/spark-env.sh.erb b/templates/spark/spark-env.sh.erb
index 31d2eec..e851e4d 100755
--- a/templates/spark/spark-env.sh.erb
+++ b/templates/spark/spark-env.sh.erb
@@ -86,10 +86,3 @@
 ### you want to run with scala version, that is included with the package
 #export SCALA_HOME=${SCALA_HOME:-/usr/lib/spark/scala}
 #export PATH=$PATH:$SCALA_HOME/bin
-
-# Set SPARK_JAR to the spark-assembly.jar in HDFS.  This makes it so
-# that the spark jar doesn't have to be uploaded to HDFS every time
-# a user submits a job.
-# See: http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cdh_ig_running_spark_apps.html
-# If you upgrade spark, be sure to upload the new spark-assembly.jar to this HDFS path.
-export SPARK_JAR=<%= @spark_jar_hdfs_path %>
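For illustration, here is a sketch of what the rendered `spark-defaults.conf` line would look like after this change. The HDFS path used below is a hypothetical example; the actual value comes from the module's `spark_jar_hdfs_path` parameter.

```shell
# Hypothetical value for @spark_jar_hdfs_path; the real one is supplied
# by the Puppet module's spark_jar_hdfs_path parameter.
SPARK_JAR_HDFS_PATH="hdfs:///user/spark/share/lib/spark-assembly.jar"

# Render the line the ERB template would produce into spark-defaults.conf.
printf 'spark.yarn.jar                    %s\n' "$SPARK_JAR_HDFS_PATH" \
    > spark-defaults.conf

# spark-submit reads this file at launch; confirm the setting is present.
grep '^spark.yarn.jar' spark-defaults.conf
```

As the template comment notes, after a Spark upgrade the new spark-assembly.jar would still need to be re-uploaded to the same HDFS path (e.g. with `hdfs dfs -put`) for the setting to keep working.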

-- 
To view, visit https://gerrit.wikimedia.org/r/191691
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: newchange
Gerrit-Change-Id: Iafd7d84b0135470f0dc837d63c61ee18a887c105
Gerrit-PatchSet: 1
Gerrit-Project: operations/puppet/cdh
Gerrit-Branch: master
Gerrit-Owner: Ottomata <[email protected]>

_______________________________________________
MediaWiki-commits mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-commits