Hi,

How do I increase the driver memory? These are my configs right now:

sed 's/INFO/ERROR/' spark/conf/log4j.properties.template > ./ephemeral-hdfs/conf/log4j.properties
sed 's/INFO/ERROR/' spark/conf/log4j.properties.template > spark/conf/log4j.properties

# Environment variables and Spark properties
export SPARK_WORKER_MEMORY="30g"   # whole memory per worker node, independent of application (default: total memory on worker node minus 1 GB)
# SPARK_WORKER_CORES = total number of cores an application can use on a machine
# SPARK_WORKER_INSTANCES = how many workers per machine? Limit the number of cores per worker if there is more than one worker on a machine
export SPARK_JAVA_OPTS="-Dspark.executor.memory=30g -Dspark.speculation.quantile=0.5 -Dspark.speculation=true -Dspark.cores.max=80 -Dspark.akka.frameSize=1000 -Dspark.rdd.compress=true"
# spark.executor.memory = memory taken by Spark on each machine
export SPARK_DAEMON_MEMORY="2g"
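
The only driver-related setting I can think of adding is something like the following in spark-env.sh, but I'm only guessing that SPARK_DRIVER_MEMORY / spark.driver.memory is the right knob for this:

export SPARK_DRIVER_MEMORY="2g"   # guess: heap size for the driver JVM
# or, presumably equivalent, as a property in spark/conf/spark-defaults.conf:
#   spark.driver.memory   2g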

The application UI says my driver has 295 MB of memory. I am trying to
broadcast a variable that is about 0.15 GB, and it is throwing OutOfMemory
errors, so I want to see whether increasing the driver memory fixes this.
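
I assume spark-submit's --driver-memory flag would set the same thing, something like this (MyApp and my-app.jar are just placeholders):

./bin/spark-submit --driver-memory 2g --class MyApp my-app.jar

Would any of these be the right way to do it?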

Thanks!



