Hey all.

I've been getting java.lang.OutOfMemoryError (Java heap space) errors:
13/09/20 18:54:37 ERROR actor.ActorSystemImpl: Uncaught error from thread
[spark-akka.actor.default-dispatcher-5] shutting down JVM since
'akka.jvm-exit-on-fatal-error' is enabled
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2271)


In my spark-env.sh I have:
  export SPARK_MEM=6154m
and I also added:
  SPARK_JAVA_OPTS+=" -Xss1m -Xmx1g"

but in the spark REPL:
  scala> java.lang.Runtime.getRuntime.maxMemory / (1024.0*1024*1024)
res4: Double = 0.47918701171875

That's neither 1g nor ~6g. Where does 0.48g come from?
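For reference, here's the plain-JVM way I'm reading those numbers; nothing Spark-specific, it should behave the same in any Scala REPL:

```scala
// Read the JVM heap limits straight from java.lang.Runtime.
val rt = java.lang.Runtime.getRuntime
val maxGiB   = rt.maxMemory   / (1024.0 * 1024 * 1024)  // ceiling, set by -Xmx
val totalGiB = rt.totalMemory / (1024.0 * 1024 * 1024)  // heap committed so far
println("max:   " + maxGiB + " GiB")
println("total: " + totalGiB + " GiB")
```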


My questions are:
What does SPARK_MEM configure?
Are -Xss, -Xms, and -Xmx obeyed?
What does spark.executor.memory configure?

Are there any more knobs that control how other parts of the system use
memory? Is there documentation for this (beginner-level docs would be great)?
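For context, here's my (possibly wrong) understanding of how spark.executor.memory would be set programmatically. This is a sketch based on my reading of the docs; the property name, the timing, and the commented-out master URL / app name are my assumptions, not something I've verified:

```scala
// Sketch: set executor memory as a system property BEFORE creating the
// SparkContext (my understanding of the mechanism -- please correct me).
System.setProperty("spark.executor.memory", "6g")
// val sc = new spark.SparkContext("spark://master:7077", "MyApp")  // hypothetical
println(System.getProperty("spark.executor.memory"))
```

Is that the intended way, or does SPARK_MEM / SPARK_JAVA_OPTS override it?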

Many thanks.
Shay
