Thanks for the tip.

http://localhost:4040/executors/ shows:
Executors (1)
Memory: 0.0 B used (294.9 MB total)
Disk: 0.0 B used

However, running as a standalone cluster does resolve the problem:
I can see a worker process running with the allocated memory.

My conclusion (I may be wrong) is that in 'local' mode the 'executor-memory'
parameter is not honored.
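
If that is the case, one workaround is to size the heap through --driver-memory instead, since in local mode the executor runs inside the driver JVM. A sketch, assuming spark-submit and a hypothetical app.jar:

```
# In local mode the executor lives in the driver JVM, so --executor-memory
# is effectively ignored; --driver-memory sets the usable heap instead.
spark-submit \
  --master local[4] \
  --driver-memory 2g \
  app.jar
```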

Thanks again for the help!
