Hi,

I am trying to use Spark for my own applications. I am currently
profiling performance in local mode, and I have a couple of questions:

1. When I set spark.master to local[N], Spark will use up to N worker
*threads* on the single machine. Is this equivalent to saying there are N
worker *nodes* as described in
http://spark.apache.org/docs/latest/cluster-overview.html?
(That is, is each worker thread viewed as a separate worker node that can
have its own executor for each application?)
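
For reference, this is roughly how I create the context at the moment (the
app name and N = 4 are just placeholders for my setup):

    import org.apache.spark.{SparkConf, SparkContext}

    // local mode with N = 4 worker threads
    val conf = new SparkConf()
      .setAppName("LocalModeProfiling")   // placeholder app name
      .setMaster("local[4]")
    val sc = new SparkContext(conf)
    println(sc.defaultParallelism)        // prints 4 here, one per worker thread
    sc.stop()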

2. Is there any way to set the maximum memory used by each worker
thread/node? So far I have only found the per-executor setting
(spark.executor.memory).
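
For what it's worth, this is the only memory-related setting I am passing
right now (the value is just a placeholder), and I am not sure it actually
bounds each worker thread in local mode:

    // per-executor memory is the only knob I have found so far
    val conf = new SparkConf()
      .setMaster("local[4]")
      .set("spark.executor.memory", "2g")   // placeholder value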

Thank you!




