Hi,

I am running a standalone Spark cluster with 2 workers, each with 2 cores.
I submit one Spark application to the cluster, and I monitor the execution
through the UI (both worker-ip:8081 and master-ip:4040).
There I can see that the application is handled by many executors; in my
case one worker has 10 executors and the other one only one!!

My question is: what is the cardinality relation between executor processes
and a submitted Spark application? I assumed that for each application
there would be one executor process handling all the Spark-related tasks
(map, filter, reduce, ...)?!
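
For reference, here is a minimal sketch of the kind of application I am
submitting (the master URL and the core/executor settings are illustrative
placeholders, not my exact configuration). My assumption was that settings
like these would lead to a single executor per application:

    // Illustrative sketch only; master URL and values are placeholders.
    import org.apache.spark.{SparkConf, SparkContext}

    object ExecutorCardinalityTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("executor-cardinality-test")
          .setMaster("spark://master-ip:7077")   // standalone master URL
          .set("spark.cores.max", "4")           // cap on total cores for this app
          .set("spark.executor.cores", "2")      // cores per executor (standalone)
        val sc = new SparkContext(conf)
        // run a trivial job so the executors show up in the UI
        val result = sc.parallelize(1 to 1000).map(_ * 2).filter(_ % 3 == 0).count()
        println(result)
        sc.stop()
      }
    }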

best,
/Shahab
