Hi,
I am running a standalone Spark cluster with 2 workers, each with 2 cores.
I submit one Spark application to the cluster and monitor the execution
via the UI (both worker-ip:8081 and master-ip:4040).
There I can see that the application is handled by multiple executors; in my
case, one per worker.
In standalone mode, an application gets at most one executor per worker
machine (on YARN, one executor per container).
How many threads each executor runs is determined by the --executor-cores
parameter.
Alternatively, you can specify --total-executor-cores, and the number of
executors (and thus cores per executor) will be determined from that.
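For example, a submission to a standalone master might look like the sketch below (the class name, jar name, and master URL are placeholders, not taken from your setup):

```shell
# Option 1: fix the cores per executor.
# Each executor JVM gets 2 cores, i.e. runs up to 2 task threads.
spark-submit \
  --master spark://master-ip:7077 \
  --executor-cores 2 \
  --class com.example.MyApp \
  myapp.jar

# Option 2: cap the total cores for the whole application.
# With --total-executor-cores 4 and --executor-cores 2, Spark can
# launch 2 executors of 2 cores each (e.g. one per worker here).
spark-submit \
  --master spark://master-ip:7077 \
  --total-executor-cores 4 \
  --executor-cores 2 \
  --class com.example.MyApp \
  myapp.jar
```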