Yes, by default each executor runs one task per core of the machine. You can also 
manually launch executors with more worker threads than you have cores. What cluster 
manager are you on?
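
For standalone mode, a minimal sketch of how that would look: each worker advertises 
a configurable number of task slots through SPARK_WORKER_CORES in conf/spark-env.sh, 
and you can set it higher than the physical core count (the numbers below are just 
illustrative):

    # conf/spark-env.sh on each standalone worker
    export SPARK_WORKER_CORES=16    # advertise 16 task slots, even on an 8-core box
    export SPARK_WORKER_MEMORY=16g  # memory the worker can hand out to executors

An application can then cap how many of those cores it actually claims with 
spark.cores.max.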

Matei

On August 29, 2014 at 11:24:33 AM, Victor Tso-Guillen (v...@paxata.com) wrote:

I'm thinking of local mode, where multiple virtual executors occupy the same VM. 
Can we have the same configuration in Spark standalone cluster mode?
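
(For context, a minimal sketch of the local-mode setup being described; the app name 
is purely illustrative:)

    // local mode: a single JVM whose worker threads act as the task slots
    val conf = new org.apache.spark.SparkConf()
      .setMaster("local[16]")            // 16 concurrent task slots, regardless of physical cores
      .setAppName("LocalOversubscribe")  // illustrative app name
    val sc = new org.apache.spark.SparkContext(conf)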
